Supercharge Your Laravel Development and Get AI to Understand Your Models
Hey there, Laravel enthusiasts! Today, I’m diving into a nifty trick that’ll make getting AI to understand your Laravel model structures a breeze. We’ll harness the power of bash scripting and AI to analyze our migrations quickly and efficiently. Let’s get started!
The Problem: Getting AI to Return Code for Your Project Setup Easily
As our Laravel projects grow, so does the complexity of our database structures. Migrations pile up, relationships intertwine, and before you know it, you’re drowning in a sea of Schema::create and up functions. Wouldn’t it be great to get a bird’s-eye view of our models without manually sifting through dozens of files? What if you could get AI to understand your Laravel project without adding every file to the chat window?
The Solution: Bash + AI = Developer’s Best Friend
Here’s where our dynamic duo comes in: a clever bash one-liner and your favorite AI chat tool (like ChatGPT or Claude). We’ll use bash to extract the relevant parts of our migrations and then feed that information to an AI for analysis and insights.
Step 1: The Magical Bash One-Liner
Here’s the one-liner code to copy and paste:
for file in *.php; do echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"; sed -n '/Schema::create/,/^ }/p' "$file"; echo -e "\n\`\`\`"; sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'; echo -e "\`\`\`\n"; done
First, let’s break down our bash sorcery to understand what’s going on in an easier-to-read format:
for file in *.php; do
    echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"
    sed -n '/Schema::create/,/^ }/p' "$file"
    echo -e "\n\`\`\`"
    sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'
    echo -e "\`\`\`\n"
done
Just add | pbcopy to have it copied directly to your clipboard on Mac:
for file in *.php; do echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"; sed -n '/Schema::create/,/^ }/p' "$file"; echo -e "\n\`\`\`"; sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'; echo -e "\`\`\`\n"; done | pbcopy
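On Linux there’s no pbcopy, but the same trick works with xclip (assuming the xclip package is installed):
for file in *.php; do echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"; sed -n '/Schema::create/,/^ }/p' "$file"; echo -e "\n\`\`\`"; sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'; echo -e "\`\`\`\n"; done | xclip -selection clipboard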
This command does the following:
- Loops through all PHP files in the current directory
- Extracts the Schema::create function contents
- Extracts the up function contents (minus the function declaration)
- Formats the output with the filename and proper Markdown code blocks
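For example, given a hypothetical migration file named 2024_01_01_000000_create_posts_table.php, each file’s chunk of output looks roughly like this (filename header, then the extracted schema inside Markdown fences):

2024_01_01_000000_create_posts_table:

Schema::create('posts', function (Blueprint $table) {
    $table->id();
    $table->foreignId('user_id')->constrained();
    $table->string('title');
    $table->timestamps();
});

Multiply that by every migration in the directory and you have a compact, AI-readable summary of your whole schema.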
Step 2: Running the Command
- Open your terminal and navigate to your Laravel project’s migrations directory.
- Copy the bash one-liner above.
- Paste it into your terminal and hit Enter.
- Watch as it generates a nicely formatted output of your migrations!
Step 3: Feeding the Output to AI
Copy the entire output from your terminal and paste it into your AI chat of choice with your preferred prompt. I usually start with this:
"Analyze these Laravel migrations and understand the database and model structure, including relationships. For all conversations in this chat, refer to these migrations:"
{output here}
Then follow up with something like “Generate a diagram” to get a visual of the schema.
Here are some additional prompts you can optionally pair with the output to extract valuable insights and improvements:
- “Identify any potential relationships between these models based on the migrations.”
- “Suggest improvements or potential issues in this database design.”
- “Take this example of JSON output and generate the code to create the [model name] with its relations.”
The Benefits: Why This Approach Rocks
- Time-saver: No more manually searching through migration files or copying their contents one by one.
- Comprehensive view: Get a complete picture of your database structure in one go.
- AI-powered insights: Leverage AI to spot patterns, suggest optimizations, explain complex relationships, and generate code faster.
- Learning tool: Great for understanding legacy projects or onboarding new team members.
Wrapping Up
There you have it, folks! With this simple bash one-liner and the power of AI, you can transform how you analyze and understand your Laravel database structures and output code. Give it a try on your next project, and watch your productivity soar!
Remember, tools like these are meant to augment your skills, not replace them. Always review AI suggestions critically and trust your developer instincts.
How to Take Ownership of Files and Folders Using PowerShell
Ever had to take ownership of a bunch of files and folders? It’s a pain, right? Well, not anymore!
The Problem
Picture this: You’ve just gotten an external hard drive from a dead computer and need to access the files. But wait! You don’t have the correct permissions. I ran into exactly this situation, and setting the file permissions through Explorer was failing randomly. It appears the folders all had different permissions, and the propagation was failing.
The Solution
I’ve got a PowerShell script that’ll save you time. It does two things:
- Takes ownership of a folder and all its subdirectories and files (recursively) and sets the owner to the current logged-in user
- Adds the current user to the permissions with full control
The Script
First, here’s the script. Don’t worry, I’ll break it down for you:
# Ensure the script is running with administrator privileges
if (-NOT ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{
    Write-Warning "You do not have Administrator rights to run this script!`nPlease re-run this script as an Administrator!"
    Break
}

# Get the current user
$currentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name

# Function to take ownership and set permissions
function Set-OwnershipAndPermissions {
    param (
        [string]$path
    )

    try {
        # Take ownership
        $acl = Get-Acl $path
        $owner = New-Object System.Security.Principal.NTAccount($currentUser)
        $acl.SetOwner($owner)
        Set-Acl -Path $path -AclObject $acl -ErrorAction Stop

        # Set permissions
        $acl = Get-Acl $path
        if (Test-Path -Path $path -PathType Container) {
            # It's a directory
            $permission = $currentUser, "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow"
        } else {
            # It's a file
            $permission = $currentUser, "FullControl", "None", "None", "Allow"
        }
        $accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule $permission
        $acl.SetAccessRule($accessRule)
        Set-Acl -Path $path -AclObject $acl -ErrorAction Stop

        Write-Host "Successfully processed: $path"
    }
    catch {
        Write-Warning "Failed to process $path. Error: $_"
    }

    # Process subdirectories and files if it's a directory
    if (Test-Path -Path $path -PathType Container) {
        try {
            Get-ChildItem $path -Force -ErrorAction Stop | ForEach-Object {
                Set-OwnershipAndPermissions $_.FullName
            }
        }
        catch {
            Write-Warning "Failed to access contents of $path. Error: $_"
        }
    }
}

# Prompt for the folder path
$folderPath = Read-Host "Enter the full path of the folder"

# Check if the folder exists
if (Test-Path $folderPath) {
    # Run the function
    Set-OwnershipAndPermissions $folderPath
    Write-Host "Process completed for $folderPath and all accessible contents."
} else {
    Write-Host "The specified folder does not exist."
}
How to Use It
- Copy this script and save it as a .ps1 file (like take_ownership.ps1).
- Right-click on PowerShell and select “Run as administrator” (this is crucial!).
- Navigate to where you saved the script.
- Run it by typing .\take_ownership.ps1.
- When prompted, enter the full path of the folder you want to process.
What’s Going On Here?
Let’s break this down a bit:
- Admin Check: The script starts by ensuring you run it as an admin. No admin rights? No dice.
- Current User: It grabs the current user’s name. This is who’s going to own everything.
- The Magic Function: Set-OwnershipAndPermissions is where the real magic happens. It:
  - Takes ownership of the item (file or folder)
  - Sets the current user as the owner
  - Gives the current user full control
  - If it’s a folder, it does all this recursively for everything inside
- Error Handling: The script’s got your back with some neat error handling. It’ll let you know if it can’t process something and keep on truckin’.
- File vs. Folder: It’s smart enough to know the difference between files and folders and set the right permissions for each.
Why This is Awesome
- Time Saver: Imagine doing all this manually. Yikes!
- Consistency: It applies the same permissions everywhere, no mistakes.
- Flexibility: Works on any folder you point it to.
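By the way, if you only need a quick one-off, Windows also ships the takeown and icacls utilities, which cover much of the same ground from an elevated Command Prompt. A rough sketch with a placeholder path (it lacks the per-item error reporting the script gives you):

:: Take ownership of the folder and everything in it
takeown /f "D:\RecoveredFiles" /r /d y
:: Grant the current user full control, recursively
icacls "D:\RecoveredFiles" /grant "%USERNAME%":F /t

The script above is still the better option when permissions are wildly inconsistent, since it keeps going on errors and tells you exactly which items failed.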
Wrapping Up
There you have it, folks! A powerful little script to take control of your files and folders. No more permission headaches, no more “access denied” nightmares — just pure, unadulterated file access bliss.
Got questions? Hit me up in the comments. And don’t forget to share this with your IT buddies – they’ll thank you later!
Happy scripting!
How to Move Files by Partial File Name to a New Directory on Mac Using a Shell Script
Streamline Your Photo Editing Workflow: How to Use a Shell Script to Prioritize Selected Images
When it comes to photo editing, efficiency is key. Recently, I faced a challenge where a subject picked out 100 priority photos from a larger batch. Given the sheer number of images, manually sorting them wasn’t practical. All pictures from my SLR were named in the DCIM_# format, so I noted the last five numbers of the selected files. To streamline the process, I wrote a shell script that automatically moves the chosen files to a new folder, allowing me to focus on the priority images first. I’m sharing this script to help others enhance their photo editing workflow.
Step-by-Step Instructions on How to Run a Shell Script
Create the Shell Script:
- Open your favorite text editor (e.g., VSCode, Sublime Text, or even a basic text editor like Notepad).
- Write or paste your shell script. Here’s a simple example for moving selected files:
#!/bin/bash

# Directory containing the files
DIR="{Path}"

# Subfolder to move matching files to
TO_PROCESS_DIR="${DIR}/To Process"

# Ensure the "To Process" directory exists
mkdir -p "$TO_PROCESS_DIR"

# List of substrings to check
substrings=(
    06340 06422 06500 06518 06521 06524 06539 06542 06545 06548 06553 06581
    06584 06587 06608 06611 06617 06629 06649 06662 06980 06695 06733 06734
    06737 06740 06743 06758 06761 06767 06773 06779 06787 06788 06791 06794
    06797 06815 06818 06821 06845 06878 06890 06898 06899 06941 06947 06959
    06962 06965 06970 06971 06977 06985 06986 07009 07040 07055 07062 07068
    07074 07080 07083 07086 07089 07092 07097 07101 07104 07107 07113 07119
    07128 07143 07146 07182 07185 07188 07191 07197 07200 07215 07218 07221
    07230 07291 07302 07323 07329 07332 07338 07350 07356 07359 07476 07362
    07365 07368 07389 07413 07435 07431 07461 07497 07500 07503 07506 07509
    07512 07518 07524 07527 07530 07536 07542 07548 07557 07578 07584 07593
)

# Debugging: Check if the directory exists
if [ ! -d "$DIR" ]; then
    echo "Directory $DIR does not exist."
    exit 1
fi

# Debugging: Check if there are files in the directory
if [ -z "$(ls -A "$DIR")" ]; then
    echo "No files found in the directory $DIR."
    exit 1
fi

# Loop through all files in the directory
for file in "$DIR"/*; do
    filename=$(basename "$file")
    echo "Processing file: $filename"
    moved=false
    for substring in "${substrings[@]}"; do
        if [[ "$filename" == *"$substring"* ]]; then
            echo "Found substring '$substring' in file: $filename"
            # Move the file to the "To Process" directory
            mv "$file" "$TO_PROCESS_DIR"
            moved=true
            break
        fi
    done
    if [ "$moved" = false ]; then
        echo "No matching substring found in file: $filename"
    fi
done
Save the Script:
- Save the file with a .sh extension, for example, move_photos.sh.
Make the Script Executable:
- Open a terminal window.
- Navigate to the directory where your script is saved using the cd command. For example: cd /path/to/your/script
- Make your script executable by running:
chmod +x move_photos.sh
Run the Script:
- Still in the terminal, run your script by typing:
./move_photos.sh
Verify the Results:
- Check the destination directory to ensure that the files have been moved successfully.
Additional Tips:
- Editing the File List: Modify the substrings array in the script to include the specific last five digits of the filenames you want to move.
- Updating Directories: Ensure that the DIR variable points to the correct source folder on your system; matching files are moved into the To Process subfolder set by TO_PROCESS_DIR.
By following these steps, you can efficiently manage and prioritize your photo editing tasks, saving valuable time and effort. This shell script can be customized and expanded to suit various file management needs, making it a versatile tool in your workflow.
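And if your list is short and you don’t need the logging, the same idea fits in a quick Terminal one-liner (a minimal sketch with placeholder numbers and path; -n tells mv not to overwrite anything at the destination):

cd "/path/to/photos" && mkdir -p "To Process" && for n in 06340 06422 07593; do mv -n ./*"$n"* "To Process"/; done

Numbers with no matching file will make mv print a harmless “No such file” complaint; the full script above handles that case more gracefully.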
How to enable MacFuse/PCloud Drive on Mac Sonoma 14.2.1
I recently upgraded to macOS Sonoma 14.2.1, and macFUSE stopped loading, which broke my ability to load pCloud and NTFS drives. I spent a few days troubleshooting everything, and in the end it turned out I had to disable macOS System Integrity Protection (SIP) to get everything to load. I’m sharing in case it helps anyone else.
To disable SIP on your Mac Sonoma for extensions like MacFuse and pCloud Drive:
- Shut down your Mac: Turn off your Mac completely by pressing and holding the power button until the screen goes black.
- Restart in Recovery Mode: Entering Recovery Mode depends on your Mac. You can try the steps below or follow Apple’s guide here.
  - On Intel Macs, power on your Mac and immediately press and hold Command (⌘) + R to enter Recovery Mode.
  - On Apple Silicon Macs, power on your Mac and hold the power button down until Other Options displays.
- Open Terminal: In Recovery Mode, select “Utilities” in the macOS Utilities window and select “Terminal.”
- Disable SIP: Type csrutil disable in Terminal and press Enter. This disables System Integrity Protection (SIP).
- Restart Mac: Click the Apple menu and select “Restart” to reboot your Mac with SIP disabled.
- Re-enable SIP (if needed): Follow the same steps but use csrutil enable in Terminal to re-enable SIP.
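After rebooting, you can confirm the change took effect by checking the status from a regular Terminal session:

csrutil status

It should report that System Integrity Protection is disabled while you install the extensions, and enabled again once you turn it back on.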
Note: Disabling SIP poses security risks. Re-enable it once you’ve installed the necessary extensions. Only disable SIP when necessary and understand the potential risks involved.
How to Delete a Row in Excel Using the Elgato Stream Deck
Recently, I was working on a massive Excel spreadsheet and needed to manually review each entry and clean up rows that were no longer needed. The Elgato Stream Deck came in handy for a quick shortcut, so I thought I’d share it in case anyone else can use it.
I took this opportunity to practice creating the first of what I hope are many training videos. This took me around 30 minutes to do from start to finish as I had to learn the video editing software including how to record, how to split and edit, and how to add text overlays. Hopefully the next videos will be faster but it was a fun exercise and I hope someone else finds it useful.
Installing Font Awesome Pro with bun
After recently switching to bun.sh, I was trying to install Font Awesome Pro. It uses a private registry but their docs have not been updated to support non-npm package managers and bun does not yet support .npmrc files.
You can configure a private registry using an organization scope. First, you must get your auth token from your paid Font Awesome account by going to your account page, scrolling down to the Tokens section, and copying the token.
Copy and paste this string, and replace YOUR_TOKEN_HERE with the token you copied above:
[install.scopes]
"@fortawesome" = { token="YOUR_TOKEN_HERE" , url =
"https://npm.fontawesome.com/"}
Open the terminal and enter these commands:
touch $HOME/.bunfig.toml
nano $HOME/.bunfig.toml
Paste in the config above with your token and then hit CTRL+X to quit, and Y to save when prompted. Now you should be able to run
bun add @fortawesome/fontawesome-pro
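As a quick sanity check that the registry and token are working, you can confirm the package actually landed in your project (the exact directory contents will vary by version):

ls node_modules/@fortawesome

If the token is wrong, bun add should fail to authenticate against npm.fontawesome.com instead of installing, which is your cue to re-copy it from your account page.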
How to Delete Folder with Special Character in Windows 10/11
I ran into an issue where a folder was created by some application with a special Unicode character that Windows Explorer doesn’t seem to play nicely with. I also couldn’t tell what the character was, since nothing would reveal it. The folder’s there, but you can’t rename or delete it. Whenever I tried to rename or delete it, I’d get an error saying the folder doesn’t exist.
I have LockHunter installed but it wasn’t able to delete it for some reason. The easiest way I found to delete the folder was to use Git Bash and then use the appropriate commands to rename or delete the folder.
Browse to the folder where the offending folder is located. For example purposes, I’ll use c:\temp\folder1
cd c:/temp
Rename:
mv fol (hit tab to autocomplete) folder1
Delete:
rm -rf fol (hit tab to autocomplete)
If you don’t have Git Bash or are not a developer/power user, you can download the portable version from https://git-scm.com/download/win to use temporarily. Once you decompress the files to a folder, you’ll find git-bash.exe which you can double-click to run and use the above commands.
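One more fallback if Git Bash isn’t an option: from a regular Command Prompt, the extended-length path prefix often gets around names Explorer chokes on (example path shown; adjust to your own folder):

rd /s /q "\\?\C:\temp\folder1"

The \\?\ prefix tells Windows to skip its usual path normalization, which is frequently exactly what these oddly named folders need.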
Using Laravel Vite with MAMP
As seen in other posts, I use MAMP quite a bit for my web development environment. I know I can run docker, or any of the other platforms out there but they use more memory and resources that I’d prefer to devote to my dev tools.
I started a new Laravel project and wanted to use MAMP but Vite was throwing errors due to the SSL not matching out of the box.
When adding @vite('resources/js/app.js') to my Blade file, I’d get errors including:
This request has been blocked; the content must be served over HTTPS.
I found articles saying to add --https or --host to my package.json command, but then I got this error:
net::ERR_SSL_VERSION_OR_CIPHER_MISMATCH
Load MAMP Pro, add your host, and generate your SSL certificates. For this example, we’ll set the host name to my-app.test and assume you’re storing the SSL keys in the default location.
Open vite.config.js and add the following 2 lines:
import fs from 'fs';
const host = 'my-app.test';
Then add this to the defineConfig section:
server: {
    host,
    hmr: { host },
    https: {
        key: fs.readFileSync(`/Applications/MAMP/Library/OpenSSL/certs/${host}.key`),
        cert: fs.readFileSync(`/Applications/MAMP/Library/OpenSSL/certs/${host}.crt`),
    },
},
You should now be able to run npm run dev and have no issues.
Sample full vite.config.js file for easy reference:
import { defineConfig } from 'vite';
import laravel from 'laravel-vite-plugin';
import fs from 'fs';

const host = 'my-app.test';

export default defineConfig({
    server: {
        host,
        hmr: { host },
        https: {
            key: fs.readFileSync(`/Applications/MAMP/Library/OpenSSL/certs/${host}.key`),
            cert: fs.readFileSync(`/Applications/MAMP/Library/OpenSSL/certs/${host}.crt`),
        },
    },
    plugins: [
        laravel([
            'resources/sass/app.scss',
            'resources/js/app.js',
        ]),
    ],
});
How to deploy a React app to Amazon S3 using Gitlab CI/CD
I’ve been trying to build more CI/CD scripts using Gitlab to automate pipeline deployments for work. Here’s a useful one for building and deploying a React app to Amazon S3.
You’ll need to add a variable called S3_BUCKET_NAME to your repo or replace the variable with your bucket path.
stages:
  - build
  - deploy

build react-app:
  # I'm using node:latest, but be sure to test or change to a version you know works. Sometimes node updates break the npm script.
  image: node:latest
  stage: build
  only:
    - master
  script:
    # Set PATH
    - export PATH=$PATH:/usr/bin/npm
    # Install dependencies
    - npm install
    # Build App
    - CI=false npm run build
  artifacts:
    paths:
      # Build folder
      - build/
    expire_in: 1 hour

deploy master:
  image: python:latest
  stage: deploy
  only:
    - master
  script:
    - pip3 install awscli
    - aws s3 sync ./build s3://$S3_BUCKET_NAME --acl public-read
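One assumption worth calling out: the aws CLI needs credentials too. Add AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (and AWS_DEFAULT_REGION if your bucket isn’t in your default region) as CI/CD variables alongside S3_BUCKET_NAME; GitLab exposes them to the job as environment variables, which is exactly where the CLI looks. You can verify the sync from your own machine with the same credentials:

aws s3 ls s3://$S3_BUCKET_NAME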
How to Setup CI/CD of Jigsaw Site to Digital Ocean Droplet Using Bitbucket Pipelines
I created a new personal resume site and decided I wanted to build a static site since it wouldn’t be frequently updated. I evaluated Nuxt, Gatsby, and a few others but settled on Jigsaw, a static site generator based on Laravel. I had never used it before and figured this would be a good learning experience while building something I needed. I was pleasantly surprised by how easy it is to use and setup, so kudos to the Tighten team for putting together such an elegant solution.
I wanted to get a CI/CD pipeline configured to handle the site’s deployment but couldn’t find any working tutorials, so I’m sharing my solution in case it helps others. I’m using Bitbucket for this since it’s a personal private repo, so I’m using Bitbucket Pipelines.
Instructions
After you enable pipelines for your project, you’ll need to configure a Pipelines Repository Variable. Go to the Settings tab in your repo, and then select Repository variables.
Add 3 variables:
- USER_NAME – The SSH user name you want Bitbucket to use to connect to your server
- PRODUCTION_HOST – The domain Bitbucket should connect to
- FOLDER – The folder path the site should be deployed to
Generate an SSH key (or use your own) and add it to your server under the SSH Keys tab in Pipelines.
I generated a new key and then added it to ~/.ssh/authorized_keys for the account.
Add this YAML snippet to your bitbucket-pipelines.yml in your root. This will use PHP 7.4, install rsync, node + npm, composer, and build the production version of the site to deploy to the specified folder.
The -avP switches for rsync give me verbose progress feedback so I can see what’s happening. If you don’t need the detail, switch to -a.
image: php:7.4-fpm

pipelines:
  branches:
    master:
      - step:
          name: Jigsaw Build
          script:
            - apt-get update && apt-get install -y unzip
            - apt-get install rsync openssh-client nodejs npm -y
            - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
            - composer install
            - npm install
            - npm run production
            - rsync -avP build_production/ $USER_NAME@$PRODUCTION_HOST:$FOLDER --exclude=bitbucket-pipelines.yml --chown=www-data:www-data
I received a few errors when rsync ran. In case you run into them as well, here’s the list and fixes. The first was:
rsync: failed to set times on "$FOLDER": Operation not permitted (1)
I added --no-t to resolve that and then got a new error:
rsync: failed to set permissions on "$FOLDER": Operation not permitted (1)
which was fixed by adding the --no-perms switch. My final rsync command became:
rsync -avP --no-t --no-perms build_production/ $USER_NAME@$PRODUCTION_HOST:$FOLDER --exclude=bitbucket-pipelines.yml --chown=www-data:www-data
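Once the pipeline runs green, a quick header check against your domain (placeholder shown) confirms the new build is actually being served:

curl -I https://your-domain.example

A 200 response from the production host means the rsync landed where your web server expects it.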