Supercharge Your Laravel Development and Get AI to Understand Your Models
Hey there, Laravel enthusiasts! Today, I’m diving into a nifty trick that’ll make getting AI to understand your Laravel model structures a breeze. We’ll harness the power of bash scripting and AI to analyze our migrations quickly and efficiently. Let’s get started!
The Problem: Getting AI to Return Code for Your Project Setup Easily
As our Laravel projects grow, so does the complexity of our database structures. Migrations pile up, relationships intertwine, and before you know it, you're drowning in a sea of Schema::create and up functions. Wouldn't it be great to get a bird's-eye view of our models without manually sifting through dozens of files? What if you could get AI to understand your Laravel project without adding every file to the chat window?
The Solution: Bash + AI = Developer’s Best Friend
Here's where our dynamic duo comes in: a clever bash one-liner and your favorite AI chat tool (like ChatGPT or Claude). We'll use bash to extract the relevant parts of our migrations and then feed that information to an AI for analysis and insights.
Step 1: The Magical Bash One-Liner
Here’s the one-liner code to copy and paste:
for file in *.php; do echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"; sed -n '/Schema::create/,/^ }/p' "$file"; echo -e "\n\`\`\`"; sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'; echo -e "\`\`\`\n"; done
First, let’s break down our bash sorcery to understand what’s going on in an easier-to-read format:
for file in *.php; do
  echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"
  sed -n '/Schema::create/,/^ }/p' "$file"
  echo -e "\n\`\`\`"
  sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'
  echo -e "\`\`\`\n"
done
Just add | pbcopy to have it copied directly to your clipboard on Mac:
for file in *.php; do echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"; sed -n '/Schema::create/,/^ }/p' "$file"; echo -e "\n\`\`\`"; sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'; echo -e "\`\`\`\n"; done | pbcopy
This command does the following:
- Loops through all PHP files in the current directory
- Extracts the Schema::create function contents
- Extracts the up function contents (minus the function declaration)
- Formats the output with the filename and proper Markdown code blocks
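If you're not on a Mac (no pbcopy), or you'd rather hand the AI a file instead of a clipboard paste, the same loop can simply be redirected to a Markdown file; migrations.md below is just an example name:
# Same loop as above, written to a Markdown file instead of the clipboard
for file in *.php; do echo -e "\n\`\`\`\n${file%.*}:\n\`\`\`\n"; sed -n '/Schema::create/,/^ }/p' "$file"; echo -e "\n\`\`\`"; sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'; echo -e "\`\`\`\n"; done > migrations.md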
Step 2: Running the Command
- Open your terminal and navigate to your Laravel project’s migrations directory.
- Copy the bash one-liner above.
- Paste it into your terminal and hit Enter.
- Watch as it generates a nicely formatted output of your migrations!
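The steps above assume you cd into the migrations directory first. If you'd rather run everything from the project root, here's a small variation of the same loop (my own tweak, not part of the original one-liner) that points the glob at database/migrations and uses basename to keep the headings clean:
for file in database/migrations/*.php; do
  # Strip the directory and extension so the heading is just the migration name
  name=$(basename "${file%.*}")
  echo -e "\n\`\`\`\n${name}:\n\`\`\`\n"
  sed -n '/Schema::create/,/^ }/p' "$file"
  echo -e "\n\`\`\`"
  sed -n '/public function up/,/^ }/p' "$file" | sed '1d;$d'
  echo -e "\`\`\`\n"
done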
Step 3: Feeding the Output to AI
Copy the entire output from your terminal and paste it into your AI chat of choice along with your preferred prompt. I usually start with this one:
"Analyze these Laravel migrations and understand the database and model structure, including relationships. For all conversations in this chat, refer to these migrations:"
{output here}
Generate a diagram
Here are some additional prompts you can optionally pair with the output to extract valuable insights and improvements:
- “Identify any potential relationships between these models based on the migrations.”
- “Suggest improvements or potential issues in this database design.”
- “Take this example of JSON output and generate the code to create the [model name] with its relations.”
The Benefits: Why This Approach Rocks
- Time-saver: No more manually searching through migration files or copying their contents one by one.
- Comprehensive view: Get a complete picture of your database structure in one go.
- AI-powered insights: Leverage AI to spot patterns, suggest optimizations, explain complex relationships, and generate code faster.
- Learning tool: Great for understanding legacy projects or onboarding new team members.
Wrapping Up
There you have it, folks! With this simple bash one-liner and the power of AI, you can transform how you analyze and understand your Laravel database structures and output code. Give it a try on your next project, and watch your productivity soar!
Remember, tools like these are meant to augment your skills, not replace them. Always review AI suggestions critically and trust your developer instincts.
How to Take Ownership of Files and Folders Using PowerShell
Ever had to take ownership of a bunch of files and folders? It’s a pain, right? Well, not anymore!
The Problem
Picture this: You’ve just gotten an external hard drive from a dead computer and need to access the files. But wait! You don’t have the correct permissions. I had to do this, and setting the file permissions through Explorer was failing randomly. It appears that the folders all had different permissions, and the propagation was failing.
The Solution
I’ve got a PowerShell script that’ll save you time. It does two things:
- Takes ownership of a folder and all its subdirectories and files (recursively) and sets the owner to the current logged-in user
- Adds the current user to the permissions with full control
The Script
First, here’s the script. Don’t worry, I’ll break it down for you:
# Ensure the script is running with administrator privileges
if (-NOT ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{
    Write-Warning "You do not have Administrator rights to run this script!`nPlease re-run this script as an Administrator!"
    Break
}

# Get the current user
$currentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name

# Function to take ownership and set permissions
function Set-OwnershipAndPermissions {
    param (
        [string]$path
    )

    try {
        # Take ownership
        $acl = Get-Acl $path
        $owner = New-Object System.Security.Principal.NTAccount($currentUser)
        $acl.SetOwner($owner)
        Set-Acl -Path $path -AclObject $acl -ErrorAction Stop

        # Set permissions
        $acl = Get-Acl $path
        if (Test-Path -Path $path -PathType Container) {
            # It's a directory
            $permission = $currentUser, "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow"
        } else {
            # It's a file
            $permission = $currentUser, "FullControl", "None", "None", "Allow"
        }
        $accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule $permission
        $acl.SetAccessRule($accessRule)
        Set-Acl -Path $path -AclObject $acl -ErrorAction Stop

        Write-Host "Successfully processed: $path"
    }
    catch {
        Write-Warning "Failed to process $path. Error: $_"
    }

    # Process subdirectories and files if it's a directory
    if (Test-Path -Path $path -PathType Container) {
        try {
            Get-ChildItem $path -Force -ErrorAction Stop | ForEach-Object {
                Set-OwnershipAndPermissions $_.FullName
            }
        }
        catch {
            Write-Warning "Failed to access contents of $path. Error: $_"
        }
    }
}

# Prompt for the folder path
$folderPath = Read-Host "Enter the full path of the folder"

# Check if the folder exists
if (Test-Path $folderPath) {
    # Run the function
    Set-OwnershipAndPermissions $folderPath
    Write-Host "Process completed for $folderPath and all accessible contents."
} else {
    Write-Host "The specified folder does not exist."
}
How to Use It
- Copy this script and save it as a .ps1 file (like take_ownership.ps1).
- Right-click on PowerShell and select "Run as administrator" (this is crucial!).
- Navigate to where you saved the script.
- Run it by typing .\take_ownership.ps1.
- When prompted, enter the full path of the folder you want to process.
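One snag you may hit when you run the script: if your execution policy blocks local scripts, PowerShell will refuse to run the .ps1 file at all. A common workaround is to relax the policy for the current session only (optional, not part of the script itself; adjust to whatever your environment allows):
# Allow scripts for this PowerShell session only, then run the script
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope Process
.\take_ownership.ps1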
What’s Going On Here?
Let’s break this down a bit:
- Admin Check: The script starts by ensuring you run it as an admin. No admin rights? No dice.
- Current User: It grabs the current user’s name. This is who’s going to own everything.
- The Magic Function: Set-OwnershipAndPermissions is where the real magic happens. It:
  - Takes ownership of the item (file or folder)
  - Sets the current user as the owner
  - Gives the current user full control
  - If it's a folder, it does all this recursively for everything inside
- Error Handling: The script’s got your back with some neat error handling. It’ll let you know if it can’t process something and keep on truckin’.
- File vs. Folder: It’s smart enough to know the difference between files and folders and set the right permissions for each.
Why This is Awesome
- Time Saver: Imagine doing all this manually. Yikes!
- Consistency: It applies the same permissions everywhere, no mistakes.
- Flexibility: Works on any folder you point it to.
Wrapping Up
There you have it, folks! A powerful little script to take control of your files and folders. No more permission headaches, no more “access denied” nightmares — just pure, unadulterated file access bliss.
Got questions? Hit me up in the comments. And don’t forget to share this with your IT buddies – they’ll thank you later!
Happy scripting!
How to enable MacFuse/PCloud Drive on Mac Sonoma 14.2.1
I recently upgraded to Mac Sonoma 14.2.1, and MacFuse stopped loading, which affected my ability to load pCloud and NTFS drives. I spent a few days troubleshooting everything, and in the end it turned out I had to disable macOS System Integrity Protection to get everything to load. I'm sharing in case it helps anyone else.
To disable SIP on your Mac Sonoma for extensions like MacFuse and pCloud Drive:
- Shut down your Mac: Turn off your Mac completely by pressing and holding the power button until the screen goes black.
- Restart in Recovery Mode: How you enter Recovery Mode depends on your Mac. You can try the steps below or follow Apple's guide here.
  - Intel Macs: Power on your Mac and immediately press and hold Command (⌘) + R to enter Recovery Mode.
  - Apple silicon Macs: Power on your Mac and hold the power button down until Other Options displays.
- Open Terminal: In Recovery Mode, select "Utilities" in the macOS Utilities window and select "Terminal."
- Disable SIP: Type csrutil disable in Terminal and press Enter. This disables System Integrity Protection (SIP).
- Restart Mac: Click the Apple menu and select "Restart" to reboot your Mac with SIP disabled.
- Re-enable SIP (if needed): Follow the same steps but use csrutil enable in Terminal to re-enable SIP.
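After the reboot, you can confirm where things stand from a regular Terminal session (this check is safe to run at any time):
# Shows whether System Integrity Protection is currently enabled or disabled
csrutil status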
Note: Disabling SIP poses security risks. Re-enable it once you’ve installed the necessary extensions. Only disable SIP when necessary and understand the potential risks involved.
How to Delete a Row in Excel Using the Elgato Stream Deck
Recently I was working on a massive Excel spreadsheet and needed to manually review each entry and clean up rows that were no longer needed. The Elgato Stream Deck came in handy for a quick shortcut, so I thought I'd share it in case anyone else can use it.
I took this opportunity to practice creating the first of what I hope are many training videos. This took me around 30 minutes to do from start to finish as I had to learn the video editing software including how to record, how to split and edit, and how to add text overlays. Hopefully the next videos will be faster but it was a fun exercise and I hope someone else finds it useful.
How to Delete Folder with Special Character in Windows 10/11
I ran into an issue where a folder was created by some application with a special Unicode character that Windows Explorer doesn't seem to play nicely with. I was also unable to tell what the character was since nothing would reveal it. The folder's there, but you can't rename or delete it. If I tried to remove or delete it, I'd get an error saying the folder doesn't exist.
I have LockHunter installed, but it wasn't able to delete it for some reason. The easiest way I found was to open Git Bash and run the appropriate commands to rename or delete the folder.
Browse to the folder where the offending folder is located. For example purposes, I’ll use c:\temp\folder1
cd c:/temp
Rename:
mv fol (hit tab to autocomplete) folder1
Delete:
rm -rf fol (hit tab to autocomplete)
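If you're curious what the invisible character actually is before you rename or delete anything, Git Bash can make it visible (a quick diagnostic I'd try, not something the fix depends on):
# Show directory entries with non-printing characters as C-style escapes
ls -b
# Alternative: pipe through cat to expose control characters
ls | cat -A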
If you don't have Git Bash or are not a developer/power user, you can download the portable version from https://git-scm.com/download/win to use temporarily. Once you decompress the files to a folder, you'll find git-bash.exe, which you can double-click to run, and then use the commands above.
How to deploy a React app to Amazon S3 using Gitlab CI/CD
I’ve been trying to build more CI/CD scripts using Gitlab to automate pipeline deployments for work. Here’s a useful one for building and deploying a React app to Amazon S3.
You’ll need to add a variable called S3_BUCKET_NAME to your repo or replace the variable with your bucket path.
stages:
  - build
  - deploy

build react-app:
  # I'm using node:latest, but be sure to test or change to a version you know works. Sometimes node updates break the npm script.
  image: node:latest
  stage: build
  only:
    - master
  script:
    # Set PATH
    - export PATH=$PATH:/usr/bin/npm
    # Install dependencies
    - npm install
    # Build App
    - CI=false npm run build
  artifacts:
    paths:
      # Build folder
      - build/
    expire_in: 1 hour

deploy master:
  image: python:latest
  stage: deploy
  only:
    - master
  script:
    - pip3 install awscli
    - aws s3 sync ./build s3://$S3_BUCKET_NAME --acl public-read
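One assumption baked into the deploy job: the AWS CLI reads its credentials from environment variables, so you'll also want AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and usually AWS_DEFAULT_REGION defined as CI/CD variables alongside S3_BUCKET_NAME. If the sync fails, I'd first verify the credentials and bucket from a local shell (the bucket name below is a placeholder):
# Confirms which AWS identity the credentials resolve to
aws sts get-caller-identity
# Lists the bucket the pipeline will sync to
aws s3 ls s3://your-bucket-name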
How to Setup CI/CD of Jigsaw Site to Digital Ocean Droplet Using Bitbucket Pipelines
I created a new personal resume site and decided I wanted to build a static site since it wouldn't be frequently updated. I evaluated Nuxt, Gatsby, and a few others but settled on Jigsaw, a static site generator based on Laravel. I had never used it before and figured this would be a good learning experience while building something I needed. I was pleasantly surprised by how easy it is to use and set up, so kudos to the Tighten team for putting together such an elegant solution.
I wanted to get a CI/CD pipeline configured to handle the site’s deployment but couldn’t find any working tutorials, so I’m sharing my solution in case it helps others. I’m using Bitbucket for this since it’s a personal private repo, so I’m using Bitbucket Pipelines.
Instructions
After you enable pipelines for your project, you’ll need to configure a Pipelines Repository Variable in your project. Go to the settings tab in your repo, and then select Repository variables:
Add 3 variables:
- USER_NAME – The SSH user name you want Bitbucket to use to connect to your server.
- PRODUCTION_HOST – The domain Bitbucket should connect to.
- FOLDER – The folder path where the site should be deployed.
Generate an SSH key (or use your own) and add it to your server under the SSH Keys tab in Pipelines:
I generated a new key and then added the public key to ~/.ssh/authorized_keys for the account.
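If you're starting from scratch, the key setup might look roughly like this from your local machine; the key path and deploy@example.com are placeholders, and the private half is what goes into Bitbucket's SSH Keys tab:
# Generate a dedicated deploy key (no passphrase so the pipeline can use it non-interactively)
ssh-keygen -t ed25519 -f ~/.ssh/bitbucket_deploy -N ""
# Install the public half on the server account the pipeline will log in as
ssh-copy-id -i ~/.ssh/bitbucket_deploy.pub deploy@example.com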
Add this YAML snippet to your bitbucket-pipelines.yml in your root. This will use PHP 7.4, install rsync, node + npm, composer, and build the production version of the site to deploy to the specified folder.
The -avP switches for rsync give me verbose progress feedback so I can see what's happening. If you don't need the detail, switch it to -a.
image: php:7.4-fpm

pipelines:
  branches:
    master:
      - step:
          name: Jigsaw Build
          script:
            - apt-get update && apt-get install -y unzip
            - apt-get install rsync openssh-client nodejs npm -y
            - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
            - composer install
            - npm install
            - npm run production
            - rsync -avP build_production/ $USER_NAME@$PRODUCTION_HOST:$FOLDER --exclude=bitbucket-pipelines.yml --chown=www-data:www-data
I received a few errors when rsync ran. In case you run into them as well, here’s the list and fixes. The first was:
rsync: failed to set times on "$FOLDER": Operation not permitted (1)
I added --no-t to resolve that, and then got a new error:
rsync: failed to set permissions on "$FOLDER": Operation not permitted (1)
which was fixed by adding the switch --no-perms. My final rsync command became:
rsync -avP --no-t --no-perms build_production/ $USER_NAME@$PRODUCTION_HOST:$FOLDER --exclude=bitbucket-pipelines.yml --chown=www-data:www-data
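If the pipeline still misbehaves after that, I find it helps to run the same transfer by hand with the deploy key to separate SSH and permission problems from pipeline problems (user, host, key path, and destination below are placeholders):
# Manual run of the same rsync transfer using the deploy key
rsync -avP --no-t --no-perms -e "ssh -i ~/.ssh/bitbucket_deploy" \
  --exclude=bitbucket-pipelines.yml --chown=www-data:www-data \
  build_production/ deploy@example.com:/var/www/example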
How to Setup a CI/CD Pipeline for Storybook.js using Gitlab
I just spent a few hours setting up a Gitlab pipeline to deploy a Storybook.js site. Of course, the end result was much simpler than I initially made it out to be. Like everything else on my blog, I'm sharing in case anyone else can use the information to save time.
Just put this in your .gitlab-ci.yml and it'll take care of caching the node modules and building your static version of Storybook to deploy.
image: node:latest

cache:
  paths:
    - node_modules/

stages:
  - build
  - deploy

build:
  stage: build
  script:
    - npm install
    - npm run build-storybook -- -o storybook-static
  artifacts:
    paths:
      - storybook-static
  only:
    - qa
    - develop
    - master

deploy:
  stage: deploy
  script:
    # add your deploy code here
    - echo "Replace this with your deploy commands"
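Before committing the pipeline, it's worth confirming the build command behaves locally; build-storybook is the script name the Storybook installer adds to package.json by default, so adjust if yours differs:
# Build the static Storybook into storybook-static/ and preview it locally
npm run build-storybook -- -o storybook-static
# npx http-server is just one convenient way to serve the output folder
npx http-server storybook-static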
How to Clear Archive & Read-only flags on Files in Windows in Bulk
I had to move files from one system to another and kept running into problems because files had been set as read-only, had the archive flag set, or both. This caused the system to skip files, which wasn't acceptable. Normally you could just use Windows to clear the flags in bulk, but that could potentially mess up file permissions. I needed a way to automatically clear all the flags while respecting permissions.
I did some searching and didn't find a utility that would do the job, and most of the solutions I found required PowerShell, which wasn't available on the system I was on. I ended up writing a quick console application in C# to do the trick. I've made it free and open-sourced it in case anyone wants to use it.
If you need just the app, you can find the release build here with instructions. The app also prompts for input to make things a bit easier to use. There’s no install, no tracking or metrics, or anything else related to privacy concerns in this app. It’s a simple throwaway utility to get the job done and move on.
https://github.com/gregvarghese/clearflags/releases/tag/1.0.0
If you want to see the source code, that is available here:
https://github.com/gregvarghese/clearflags/
Please note that I did this in about 10 minutes for my own use, so error handling is pretty much non-existent. I mention this because I did run into one issue where Windows was somehow seeing a folder with files in it as a file; it couldn't be deleted or renamed, and the utility couldn't get past it until that was resolved. I didn't spend much time debugging and just used my Mac to rename the folder. Windows recognized it after the change, and the utility was able to continue processing.
How to execute SSH command with Bitbucket Pipelines
I inherited an old site that someone else set up; it's just basic static HTML, and it was deployed using a git pull on the server. I wanted to automate the deployment, and since the site is going to be rebuilt anyway, instead of setting up rsync I realized I could just configure the Bitbucket Pipeline to use SSH and run the pull command. This is probably a fringe case, but here's the bitbucket-pipelines.yml in case anyone finds it useful.
Add the repository variables for $USER, $SERVER, and $FOLDER with the appropriate values and then you should be able to run the deployment.
pipelines:
  default:
    - step:
        script:
          - pipe: atlassian/ssh-run:0.2.8
            variables:
              SSH_USER: '$USER'
              MODE: 'command'
              SERVER: '$SERVER'
              COMMAND: 'cd $FOLDER && git pull'
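Before leaning on the pipe, I'd confirm the exact command works over a plain SSH session with the key you registered in Pipelines; the user, host, and folder below are placeholders for the repository variables:
# Manual equivalent of what the ssh-run pipe executes
ssh deploy@example.com "cd /var/www/site && git pull"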