Change File and Folder Permissions for WordPress

Sometimes copying files to the server produces the wrong file permissions. These two commands will get the majority of your files and folders back to the standard 0755 for directories and 0644 for files. You’ll also need to update the .htaccess and probably the updates folder with more restrictive permissions.

cd ~/public_html
find . -type d -exec chmod 0755 {} +   # directories: rwxr-xr-x
find . -type f -exec chmod 0644 {} +   # files: rw-r--r--
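
For example, if your web server runs PHP as your own user, you can lock .htaccess down further than the defaults above (a hedged example; the right value depends on your host):

chmod 0600 ~/public_html/.htaccess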

Batch Resize Images from Command Line

Quick tip if you need to batch resize a number of photos from the command line:

Step #1: Install ImageMagick
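
On Debian or Ubuntu, for example, that is typically:

sudo apt-get install imagemagick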

Step #2: Run the following command, replacing the brackets with the actual paths. This will resize each image to 1024px wide and scale the height proportionally.

mogrify -resize 1024 -quality 100 -path [full path to new file folder] [path to photos to resize]

Here is an example of the full script:

mogrify -resize 1024 -quality 100 -path /home/user/resized /home/user/original/*.jpg
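
If you only want to shrink photos that are wider than 1024px and leave smaller ones untouched, ImageMagick’s “>” geometry flag handles that (quote it so the shell does not treat it as a redirect):

mogrify -resize '1024>' -quality 100 -path /home/user/resized /home/user/original/*.jpg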

PostgreSQL Automatic Backups with Redundant Storage

We recently installed Canvas, an LMS by Instructure, for a local area school. It was a surprisingly painless setup process for the application itself using the Production Start Guide. Because we’re on a VPS, though, we needed a way to back up the user files and the databases. AWS S3 took care of the files. However, the database was on the VPS. We view our VPS (or EC2 instances) as temporary, so storing information that needs to be durable there is risky. To mitigate this risk, we’ve implemented the following solution:

Solution

Create a daily backup of the database and transfer the files via SSH to an external server that provides redundant file storage.

How to Set Up the Solution

Step #1: Create an SSH Key for the user that will run the cron jobs

Follow these steps to set up the SSH key. Do NOT set a passphrase, so the transfer can run unattended on the server. After creating the SSH keys, copy the contents of the “.pub” key file and append it to the authorized_keys file on the external server you are using.
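
As a sketch (the key type is a common default, and backupuser@external-server is a placeholder for your own remote login):

ssh-keygen -t rsa -b 4096               # press Enter at the passphrase prompt to leave it empty
ssh-copy-id backupuser@external-server  # or append ~/.ssh/id_rsa.pub to the remote authorized_keys by hand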

Step #2: Create a .pgpass file

For the database backup to run without prompting for a password, you need to provide the credentials in a .pgpass file in the home directory of the user that will run the scripts. First, SSH into the server as this user, then follow the instructions below.

cd ~
touch .pgpass
vi .pgpass

Insert the following into the .pgpass file, replacing username and password with your PostgreSQL credentials (the format is hostname:port:database:username:password, and * is a wildcard):

*:*:*:username:password

Back at the shell, run the following command. PostgreSQL will ignore the .pgpass file if its permissions are any looser than this.

chmod 0600 ~/.pgpass

Here is some more info on the topic: http://wiki.postgresql.org/wiki/Pgpass
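
To confirm the file is being picked up, try connecting as that user; if the server requires password authentication, you should not be prompted (username and database_name are placeholders):

psql -h localhost -U username -d database_name -c "SELECT 1;"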

Step #3: Create Backup Directories

We want two directories: one where new backups are written, and one we move the files into after they have been transferred. That way we can also delete the already-transferred backups in the moved directory on a cron.

cd /var
mkdir -p backups/database
mkdir -p moved-backups/database

Step #4: Create Scripts Directory & Files

cd /var
mkdir scripts
cd scripts
touch dir_transfer.sh
touch pg_backup_rotated.sh
touch pg_backup.config
touch pg_backup.sh

I got the “pg” files from http://wiki.postgresql.org/wiki/Automated_Backup_on_Linux. The contents of all of these files can be found below; add them with a text editor (like vim).
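
dir_transfer.sh is our own. Here is a minimal sketch of what it might look like (not the exact file, and backupuser@external-server plus the destination directory are placeholders you replace with your own):

#!/bin/bash
# dir_transfer.sh (sketch): copy finished backups to the external server,
# then move them into the "moved" directory so a separate cron can prune them.

REMOTE="backupuser@external-server"   # placeholder
REMOTE_DIR="/srv/backups/database"    # placeholder
LOCAL_DIR="/var/backups/database"
MOVED_DIR="/var/moved-backups/database"

# Copy everything currently in the backup directory over SSH.
scp -r "$LOCAL_DIR"/* "$REMOTE:$REMOTE_DIR/" || exit 1

# Only move the files locally once the transfer has succeeded.
mv "$LOCAL_DIR"/* "$MOVED_DIR"/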

Step #5: Test the Scripts

To perform a backup and transfer the backup files to the remote server, run the following command.

/var/scripts/pg_backup.sh && /var/scripts/dir_transfer.sh
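
Assuming dir_transfer.sh moves transferred files into the moved directory (as in the sketch above), you can confirm everything ran by listing it:

ls -lh /var/moved-backups/database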

Step #6: Set Up a Cron Job for Automated Backups

Almost there! We just need to set up the cron job that runs our scripts. Remember, this should be done as the same user you used to set up everything else.

crontab -e

That command opens the crontab in your default text editor. Insert the following line below the comments; it runs the backup and transfer every day at 3:00 AM.

0 3 * * * /var/scripts/pg_backup.sh && /var/scripts/dir_transfer.sh > /dev/null 2>&1
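
If you also want to prune the already-transferred backups, as mentioned in Step #3, here is a hedged example that deletes anything older than 30 days (adjust the retention to suit):

0 4 * * * find /var/moved-backups/database -type f -mtime +30 -delete > /dev/null 2>&1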

There we go! We are all set. Just make sure the cron daemon is running.

service cron status

That should output something like “cron start/running, process #”
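
On systemd-based distributions the equivalent check is (the service may be named cron or crond depending on the distro):

systemctl status cron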

That’s it! Hope it helps.

Files

dir_transfer.sh

pg_backup.config

pg_backup.sh

pg_backup_rotated.sh