PowerShell script for sysadmins to regularly download backups from a server to a local computer

Recently, a friend of mine had her VPS provider suddenly go out of business without notice. Unfortunately, to save money, she had chosen a niche provider, one that didn't even offer any way to download her data. Worse, she wasn't in the habit of downloading backups regularly. Overnight, her site disappeared, along with all the data on it.

Shocked by this, I wrote a PowerShell script that regularly and automatically downloads backups from your server to your local computer, without requiring you to purchase any additional object storage service or spend a penny on anything else. All you need is a Windows computer that you use regularly, has plenty of hard drive space, and is connected to the Internet.

The script automatically cleans up local backups older than 5 days. You can adjust the backup interval and how long backups are kept locally to suit your needs.


# Configuration: adjust these to match your server
$ssh_port = 22
$ssh_address = "username@your.site"
# Local backups older than this many days will be deleted
$keep_days = 5

# Make sure the local backup folder exists before downloading into it
New-Item -ItemType Directory -Force -Path .\backups | Out-Null


Write-Output "Starting Discourse backup download task..."
Write-Output '------------------------'
Write-Output "Fetching the latest backup file..."
Write-Output ''

while ($true) {

    $filename = ''

    while ($true) {
        try {
            Write-Output "> ssh -p $ssh_port $ssh_address 'cd /var/discourse/shared/standalone/backups/default/ && ls -t | head -n 1'"
            Write-Output ''
            $filename = ssh -p $ssh_port "$ssh_address" 'cd /var/discourse/shared/standalone/backups/default/ && ls -t | head -n 1'
            # Native commands don't throw, so surface a non-zero exit code as an error
            if ($LASTEXITCODE -ne 0) { throw "ssh exited with code $LASTEXITCODE" }
            break
        }
        catch {
            $filename = ''
    
            Write-Output "Failed to fetch... Here is the log:"
            Write-Output '-------------'
            Write-Output $_
            
            $answer = Read-Host "Do you want to re-fetch? (y/N)"
            if ($answer -ne 'y') {
                break
            }
            Write-Output ''
        }
    }
    
    
    if ([String]::IsNullOrEmpty($filename)) {
        Write-Output 'Error: failed to fetch the backup file name'
        Write-Output ''
        $answer = Read-Host 'Retry? (y/N)'

        if ($answer -ne 'y') {
            exit 1
        }
        # Otherwise fall through; the outer loop retries the fetch
    }
    else {

        Write-Output "Latest backup: $filename"
        Write-Output ''
        
        $need_download = $true
        if (Test-Path ".\backups\$filename") {
            $answer = Read-Host ".\backups\$filename already exists. Do you want to download it again?(y/N)"
            Write-Output ''
            if ($answer -ne 'y') {
                $need_download = $false
            }
        }
        if ($need_download) {
            Write-Output "Start downloading..."
            Write-Output ''
            
            while ($true) {
                try {
                    Write-Output "> scp -P $ssh_port ${ssh_address}:/var/discourse/shared/standalone/backups/default/$filename .\backups\"
                    Write-Output ''

                    # Note: scp takes the port as capital -P (lowercase -p preserves file times)
                    scp -P $ssh_port "${ssh_address}:/var/discourse/shared/standalone/backups/default/$filename" .\backups\
                    if ($LASTEXITCODE -ne 0) { throw "scp exited with code $LASTEXITCODE" }
                    
                    Write-Output "Download completed"
                    Write-Output ''
                    
                    break
                }
                catch {

                    Write-Output "Download failed >_<... The following is the log:"
                    Write-Output ''

                    Write-Output $_
                    
                    $answer = Read-Host "Download again? (y/N)"
                    Write-Output ''
                    if ($answer -ne 'y') {
                        break
                    }
                }
            }

        }
  
        Write-Output "Trying to clean up old backup files..."
        Write-Output ''

        $count = 0
        $backupfiles = Get-ChildItem -Path .\backups\
  
        foreach ($file in $backupfiles) {
        if ($file.CreationTime -le (Get-Date).AddDays(-$keep_days)) {
                try {
                    Write-Output "Deleting old backup file $file ..."
                    Write-Output ''
                    $file.Delete()
                    $count = $count + 1
                } catch {
                    Write-Output "An error occurred while deleting old backup file $file >_<"
                    Write-Output '-------------------'
                    Write-Output $_
                    Write-Output '-------------------'
                }
            }
        }

        if ($count -gt 0) {
            Write-Output "Cleaned up $count old backup files"
            Write-Output ''
        }
        else {
            Write-Output 'No old backup files to clean up'
            Write-Output ''
        }

        Pause
  
        exit 0
  
    }
  
  
}


Save the above script as scriptname.ps1 in the folder where you wish to store the backups. Right-click it and try "Run with PowerShell". If it succeeds, you can proceed to the next step.
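
If Windows blocks the script because of the PowerShell execution policy, you can launch it from a console with the policy bypassed for that one run. A minimal example; the path is a placeholder for wherever you saved the script:

powershell.exe -ExecutionPolicy Bypass -File "C:\path\to\scriptname.ps1"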

To schedule a task

  1. Search “Scheduled Tasks”.
  2. Double-click Add Scheduled Task. The Scheduled Task Wizard appears.
  3. Click Next, then click Browse. The Select Program to Schedule dialog appears.
  4. Navigate to the script that you created, click it, then click Open. You are returned to the Scheduled Task Wizard.
  5. Provide a name for the task, or keep the default (the filename), specify how often to run the script, then click Next.
  6. Specify the starting time and date (if you chose Daily, Weekly, or Monthly) and the recurrence, then click Next. This schedule should match the automatic backup cycle of your Discourse instance.
  7. Type the user name and password for the account that will run the script, then click Next.
  8. If you want to configure advanced properties, select the check box, then click Finish.
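
Alternatively, you can register the task from PowerShell itself with the ScheduledTasks cmdlets. A minimal sketch, assuming the script was saved as C:\backup\scriptname.ps1 and should run daily at 3:00 (the path, task name, and time are placeholders); -WorkingDirectory matters because the script downloads into the relative path .\backups\:

# Hypothetical path and schedule - adjust to your own setup
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -File "C:\backup\scriptname.ps1"' -WorkingDirectory 'C:\backup'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Discourse backup download' -Action $action -Trigger $trigger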

A good reminder of how important it is to have remote backups. Another meta user suffered the same issue a few months ago.

One of my instances uses the built-in S3 backups feature; the others use rclone and a cron task to send backups to Google Drive. There's a guide for this: Use rclone to sync backups to Dropbox or Google Drive
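
For reference, the heart of that approach is one rclone command run by cron on the server. A minimal sketch, assuming a Google Drive remote named gdrive has already been set up with rclone config (the remote name, target folder, and schedule are placeholders):

# crontab entry: sync the Discourse backup folder to Google Drive daily at 4:00
0 4 * * * rclone sync /var/discourse/shared/standalone/backups/default/ gdrive:discourse-backups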

Thanks for sharing your script, Linca :+1:


Indeed a very good reminder. I did suggest (here) that the update function could check and warn if the backup files' last-accessed times are too far in the past. We update every two or three months, so that check wouldn't run too often, and the time of an update is a good time to be sure there's a safe backup. Any means of copying a backup file to somewhere else should update the last-accessed timestamp.


Good point, and thanks for the script. Let me share an rsync version :slight_smile:

In the example below, the SQL backups and the attachments are synchronized separately, so there is no need to include attachments in the backup file. There is no need to delete old backups either, since rsync's --delete option keeps the local copy mirrored to the server.

I use Chocolatey to install cwrsync. After installation, you just need to add a few lines at the end of the cmd file and create a schedule:

C:\ProgramData\chocolatey\lib\rsync\tools\cwrsync.cmd

For example:

rsync -rvm --delete --ignore-errors --ignore-existing --size-only --chmod=ugo=rwX -e "ssh -i /cygdrive/c/Users/user1/.ssh/id_rsa" login@host:/var/discourse/shared/standalone/backups/default/ /cygdrive/d/backup/forum/db/

rsync -rvm --delete --ignore-errors --ignore-existing --size-only --chmod=ugo=rwX -e "ssh -i /cygdrive/c/Users/user1/.ssh/id_rsa" login@host:/var/discourse/shared/standalone/uploads/ /cygdrive/d/backup/forum/uploads/

Note 1: Use /cygdrive/c/ instead of C:; read the cmd file for reference and syntax.

Note 2: You can modify the command to use a password instead of an SSH key (PEM format).

Note 3: If you don't put a slash after the source folder (e.g. dump/), rsync will copy the folder itself, but if you do, it will copy only the contents of the folder.
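
For example (illustrative paths, other options as above):

login@host:/source/dump  /cygdrive/d/backup/   -> creates d:\backup\dump\...
login@host:/source/dump/ /cygdrive/d/backup/   -> puts the files directly into d:\backup\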

Note 4: If Chocolatey upgrades cwrsync, it creates a backup of the previous cmd file in the same folder. You need to copy your commands into the new cmd file manually for the backups to continue.

Important: Set up the same kind of rule in the Task Scheduler as the OP described. The scheduler command line is:

C:\ProgramData\chocolatey\lib\rsync\tools\cwrsync.cmd >> d:\backup\cwrsync.txt 2>&1

It will append the output to the file cwrsync.txt. That’s all.


I'm in the habit of setting up rclone to automatically copy the backups to a remote (a self-hosted NAS at home) out of paranoia about just this kind of thing happening out of nowhere. This post serves very well as a reminder that we shouldn't put all our eggs in one basket.


While you’re at it, you should also get a copy of everything in /var/discourse/containers.
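
With the same connection details as the script above, a one-off copy could look like this (a sketch; -r copies the directory recursively, and -P is the port as in the script):

scp -P 22 -r username@your.site:/var/discourse/containers .\containers\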


Good idea! Will deploy an update


Where do I search for "Scheduled Tasks"? (A newbie asking for advice, thank you!)

"Scheduled Tasks" here means the Task Scheduler. On Windows 10 and later, just press the Win key and start typing to search for it.



Hi. How come I get this error?

For anyone reading this, here are better/more up-to-date instructions since I slightly struggled with the OP’s instructions:

1) Make or use a folder where this script will be stored. Then create a folder called backups inside it (the script downloads into .\backups\).
2) Copy/paste the script into Notepad. NOTE: If using a VPS, use your VPS username and public IP address in the $ssh_address = "username@your.site" line. Save it as anyname.ps1 and don't forget to set the "Save as type" drop-down to "All Files."

3) Press the Windows key ⊞ and type "Tasks," then click "Task Scheduler."
4) On the right-hand side, click "Create Basic Task."
5) Give it any name/description.
6) Pick whether you want it daily/weekly/monthly/etc. I recommend Daily.
7) Set the date and time you want this to run, and keep Recur at 1. Then make sure you go into your Discourse site's Admin settings > Backups and check the box next to "automatic backups enabled."
Then set the backup frequency to whatever you want. I recommend 1, but the key here is to have it match your Task Scheduler settings. Make the backup dates line up. Example: if you set Task Scheduler to back up daily, set your Admin backup frequency setting to 1.
8) Leave the default setting at "Start a program" and click Next.
9) Click Browse and find the script you saved. Click Next.
10) Click Finish.
If you want to test it, right-click the script and choose "Run with PowerShell."

Note: If you get "the system cannot find the file specified," just enter your password at the PowerShell prompt and it will find the file anyway.
