How can I back up a MySQL database in an Azure virtual machine and send backups to blob storage using an Azure Function with Python?

I have a MySQL database inside an Azure VM and I am looking for a way to write an Azure Function in Python that sends the output of a mysqldump command to a blob storage container. I am using a timer trigger to back up the database a few times per day.

In my initial tests I used os.system() to run a mysqldump shell command, stored a gzipped version of the dump in temporary storage, and then uploaded it. This worked locally in VS Code, but it doesn't work in production: the logs just report that the file was not found, even though I've tried several different locations for the temporary file.
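
For reference, my local test looked roughly like this (the host, credentials, database name, and paths below are placeholders, and the function is wired to the timer trigger):

import os
import azure.functions as func
from azure.storage.blob import BlobClient

def main(mytimer: func.TimerRequest) -> None:
    # dump the database and gzip it into temporary storage
    dump_path = "/tmp/backup.sql.gz"
    os.system(f"mysqldump -h <vm-host> -u <user> -p<password> <database> | gzip > {dump_path}")

    # upload the gzipped dump to the container
    blob_client = BlobClient.from_connection_string(
        os.environ["AzureWebJobsStorage"], "backups", "backup.sql.gz")
    with open(dump_path, "rb") as stream:
        blob_client.upload_blob(stream, overwrite=True)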

I am looking for a better way to perform these backups, ideally reading the contents of the gzipped dump into a Python object and then writing it out through the blob output binding.

Answer

I think you can do everything with the Python schedule package inside your VM; there is no need for an Azure Function. The function runs in an environment outside your VM, so it cannot access anything inside it. Try the code below to upload the .zip files directly from your VM on a schedule:

from azure.storage.blob import BlobClient
import time
import schedule

def uploadZip():
    storage_connection_string = '<your storage connection string>'
    container_name = 'backups'

    # generating the .zip file is skipped here; it is assumed to already exist at a local temp path
    tempFilePath = "d:/home/temp/test.zip"

    # name each blob with the upload timestamp
    blob_name = time.strftime('%Y-%m-%d,%H:%M:%S', time.localtime()) + ".zip"
    blob_client = BlobClient.from_connection_string(
        storage_connection_string, container_name, blob_name)
    with open(tempFilePath, 'rb') as stream:
        blob_client.upload_blob(stream)
    print("uploaded successfully")

# back up every 12 hours
# schedule.every(12).hours.do(uploadZip)
# for a quick test, upload every 5 seconds
schedule.every(5).seconds.do(uploadZip)

while True:
    schedule.run_pending()
    time.sleep(1)
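
The code above assumes the .zip file already exists. If you also want to produce it from Python on the VM, one way (assuming mysqldump is on the PATH; the user, password, and database name below are placeholders) is to pipe the dump through gzip before each upload:

import gzip
import subprocess

def createDump(out_path="d:/home/temp/test.zip"):
    # run mysqldump and gzip its output into the temp file that uploadZip() reads
    # (the file is gzip-compressed even though the example names it .zip)
    dump = subprocess.run(
        ["mysqldump", "-u", "<user>", "-p<password>", "<database>"],
        stdout=subprocess.PIPE, check=True)
    with gzip.open(out_path, "wb") as f:
        f.write(dump.stdout)

Calling createDump() at the start of uploadZip() would give each upload a fresh dump instead of re-uploading the same file.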

Result: (screenshots omitted)