Multiple Automatic Backup Locations in Control Panel

Is it possible for the Control Panel to have multiple Automatic Data Backup storage locations enabled at the same time? The goal is to use third-party storage and “Computer file” together. It seems you can only use one at a time, even if both are configured properly: whichever one was saved most recently becomes active, and the other is ignored. The idea is to use third-party storage to back up to AWS S3 (which works great), and also keep a local backup on a network drive via the “Computer file” option. I can get each to work individually, but not both at the same time. Having both a remote cloud backup and a local backup would be great for data redundancy.

Is there any way to achieve this by editing a config file instead of relying on the Control Panel frontend?

Steps to reproduce: enable Automatic Data Backup, click the third-party storage tab, fill out the S3 settings, and hit Save. Then choose the “Computer file” tab, fill out a location, and hit Save. When the automatic backup runs, it only executes the backup for whichever tab you saved last, not both.

Also, on Linux, where is the config file for the Control Panel settings located?


Unfortunately, the OnlyOffice control panel only allows one backup location to be active at a time. To enable both AWS S3 and a local network drive for backups, you can create a script that handles both processes and schedule it with a cron job.

Create a Backup Script:

#!/bin/bash

# Backup to AWS S3
aws s3 sync /path/to/backup s3://your-s3-bucket/path --delete

# Backup to local network drive
rsync -av --delete /path/to/backup /path/to/network/drive

echo "Backup completed at $(date)" >> /var/log/onlyoffice/backup.log

Schedule the Script:

Make the script executable (chmod +x), then open your crontab:

crontab -e

Add the following line to run the script daily at 2 AM:

0 2 * * * /path/to/

This way, you can achieve redundancy by using both remote and local backups.
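If you prefer to drive the same two sync steps from Python (for example, to add error handling so a failed S3 upload doesn't pass silently), the idea above can be sketched as follows. All paths and the bucket name are placeholders; adjust them to your setup.

```python
import subprocess

# Placeholders -- replace with your actual paths and bucket.
BACKUP_DIR = "/path/to/backup"
S3_TARGET = "s3://your-s3-bucket/path"
NETWORK_DRIVE = "/path/to/network/drive"

def build_commands(backup_dir, s3_target, network_drive):
    """Build the two sync commands: AWS S3 first, then the local network drive."""
    return [
        ["aws", "s3", "sync", backup_dir, s3_target, "--delete"],
        ["rsync", "-av", "--delete", backup_dir, network_drive],
    ]

def run_backups():
    # check=True raises CalledProcessError if either sync fails,
    # so a broken upload stops the script instead of being ignored.
    for cmd in build_commands(BACKUP_DIR, S3_TARGET, NETWORK_DRIVE):
        subprocess.run(cmd, check=True)

# In the cron job, point the crontab entry at a small wrapper that calls run_backups().
```
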

Hope it helps!

Thank you


Thanks for the idea. I looked into the API and came up with this Python script. I'm using Ubuntu, so YMMV. Change the auth details to your server details, and local_backup_dir to where you want to save the backup on the local machine. My full code is more robust: it saves the backup to a separate local drive, and once the backup is complete, it copies it to a network share.

import requests
import time
import datetime

# Authentication details (placeholders -- replace with your portal URL and admin credentials)
portal_url = "https://your-portal-address"
username = "admin@example.com"
password = "your-password"

# Get authentication token
auth_url = f"{portal_url}/api/2.0/authentication.json"
auth_data = {"userName": username, "password": password}
response = requests.post(auth_url, json=auth_data)

if response.status_code == 201:
    token = response.json()["response"]["token"]
    print(f"Authentication successful. Token: {token}")

    # Prepare backup data
    local_backup_dir = "/tmp"  # Using /tmp as the backup directory
    backup_data = {
        "storageType": "Local",
        "storageParams": [
            {"Key": "filePath", "Value": local_backup_dir}
        ],
        "backupMail": True,
        "collection": []  # Optional: include a collection if necessary
    }

    # Start the backup
    backup_url = f"{portal_url}/api/2.0/portal/startbackup"
    headers = {"Authorization": token}
    backup_response = requests.post(backup_url, json=backup_data, headers=headers)

    if backup_response.status_code == 201:
        print("Backup started.")
        backup_info = backup_response.json()

        # Monitor progress
        while not backup_info["response"]["isCompleted"]:
            # Check progress every 5 seconds
            time.sleep(5)

            progress_url = f"{portal_url}/api/2.0/portal/getbackupprogress"
            progress_response = requests.get(progress_url, headers=headers)

            if progress_response.status_code == 200:
                backup_info = progress_response.json()
                print("Backup progress:", backup_info)

                if backup_info["response"]["error"]:
                    print(f"Backup error: {backup_info}")
                    break
            else:
                print(f"Error fetching backup progress. Status code: {progress_response.status_code}")
                break

        if backup_info["response"]["isCompleted"]:
            print("Backup completed successfully.")

            # Get backup history
            history_url = f"{portal_url}/api/2.0/portal/getbackuphistory"
            headers = {"Authorization": token}
            history_response = requests.get(history_url, headers=headers)

            if history_response.status_code == 200:
                history_data = history_response.json()

                # Find the most recent backup
                latest_backup = None
                latest_created_on = datetime.datetime.min
                for backup in history_data["response"]:
                    created_on = datetime.datetime.strptime(backup["createdOn"], '%Y-%m-%dT%H:%M:%S')
                    if created_on > latest_created_on:
                        latest_created_on = created_on
                        latest_backup = backup

                if latest_backup:
                    backup_filename = latest_backup["fileName"]
                    print("Backup filename:", backup_filename)

                    # ... (Optional: Use backup_filename to retrieve or manage the backup file)
                else:
                    print("Warning: No backup history found.")

            else:
                print(f"Error fetching backup history. Status code: {history_response.status_code}")

    else:
        print(f"Backup initiation failed. Status code: {backup_response.status_code}")

else:
    print(f"Authentication failed. Status code: {response.status_code}")
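For reference, the "more robust" last step I mentioned (copying the finished backup file to a network share) can be sketched like this, assuming the share is already mounted. The function name and paths are illustrative, not part of the OnlyOffice API:

```python
import os
import shutil

def copy_backup_to_share(backup_path, share_dir):
    """Copy a finished backup file to an already-mounted network share.

    Both arguments are hypothetical paths -- adjust to your environment.
    """
    os.makedirs(share_dir, exist_ok=True)
    dest = os.path.join(share_dir, os.path.basename(backup_path))
    shutil.copy2(backup_path, dest)  # copy2 also preserves the file's timestamps
    return dest
```

You would call this after the backup completes, e.g. with the file name returned by the backup history lookup above.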