I'm testing Solus 4.0 on a VBox VM, prior to installing it on hardware. What I miss from other distros is a set of useful backup tools.

Normally on other distros I use TimeShift to back up system files and BackInTime to back up my user files. On Solus, I've tried Duplicity, Deja-dup, and Vorta, but none of them seem usable to me. Each either runs from the CLI only, which means memorizing lots of commands and options, or it won't accept my mounted destination folder (/mnt/Backups/Linux) and complains that it can't deal with relative paths.

Can anyone suggest Solus equivalents with GUIs that install from the stable repository and appear in the menu? I'm using Budgie, in case that makes a difference. Thanks for any help with this.

    You don't need to remember the CLI commands: once you've worked them out, put them in a script, and you can even schedule the script so it runs automatically.
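    To make that concrete, here's a throwaway sketch: the script name, path, and schedule below are all placeholders, not anything from this thread.

```shell
# Hypothetical example: save your backup commands in a script and make it
# executable. The path /tmp/backup-demo.sh is just a placeholder.
cat > /tmp/backup-demo.sh <<'EOF'
#!/bin/bash
echo "backup ran at $(date +%Y-%m-%d)"
EOF
chmod +x /tmp/backup-demo.sh

# Run it once by hand to confirm it works:
/tmp/backup-demo.sh

# Then schedule it with `crontab -e`, e.g. nightly at 02:30:
#   30 2 * * * /tmp/backup-demo.sh >> /tmp/backup-demo.log 2>&1
```

    The crontab line's five fields are minute, hour, day-of-month, month, and day-of-week; redirecting output to a log file keeps cron from mailing you the results.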

    For the GUI, I think vorta is simple & easy but you already excluded it from your list.

      WetGeek

      You can just make an alias for your command-line commands. It couldn't be easier than that, actually.

        kyrios "For the GUI, I think vorta is simple & easy but you already excluded it from your list."

        Yeah, that was because it objected to the location of my mounted destination folder. I use both Windows machines and Linux machines, so both the download and backup folders on my NAS have subdirectories for Windows and for Linux.

        At least, I think that was Vorta. Might have been deja-dup, or maybe it was both of them.

        nodq Yes, I just read about bash aliasing in O'Reilly's book "Learning the Bash Shell." I'm a retired software engineer who no longer needs to create Windows software for 10 or 12 hours a day.

        I've been using Linux at home for decades, but the only scripting I've done has been Win32 or PowerShell at work. Bash scripts are, for me, like learning a brand new language, and I'm looking forward to it now that I have the time.

        I know you asked for a GUI, but I have a backup script that I find really useful, and maybe others would like it (or maybe you'll decide to try it instead of a GUI). I require my backup software to be open source (and, for cloud backups, encrypted).

        For local backups (external HDD) I use this rsync script:

        #!/bin/bash
        
        DATE=$(date +%Y-%m-%d_%H-%M-%S)
        SOURCEDIR=/home/$USER
        TARGETDIR=/run/media/$USER/External\ Backup/rsync\ backups
        
        #Check if TARGETDIR does not exist
        if [ ! -d "$TARGETDIR" ]; then
        	echo "Target directory not found." >&2
        	exit 1
        fi
        
        #Make directories if they don't exist
        mkdir -p "$SOURCEDIR/BackupFiles/package-logs"
        mkdir -p "$SOURCEDIR/BackupFiles/gpg-backup"
        mkdir -p "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs"
        
        #Installed packages logs
        #apt-mark showmanual > "$SOURCEDIR/BackupFiles/package-logs/dpkg.log"
        eopkg li > "$SOURCEDIR/BackupFiles/package-logs/eopkg.log"
        snap list > "$SOURCEDIR/BackupFiles/package-logs/snap.log"
        flatpak list > "$SOURCEDIR/BackupFiles/package-logs/flatpak.log"
        
        #GPG public keys and ownertrust
        gpg --export --armor > "$SOURCEDIR/BackupFiles/gpg-backup/public-keys.asc"
        gpg --export-ownertrust --armor > "$SOURCEDIR/BackupFiles/gpg-backup/ownertrust.asc"
        
        #rsync command
        rsync -avPh --exclude="*.vc" --exclude-from "$SOURCEDIR/BackupFiles/exclude-paths/exclude-paths-rsync.txt" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log" --link-dest="$TARGETDIR/current/" "$SOURCEDIR/" "$TARGETDIR/$USER-$DATE/"
        ERROR1=$?
        #rsync command for *.vc files which need checksum option
        rsync -avPhm --checksum --include="*/" --include="*.vc" --exclude="*" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log" --link-dest="$TARGETDIR/current/" "$SOURCEDIR/" "$TARGETDIR/$USER-$DATE/"
        ERROR2=$?
        
        #Symlink and failed backup logic
        #copy log files to external hdd
        if [ "$ERROR1" = 0 ] && [ "$ERROR2" = 0 ]; then
        	rm -f "$TARGETDIR/current"
        	ln -s "$TARGETDIR/$USER-$DATE" "$TARGETDIR/current"
        	cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log" "$TARGETDIR/$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log"
        	cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log" "$TARGETDIR/$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log"
        else
        	mv "$TARGETDIR/$USER-$DATE" "$TARGETDIR/failed-$USER-$DATE"
        	mv "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log" "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-failed-$USER-$DATE.log"
        	mv "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log" "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-failed-$USER-$DATE.log"
        	cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-failed-$USER-$DATE.log" "$TARGETDIR/failed-$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-failed-$USER-$DATE.log"
        	cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-failed-$USER-$DATE.log" "$TARGETDIR/failed-$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-failed-$USER-$DATE.log"
        fi
        

        Basically, it uses "--link-dest" to make "timeline" backups: each run produces a folder named by date containing the state of your files at that time. When files are unchanged, "--link-dest" uses hardlinks, so the filesystem doesn't actually use extra space for duplicate files. A symlinked directory called "current" points to the most recent backup, and "--link-dest" follows this symlink to establish its hardlinks.
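        A minimal throwaway sketch of that hardlink behavior, using temp directories rather than the script's real paths: two backups of an unchanged file end up sharing one inode, so the second copy costs essentially no space.

```shell
# Sketch: how rsync --link-dest hardlinks unchanged files.
# All paths are throwaway temp directories.
WORK=$(mktemp -d)
mkdir -p "$WORK/src" "$WORK/dest"
echo "hello" > "$WORK/src/file.txt"

# First backup: nothing to link against yet.
rsync -a "$WORK/src/" "$WORK/dest/backup-1/"
ln -s "$WORK/dest/backup-1" "$WORK/dest/current"

# Second backup: unchanged files become hardlinks into backup-1.
rsync -a --link-dest="$WORK/dest/current/" "$WORK/src/" "$WORK/dest/backup-2/"

# The same inode number means both copies share one set of data blocks.
stat -c %i "$WORK/dest/backup-1/file.txt" "$WORK/dest/backup-2/file.txt"
rm -rf "$WORK"
```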

        Make sure to add directories to exclude in the exclude-paths file (in my script it's located at "$SOURCEDIR/BackupFiles/exclude-paths/exclude-paths-rsync.txt"). I use a separate command with "--checksum" for my VeraCrypt containers (I name them ".vc"), so you can most likely ignore that part.
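        For illustration, a hypothetical exclude file (one rsync pattern per line, matched relative to the source directory; a trailing "/" matches a directory, and "*" is a wildcard). These entries are just examples, not the author's actual list:

```text
.cache/
.local/share/Trash/
.thumbnails/
Downloads/
*.tmp
```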

        You are free to remove the gpg/package logs sections if you don't want them.

        The last part of the code updates the symlink properly and also copies the log files to the target directory. Without the symlink logic, "--link-dest" will not function correctly.

        For cloud backups I use rclone:

        #!/bin/bash
        
        DATE=$(date +%Y-%m-%d_%H-%M-%S)
        SOURCEDIR=/home/$USER
        REMOTE=GDrive-encrypt
        
        #Make directories if they don't exist
        mkdir -p "$SOURCEDIR/BackupFiles/backup-logs/rclone-logs"
        
        #rclone command
        rclone sync "$SOURCEDIR/" "$REMOTE:current" -Pv --drive-chunk-size 512M --filter-from "$SOURCEDIR/BackupFiles/exclude-paths/filters-rclone.txt" --backup-dir "$REMOTE:$DATE" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rclone-logs/rclone-log-$USER-$DATE.log"
        #rclone command for *.vc files which need checksum option
        rclone sync "$SOURCEDIR/" "$REMOTE:current" -Pvc --transfers 1 --drive-chunk-size 512M --filter-from "$SOURCEDIR/BackupFiles/exclude-paths/filters-vc-rclone.txt" --backup-dir "$REMOTE:$DATE" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rclone-logs/rclone-log-vc-$USER-$DATE.log"
        
        #rclone command for log files
        rclone sync "$SOURCEDIR/BackupFiles/backup-logs/" "$REMOTE:current/BackupFiles/backup-logs/" -Pv --drive-chunk-size 512M --backup-dir "$REMOTE:$DATE/BackupFiles/backup-logs/"
        

        rclone doesn't support "--link-dest", so you can use "--backup-dir" instead (also available in rsync). Rather than keeping "timeline" snapshots, it simply keeps older versions of files in date-labeled directories instead of deleting them. Files that haven't changed just stay in the main target directory.

        You have to set up your encrypted remote with rclone first, but it's relatively straightforward and uses an interactive text interface.

        rclone has "filters" that you can use if you have both "include" and "exclude" rules; see https://rclone.org/filtering/. The syntax is a little different from rsync's.
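        For example, a hypothetical filter file (rules are checked top to bottom and the first match wins; "+" includes, "-" excludes, "**" matches recursively, and a final "- *" drops everything not matched earlier). These entries are illustrations only:

```text
- .cache/**
- .local/share/Trash/**
+ Documents/**
+ Pictures/**
- *
```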

        Thank you! Unlike the script examples I've seen so far, yours is COMMENTED. I've only given it a quick look, but I'm sure I'll be able to understand it well after another chapter or two in the scripting book I'm reading.

        Hmmm ... now I'm wondering if I could just use Mono? I wrote C# daily for 15 years or so, and with your script as a guide, I could probably write the equivalent in an hour or so, including testing and debugging.