I know you asked for a GUI, but I have a backup script that I find really useful, and maybe others will like it (or maybe you'll decide to try it instead of a GUI). I require my backup software to be open source (and, for cloud backups, encrypted).
For local backups (to an external HDD) I use this rsync script:
#!/bin/bash
DATE=$(date +%Y-%m-%d_%H-%M-%S)
SOURCEDIR=/home/$USER
TARGETDIR="/run/media/$USER/External Backup/rsync backups"
# Abort if TARGETDIR does not exist (e.g. the drive isn't mounted)
if [ ! -d "$TARGETDIR" ]; then
    echo "Target directory not found." >&2
    exit 1
fi
# Create directories if they don't exist
mkdir -p "$SOURCEDIR/BackupFiles/package-logs"
mkdir -p "$SOURCEDIR/BackupFiles/gpg-backup"
mkdir -p "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs"
# Logs of installed packages
#apt-mark showmanual > "$SOURCEDIR/BackupFiles/package-logs/dpkg.log"
eopkg li > "$SOURCEDIR/BackupFiles/package-logs/eopkg.log"
snap list > "$SOURCEDIR/BackupFiles/package-logs/snap.log"
flatpak list > "$SOURCEDIR/BackupFiles/package-logs/flatpak.log"
# GPG public keys and ownertrust (ownertrust output is plain text, so no --armor)
gpg --export --armor > "$SOURCEDIR/BackupFiles/gpg-backup/public-keys.asc"
gpg --export-ownertrust > "$SOURCEDIR/BackupFiles/gpg-backup/ownertrust.asc"
# rsync command
rsync -avPh --exclude="*.vc" --exclude-from "$SOURCEDIR/BackupFiles/exclude-paths/exclude-paths-rsync.txt" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log" --link-dest="$TARGETDIR/current/" "$SOURCEDIR/" "$TARGETDIR/$USER-$DATE/"
ERROR1=$?
# rsync command for *.vc files, which need the checksum option
rsync -avPhm --checksum --include="*/" --include="*.vc" --exclude="*" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log" --link-dest="$TARGETDIR/current/" "$SOURCEDIR/" "$TARGETDIR/$USER-$DATE/"
ERROR2=$?
# Symlink and failed-backup logic;
# also copy the log files to the external HDD
if [ "$ERROR1" = 0 ] && [ "$ERROR2" = 0 ]; then
    rm -f "$TARGETDIR/current"
    ln -s "$TARGETDIR/$USER-$DATE" "$TARGETDIR/current"
    cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log" "$TARGETDIR/$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log"
    cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log" "$TARGETDIR/$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log"
else
    mv "$TARGETDIR/$USER-$DATE" "$TARGETDIR/failed-$USER-$DATE"
    mv "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-$USER-$DATE.log" "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-failed-$USER-$DATE.log"
    mv "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-$USER-$DATE.log" "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-failed-$USER-$DATE.log"
    cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-failed-$USER-$DATE.log" "$TARGETDIR/failed-$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-failed-$USER-$DATE.log"
    cp "$SOURCEDIR/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-failed-$USER-$DATE.log" "$TARGETDIR/failed-$USER-$DATE/BackupFiles/backup-logs/rsync-logs/rsync-log-vc-failed-$USER-$DATE.log"
fi
Basically, it uses "--link-dest" to make "timeline" backups: each run creates a folder named by date containing the state of your files at that time. When a file is unchanged, "--link-dest" hard-links it to the previous copy, so the filesystem doesn't actually use extra space for duplicates. A symlink directory called "current" points to the most recent backup, and "--link-dest" uses this symlink to establish its hardlinks.
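The space savings are easy to see with plain hard links. This is just a minimal sketch with made-up paths (GNU coreutils assumed), not part of the backup script:

```shell
# Two directory entries pointing at the same inode: content is stored once.
mkdir -p linkdemo
echo "same content" > linkdemo/a.txt
ln linkdemo/a.txt linkdemo/b.txt       # hard link, like --link-dest creates
stat -c '%i %h' linkdemo/a.txt         # inode number and link count (now 2)
stat -c '%i %h' linkdemo/b.txt         # same inode as a.txt
```

When rsync does transfer a changed file, it writes a temporary file and renames it into place, so the updated copy gets a fresh inode and the older snapshots stay untouched.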
Make sure to add the directories you want excluded to the exclude-paths file (in my script it's located at "$SOURCEDIR/BackupFiles/exclude-paths/exclude-paths-rsync.txt"). I use a separate command with "--checksum" for my VeraCrypt containers (I name them ".vc"), so you can most likely ignore that part.
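If you're not sure what that file should look like, here's a hypothetical starting point (the patterns are examples, not my actual list; rsync matches them against paths relative to the source directory):

```shell
# Write an example exclude-paths file (adjust patterns to taste)
mkdir -p "$HOME/BackupFiles/exclude-paths"
cat > "$HOME/BackupFiles/exclude-paths/exclude-paths-rsync.txt" <<'EOF'
.cache/
.local/share/Trash/
Downloads/
*.tmp
EOF
```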
You are free to remove the gpg/package logs sections if you don't want them.
The last part of the script updates the symlink properly and also copies the log files to the target directory. Without the symlink logic, "--link-dest" will not function correctly on the next run.
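As an aside, GNU ln's "-sfn" can do the repointing in one step (without it, "ln -sf" onto an existing symlink-to-directory would create the new link *inside* that directory). A small sketch with made-up paths:

```shell
mkdir -p backups/2024-01-01 backups/2024-01-02   # example snapshot dirs
ln -sfn backups/2024-01-01 current               # first run
ln -sfn backups/2024-01-02 current               # later run repoints "current"
readlink current                                 # prints backups/2024-01-02
```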
For cloud backups I use rclone:
#!/bin/bash
DATE=$(date +%Y-%m-%d_%H-%M-%S)
SOURCEDIR=/home/$USER
REMOTE=GDrive-encrypt
# Create the log directory if it doesn't exist
mkdir -p "$SOURCEDIR/BackupFiles/backup-logs/rclone-logs"
# rclone command
rclone sync "$SOURCEDIR/" "$REMOTE:current" -Pv --drive-chunk-size 512M --filter-from "$SOURCEDIR/BackupFiles/exclude-paths/filters-rclone.txt" --backup-dir "$REMOTE:$DATE" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rclone-logs/rclone-log-$USER-$DATE.log"
# rclone command for *.vc files, which need the checksum option
rclone sync "$SOURCEDIR/" "$REMOTE:current" -Pvc --transfers 1 --drive-chunk-size 512M --filter-from "$SOURCEDIR/BackupFiles/exclude-paths/filters-vc-rclone.txt" --backup-dir "$REMOTE:$DATE" --log-file="$SOURCEDIR/BackupFiles/backup-logs/rclone-logs/rclone-log-vc-$USER-$DATE.log"
# rclone command for the log files
rclone sync "$SOURCEDIR/BackupFiles/backup-logs/" "$REMOTE:current/BackupFiles/backup-logs/" -Pv --drive-chunk-size 512M --backup-dir "$REMOTE:$DATE/BackupFiles/backup-logs/"
rclone doesn't support "--link-dest", so I use "--backup-dir" instead (also available in rsync). Rather than keeping "timeline" snapshots, it moves the older versions of changed or deleted files into date-labeled directories instead of deleting them. Unchanged files simply stay in the main target directory.
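In other words, the effect of "--backup-dir" is roughly this (a plain-shell illustration with made-up names, not an rclone command):

```shell
mkdir -p current 2024-01-02_10-00-00
echo "old version" > current/notes.txt
# On the next sync, the previous copy of a changed file is moved into
# the dated directory before the new copy lands in "current":
mv current/notes.txt 2024-01-02_10-00-00/notes.txt
echo "new version" > current/notes.txt
```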
You have to set up your encrypted remote with rclone first, but it's relatively simple and uses an interactive text interface.
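For reference, `rclone config` ends up writing something like this to its config file (all values here are placeholders; a "crypt" remote wraps an underlying remote such as Google Drive):

```ini
[GDrive]
type = drive
# ...OAuth token and scope written for you by "rclone config"...

[GDrive-encrypt]
type = crypt
remote = GDrive:backup
password = *** obscured by rclone ***
password2 = *** obscured by rclone ***
```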
rclone has "filters" that you can use when you have both "include" and "exclude" rules. See https://rclone.org/filtering/. The syntax is a little different from rsync's.
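As a hypothetical example of that syntax (not my actual filter list): rules are checked top to bottom and the first match wins, "**" matches across directory separators, and a final "- **" drops everything not explicitly included:

```shell
# Write an example rclone filter file
mkdir -p "$HOME/BackupFiles/exclude-paths"
cat > "$HOME/BackupFiles/exclude-paths/filters-rclone.txt" <<'EOF'
- .cache/**
- *.vc
+ Documents/**
+ Pictures/**
- **
EOF
```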