Hello Freelancer and Community,
My next request is a "backup" bash script that should work with rsync and keep a retention of up to 30 days for changed/deleted files. It should be possible NOT to bulk-transfer all domains' web folders, MySQL databases, etc.; only the specified ones, whose folders I can change in the bash script.
System in use:
A dedicated server with CentOS 7, with some websites hosted there, placed in /webfolder/domain-name
A second dedicated server with just the OS and a /backup/domain-name folder created
What i need is:
* A bash/shell script to execute via shell/cron
* I should be able to change the value to 1, 3, 7 or 30 in the bash script.
If 1 day is set, then:
* rsync from the live server's /webfolder/domain-name to the remote server's /backup/domain-name/file_backup_full, where a 1:1 copy is saved daily, e.g. with rsync --delete or something equally easy
* export the specified MySQL databases (not all) and rsync/copy them to the remote server's /backup/domain-name/mysql_backup
If 3, 7 or 30 days are set in the bash script by me, then:
* rsync from the live server's /webfolder/domain-name to the remote server's /backup/domain-name/file_backup_full, where a 1:1 copy is saved daily, e.g. with rsync --delete or something similar
* additionally, with 3, 7 or 30 set, it should create a new file_backup_DATE folder and copy the changed/deleted files to this folder.
* delete old backups once 3, 7 or 30 days are reached.
* copy the DB to the existing /backup/domain-name/mysql_backup/[login to view URL], ...
I just want to set the following in the bash script myself: 1. [login to view URL], 2. MySQL details, 3. whether 1, 3, 7 or 30 days of backups.
Everything else should run automatically.
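For what it's worth, the configurable part could look like this sketch. Every name and value here is an illustrative placeholder of mine, not something from the post:

```shell
#!/usr/bin/env bash
# Illustrative config block; all values below are placeholders to be edited.
RETENTION_DAYS=3                              # allowed values: 1, 3, 7, 30
DOMAINS=("example.com" "shop.example.org")    # only these web folders are backed up
DATABASES=("example_db" "shop_db")            # only these MySQL DBs are exported
WEB_ROOT="/webfolder"                         # live server layout: $WEB_ROOT/domain-name
BACKUP_HOST="backupuser@backup.example.net"   # the second dedicated server
BACKUP_ROOT="/backup"                         # remote layout: $BACKUP_ROOT/domain-name
```

With a layout like this, the rest of the script can loop over `DOMAINS` and `DATABASES` without any further editing.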
Example: website [login to view URL] with 3 days set:
Home folder on the web server: /webfolder/[login to view URL]
Day 1:
* Rsync /webfolder/[login to view URL] to the remote server's /backup/[login to view URL]
* Export MySQL as name -> name_date, then copy it to the remote server's /backup/[login to view URL]
Day 2:
* Check in /webfolder/[login to view URL] what changed since the last rsync (there are always changes).
* For the deleted/changed files found, create a "date folder" on the remote server, /backup/[login to view URL], and copy the changes into it.
* When that is done, rsync the live web folder to /backup/[login to view URL] again with rsync --delete or similar, so the FULL folder is a 1:1 copy again. The changes stay in the file_backup_23/02/2020 folder.
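As a sketch, one day's run for a single site maps nicely onto rsync's `--backup-dir` option, which does the "move changed/deleted files into a dated folder while refreshing the 1:1 mirror" step in a single pass. The function and variable names below are my own, and the DB helper assumes MySQL credentials are in `~/.my.cnf`; the destination can be a local path or `user@host:path` for the remote server:

```shell
#!/usr/bin/env bash
# Sketch only: daily_file_backup refreshes the FULL mirror and, in the
# same pass, moves every changed or deleted file into a dated side
# folder. All names are illustrative, not from the post.
daily_file_backup() {
  local src="$1"    # e.g. /webfolder/domain-name
  local full="$2"   # e.g. /backup/domain-name/file_backup_full (may be remote)
  local dated="$3"  # e.g. /backup/domain-name/file_backup_2020-02-23
  # --delete keeps the mirror 1:1; --backup/--backup-dir preserves the old
  # versions of changed files, and the deleted files, in $dated.
  rsync -a --delete --backup --backup-dir="$dated" "$src"/ "$full"/
}

# Sketch of the DB export with the name -> name_date convention:
dump_db() {
  local db="$1" outdir="$2"
  mysqldump --single-transaction "$db" | gzip > "$outdir/${db}_$(date +%F).sql.gz"
}
```

On day 1 the dated folder simply stays empty/absent (nothing has changed yet); from day 2 on it collects exactly the changed/deleted files, while the FULL folder is always the current 1:1 copy.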
Day 3 the same as day 2 :)
The last requirement is that, depending on whether 3, 7 or 30 is set, the script checks how many backups exist and deletes the oldest one if required.
Example with 3 days set:
I start a backup today; it then creates:
* today -> /backup/[login to view URL]
* tomorrow: file_backup_23/02/2020 with the changed/deleted files
* the day after tomorrow: file_backup_24/02/2020 with the changed/deleted files
At the next run, it should then delete the oldest day folder, in this case file_backup_23/02/2020, and create a new one, file_backup_25/02/2020.
The same applies to the MySQL backup files.
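The "delete the oldest when the limit is reached" step could be sketched as below. One assumption worth flagging: the dated folders need names whose alphabetical order is also chronological, e.g. file_backup_2020-02-23 rather than file_backup_23/02/2020 (a `/` cannot appear in a folder name anyway). Function and variable names are placeholders:

```shell
#!/usr/bin/env bash
# Sketch: keep only the newest $keep dated folders, delete the rest.
# Assumes names like file_backup_YYYY-MM-DD so that lexicographic sort
# is chronological. All names are illustrative placeholders.
prune_backups() {
  local backup_root="$1" keep="$2"
  ls -1d "$backup_root"/file_backup_* 2>/dev/null |
    sort -r |                   # newest first
    tail -n +"$((keep + 1))" |  # everything beyond the newest $keep
    while IFS= read -r old; do
      rm -rf -- "$old"
    done
}
```

Run on the backup server (or via ssh) right after each daily backup, this keeps exactly 3, 7 or 30 dated folders depending on the retention setting; the same pattern works for the dated MySQL dump files.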
Puhhh, that was a lot :) sorry for that, but I had no other idea how to explain it.
I think it's clear what I mean, and there are also many bash scripts out there on the net for almost this use case, but I want my "own", well-working script.
Thank you all for reading sooooo much.
Awaiting your offers.
Hello, I am an experienced system admin and I can provide you with a bash script with the functionality you are looking for. Please message me back to get started ASAP. Thank you.
I have 4+ years of experience in full-stack web development (specialized in e-commerce, PHP, HTML, CSS, MySQL, WordPress). If you hire me, I promise to give you the greatest service. Thank you…