System Admin required for Ubuntu / AWS CLI data offloading (backup) solution

Closed - Posted 4 years ago - Paid on delivery

I need a GNU/Linux Ubuntu System Admin with AWESOME AWS expertise. You are probably going to be a CentOS freak as well - and I'll appreciate that too (for potential ongoing work).

If you don't know how to write insane scripts and you are not all over the AWS CLI, then DON'T. BOTHER. PITCHING.

If you don't know what IAM Users are and how to craft clean policies, then log off freelancer now and consider your life choices.

If you can collaborate with me between 11am and 6pm ADST/AEST, please make that clear in your pitch. (I am based in Melbourne and prefer Slack for communicating and collaborating in real time - does anyone email anymore?)

I run a managed services agency, and I want to update the way we perform disaster recovery offsite data storage.

This will be developed on one production server, and we will roll it out to the remaining servers.

I need a script, or set of scripts, that will iterate through a configured set of directories and do the following:

target_directory=/home/user1/:/home/user2

backup_staging=/backups

s3_bucket=s3://bucket

server_ref=prod01

notification_email=myticketingsystem@mydomain. [login to view URL]
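
To make the starting point concrete, here is a minimal sketch of how I imagine the configuration and iteration working, assuming a plain Bash script and a colon-separated target_directory (all values and names below are illustrative placeholders, not final):

#!/usr/bin/env bash
set -euo pipefail

# Illustrative configuration (placeholders only; the real bucket and email are withheld above)
target_directory="/home/user1/:/home/user2"
backup_staging="/backups"
s3_bucket="s3://bucket"
server_ref="prod01"
notification_email="myticketingsystem@example.invalid"

# Split the colon-separated list into an array and walk each entry
IFS=':' read -r -a target_dirs <<< "$target_directory"
for dir in "${target_dirs[@]}"; do
    echo "Would process target directory: $dir"
done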

Iterate through all subdirectories in target_directory and create a corresponding [login to view URL] archive, including the date/time in the filename, in backup_staging.

For example, /users/fred/www.fred.com would be archived to /backups/fred/www.fred.com/www.fred.[login to view URL]

During the archiving process, the .git directory in the subdirectory root MUST NOT be included.
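
Continuing the sketch above, the archiving step could look roughly like this, assuming GNU tar and an assumed filename pattern of <site>-YYYYMMDD-HHMMSS.tar.gz (the real naming convention is masked above, so treat the pattern and paths as placeholders):

# Example subdirectory; in the real script this comes from the iteration above
site_dir="/users/fred/www.fred.com"
site_name="$(basename "$site_dir")"
stamp="$(date +%Y%m%d-%H%M%S)"

# Mirror the source layout under the staging area, e.g. /backups/fred/www.fred.com/
dest_dir="$backup_staging/fred/$site_name"
mkdir -p "$dest_dir"
archive="$dest_dir/$site_name-$stamp.tar.gz"

# --exclude='.git' drops any directory named .git, including the one at the subdirectory root
tar --exclude='.git' -czf "$archive" -C "$(dirname "$site_dir")" "$site_name"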

Once the archive is created, the script should use the AWS CLI to copy the archive to the bucket, creating a directory structure if required that mirrors the one inside the backups directory but with the server_ref first, i.e. s3://bucket/server_ref/[login to view URL][archive]
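
The upload could then derive the S3 key from the staging path, with server_ref as the leading prefix (the exact key layout is an assumption, since the example above is masked):

# Key = server_ref + the archive's path relative to the staging area
rel_path="${archive#$backup_staging/}"
s3_key="$s3_bucket/$server_ref/$rel_path"

# S3 has no real directories, so "creating the structure" is just using the prefixed key
aws s3 cp "$archive" "$s3_key"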

Once the archive is sent to S3, any archives older than 45 days should be removed from S3. (This can be done via policy, if possible/preferred.)
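
If the 45-day expiry is handled by policy rather than in the script, an S3 lifecycle rule scoped to the server prefix is one option (sketch only; bucket name and prefix are placeholders):

aws s3api put-bucket-lifecycle-configuration \
  --bucket bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-backups-after-45-days",
      "Filter": {"Prefix": "prod01/"},
      "Status": "Enabled",
      "Expiration": {"Days": 45}
    }]
  }'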

Once the archive is CONFIRMED to have been copied to S3, the local archive should be removed - i.e. /backups/fred/www.fred.com/www.fred.[login to view URL]
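
"Confirmed" could simply mean the AWS CLI exited successfully (it verifies its own upload; a stricter check could compare sizes via aws s3api head-object). A sketch, continuing from the variables above:

if aws s3 cp "$archive" "$s3_key"; then
    # Upload reported success - safe to drop the staging copy
    rm -f "$archive"
else
    # Keep the archive for a retry and flag the failure (see the email notification below)
    echo "Upload failed for $archive" >&2
fi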

The script should then process the NEXT subdirectory in the target directory.

This means one archive is created, offloaded and removed from the staging area before the next one is processed.

Once the first target_dir is processed, the next one should be processed.
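
Putting that ordering together, the main loop would be strictly sequential - create, offload, clean up, then move on - along the lines of this skeleton (the helper functions are placeholders for the steps described above):

for dir in "${target_dirs[@]}"; do
    for site_dir in "$dir"/*/; do
        create_archive "$site_dir"     || { notify_error "$site_dir"; continue; }
        upload_and_confirm "$site_dir" || { notify_error "$site_dir"; continue; }
        remove_local_archive "$site_dir"
    done
done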

If an error occurs at any time, an email should be sent.

Once the offloading is complete for all sites, an email should be sent.

The completion email should also report how long the run took to complete.
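
For the emails and timing, a sketch assuming a sendmail-capable MTA and the mailx command are available on the servers (Bash's SECONDS counter is enough for the duration); the error path would reuse the same mailx call with a different subject and body:

start_time=$SECONDS

# ... main loop from the skeleton above runs here ...

elapsed=$(( SECONDS - start_time ))
printf 'Backup offload for %s completed in %dm %ds\n' \
  "$server_ref" $(( elapsed / 60 )) $(( elapsed % 60 )) |
  mailx -s "[$server_ref] backup offload complete" "$notification_email"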

The script should be set to run at 2am each morning.
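
The schedule itself is a one-line cron entry (script path and log location are placeholders):

# m h dom mon dow  command
0 2 * * * /usr/local/bin/offload_backups.sh >> /var/log/offload_backups.log 2>&1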

You should tell me what is good/bad/indifferent about my approach (bearing in mind this will be running on 16 servers) and help me craft something even better (if that's at all possible).

Stage 2 of this will be to configure a VM (somewhere) with x TB storage to use AWS CLI Sync to copy down ALL server backups from the S3 storage to a local storage device.
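
The Stage 2 pull-down itself would be a single recurring command along these lines (bucket and local mount point are placeholders; its deletion behaviour is exactly what the next question is about):

# One-way copy of every server's backups from S3 down to local storage
aws s3 sync s3://bucket /mnt/backup-storage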

Tell me in your pitch if you know whether "aws sync" deletes files in the destination location or reflects changes in the destination back on the source.

You will need to sign an NDA before starting on this.

Also, let me know your favourite brand of coffee and what you thought of The Expanse.

Linux Amazon Web Services Ubuntu System Admin Bash Scripting

Project ID: #24130607

About the project

10 proposals - Remote project - Active 3 years ago

10 freelancers have bid an average of $494 for this job

tekzee

Hello, I have read your project REQUIREMENT and UNDERSTOOD IT COMPLETELY. We deal with ALL TYPES OF SERVER-RELATED ISSUES, as we have an expert team for this. We will be able to do it, but we need proper details regarding y More

$500 AUD in 7 days
(69 reviews)
6.9
raj00565

Hi, I understand your requirement and can fulfil it, as I'm a DevOps engineer with profound experience of 6+ years in the relevant field. Offered services: DevOps, SysOps, AWS/GCP/Cloud Linux server administration, A More

$500 AUD in 2 days
(51 reviews)
6.4
ranumehta2017

***AWS EXPERT*** More

$500 AUD in 7 days
(36 reviews)
5.2
atikullx

Hire me. I've written such a bash script to back up files (not for S3 but for remote FTP, though it's no problem to do it for S3). I can do exactly what you want. I'm a Server & Linux System Admin; AWS, Google Cloud & Microsoft Azure More

$300 AUD in 3 days
(71 reviews)
5.1
SalmanAwan

Hi mate, I hope you are doing great after writing that project description. Best one I have read in ages, better than the ones I receive from "team of consultants". I quickly reviewed my life choices and still think More

$500 AUD in 7 days
(13 reviews)
4.1
gargankit642

Nice to meet you. I am an Amazon Cloud Architect for a web infrastructure serving 90 million page impressions and 12 TB of Internet traffic per month. The AWS services I use are EC2, ELB, MySQL RDS, VPC, CloudFront, Elas More

$419 AUD in 5 days
(7 reviews)
3.9
kishansunny

I guess I'm sort of breaking your rules about not bidding, but here goes: GNU/Linux admin - check. AWS - check. I prefer Debian over CentOS :/ I'm good enough with the AWS CLI, but I need to look at the docs sometimes. 11am More

$500 AUD in 7 days
(1 review)
2.0
vckbansod12

Hello. This is Viky Bansod. I work in DevOps and Cloud Operations. My total experience is 6+ years, working on Ubuntu 14.04, 16.04, Red Hat and AWS Linux. I have a certification in AWS (Solution Archit More

$666 AUD in 7 days
(0 reviews)
0.0
HarshaPradeepR

Hello, I am a certified AWS engineer (on all 3 levels of exams) and have many years of experience working on AWS and scripting in bash/python/boto3 and much more. I have done a lot of work with clients via Slack and looki More

$555 AUD in 4 days
(0 reviews)
0.0