Bash Script to run a program via Spark and build Partition table
$30-250 USD
Closed
Published about 3 years ago
Paid on delivery
I have a bash script job that uses Sqoop. The job needs to be updated so that Sqoop is stripped out and replaced with Spark. The job hosts its data in Hadoop, and it should be updated so that the data is partitioned by product and year.
Challenges: You will not have access to the data and will need to create your own. You will have schemas to work from, and I will work with you to provide as much as I can to help finish this effort.
It needs to be completed within a couple of days, though the work itself should only take a few hours.
The job must use Bash scripting, SQL, and Spark.
The job is to:
1) Create several new Hive partition tables, partitioned by product and year.
2) Write a Spark program that can be used to run the current bash script job.
3) Use the same Spark program to create and update the new partition tables.
4) Strip Sqoop out of the current program and rewrite it for Spark.
5) Create automation so that this job kicks off daily.
Deliverables
Creation of partitioned hive tables by product and year.
Removal and replacement of Sqoop code with Spark code.
Automate the job to run daily, updating the partitioned Hive tables and creating a new Hive table when needed (example: there is currently no 2022 table; when 2022 data becomes available, a 2022 Hive table is automatically created).
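The daily kickoff deliverable is usually just a cron entry plus a thin wrapper script. The sketch below is a configuration fragment only: the paths, table names, and the `daily_load.py` driver are hypothetical placeholders, and the Spark driver itself would contain the actual partition logic.

```shell
#!/usr/bin/env bash
# Hypothetical daily driver; schedule it with a cron entry such as:
#   0 2 * * * /opt/jobs/run_daily_load.sh >> /var/log/daily_load.log 2>&1
set -euo pipefail

YEAR=$(date +%Y)                 # partition year for today's run
TABLE="sales_db.sales_${YEAR}"   # e.g. a new 2022 table once it is 2022

echo "Loading partitions into ${TABLE}"
spark-submit \
  --master yarn \
  --conf spark.sql.sources.partitionOverwriteMode=dynamic \
  /opt/jobs/daily_load.py --table "${TABLE}" --year "${YEAR}"
```

Setting `partitionOverwriteMode=dynamic` lets the daily run overwrite only the partitions it touches instead of the whole table, which matches the "update the partitioned table daily" requirement.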
I have reviewed the requirements of "Bash Script to run a program via Spark and build Partition table" and I can complete it well.
I am available to discuss now.
Waiting for your response.
Thanks.
Hi,
I have more than 4 years of experience in the big data field. During this period, I have developed many Spark and Hadoop applications.
I am very confident that I will be able to complete your work because of the experience I have in this field.
Kindly provide me with further details and I can get started on it right away.
Regards,
Amith