
Closed
Posted
We are looking for an experienced data engineer with strong expertise in Apache Airflow, especially in dynamic DAG generation and scalable workflow design.

Key Requirements:
- Strong hands-on experience with Apache Airflow
- Proven experience in dynamic DAG generation
- Solid experience working with AWS, especially MWAA (Managed Workflows for Apache Airflow)
- Good understanding of modern data architecture (data pipelines, orchestration, ETL/ELT workflows)
- Ability to design scalable, maintainable, and efficient workflows
- Experience with debugging and optimizing Airflow performance

Nice to Have:
- Experience with data lakes / warehouses (e.g., S3, Redshift, Snowflake)
- Infrastructure as Code (Terraform/CloudFormation)
- CI/CD for data pipelines

Project Scope:
- Design and implement dynamic DAGs
- Optimize existing Airflow workflows
- Set up / improve the MWAA environment
- Provide best practices for scalable data pipeline architecture
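For context on the "dynamic DAG generation" requirement above: a common pattern is a parameter-driven factory that builds one DAG per config entry and registers it as a module-level global (which is how Airflow's scheduler discovers DAGs). The sketch below uses a minimal stand-in `DAG` class instead of `airflow.DAG` so it stays self-contained; the source names (`ingest_orders`, `PIPELINES`, the extract/transform/load step names) are illustrative, not from the listing.

```python
# Sketch of the dynamic DAG generation pattern (parameter-driven factory).
# In real Airflow code, `DAG` would come from `airflow` and the task strings
# would be operators; this stand-in keeps the example runnable anywhere.
from dataclasses import dataclass, field


@dataclass
class DAG:  # minimal stand-in for airflow.DAG
    dag_id: str
    schedule: str
    tasks: list = field(default_factory=list)


def create_dag(source: str, schedule: str) -> DAG:
    """Build one DAG from a config entry: extract -> transform -> load."""
    dag = DAG(dag_id=f"ingest_{source}", schedule=schedule)
    for step in ("extract", "transform", "load"):
        dag.tasks.append(f"{step}_{source}")
    return dag


# One config entry per pipeline; adding a source needs no new DAG file.
PIPELINES = [
    {"source": "orders", "schedule": "@hourly"},
    {"source": "customers", "schedule": "@daily"},
]

# Airflow discovers DAGs as module-level globals, so register each one there.
for cfg in PIPELINES:
    dag = create_dag(**cfg)
    globals()[dag.dag_id] = dag
```

The payoff of this design is that onboarding a new data source becomes a one-line config change rather than a copied-and-edited DAG file.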
Project ID: 40345603
6 proposals
Remote project
Active 18 days ago
6 freelancers are bidding on average ₹1,064 INR/hour for this job

Hi,

As per my understanding: you need an experienced data engineer to design scalable Airflow workflows with dynamic DAG generation, optimize existing pipelines, and improve your AWS MWAA setup while ensuring performance, maintainability, and best practices in modern data architecture.

Implementation approach: I will start by reviewing your current DAGs, MWAA configuration, and pipeline architecture to identify bottlenecks. Then I'll implement dynamic DAG generation using modular, parameter-driven designs to improve scalability and reuse, and optimize scheduling, task parallelism, and resource usage to enhance performance. On AWS MWAA, I'll refine the environment setup, logging, and monitoring. If needed, I'll align pipelines with S3/Redshift/Snowflake and introduce CI/CD (via Git + Terraform). The final outcome will be robust, maintainable, production-grade workflows with clear documentation.

A few quick questions:
1. Current Airflow version and MWAA setup details?
2. Main performance issues (latency, failures, scaling)?
3. Data sources and destinations (S3, Redshift, etc.)?
4. Existing DAG complexity (static vs. partially dynamic)?
5. Do you have CI/CD or IaC already in place?
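The "modular, parameter-driven" approach this proposal describes is typically backed by a config file kept alongside the DAGs (or in S3 for MWAA), validated before any DAGs are built. A minimal sketch of that loading step; the file contents, field names, and DAG ids here are hypothetical:

```python
# Sketch: load and validate pipeline definitions that drive dynamic DAGs.
# In practice CONFIG_JSON would be read from a file in the DAGs folder or
# an S3 bucket; it is inlined here so the example is self-contained.
import json

CONFIG_JSON = """
[
  {"dag_id": "sales_etl", "schedule": "@daily",  "tables": ["orders", "refunds"]},
  {"dag_id": "crm_sync",  "schedule": "@hourly", "tables": ["contacts"]}
]
"""


def load_pipeline_configs(raw: str) -> list:
    """Parse pipeline definitions and fail fast on malformed entries."""
    configs = json.loads(raw)
    for cfg in configs:
        missing = {"dag_id", "schedule", "tables"} - cfg.keys()
        if missing:
            raise ValueError(f"{cfg.get('dag_id', '?')} is missing {missing}")
    return configs


configs = load_pipeline_configs(CONFIG_JSON)
# e.g. one task per table inside each generated DAG
task_counts = {c["dag_id"]: len(c["tables"]) for c in configs}
```

Validating up front matters in Airflow specifically: a config error raised at parse time surfaces as a clear import error in the UI, rather than as a silently missing DAG.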
₹750 INR in 40 days
4.9
4.9

My name is Laiba and I believe I am a strong match for your dynamic Apache Airflow DAGs project. I have an extensive background in DevOps, including technologies like Kubernetes, Docker, and CI/CD pipelines, all highly relevant to your needs. My proficiency in managing cloud platforms such as Azure and AWS (a necessary skill for this role) helps me design and maintain secure, scalable, high-performance systems for effective software delivery.

One aspect that sets me apart from other candidates is my combination of biotech knowledge with technical engineering skills, which demonstrates my ability to apply a scientific understanding of complex domains in practical contexts, a crucial trait when dealing with data-related challenges. I am also an accomplished content writer, which further enhances my communication skills and enables me to deliver articulate documentation and precise logical analyses.

With Apache Airflow specifically, I have considerable experience not just in using the tool efficiently, but also in optimizing its performance to meet your requirements for scalability, maintainability, and efficiency. My knack for spotting potential enhancements is backed by my exposure to the Infrastructure as Code practices you mention as nice-to-haves for this project.
₹750 INR in 40 days
3.1
3.1

As a seasoned data engineer with 13+ years in the field, I bring deep experience with tools like Apache Airflow, and my skill set aligns closely with the requirements you've outlined. My understanding of Docker, AWS Lambda, S3, and related infrastructure technologies is complemented by a knack for identifying optimizations that lead to efficient workflows.

Given your project's emphasis on scalable, maintainable, and efficient workflow design, my extensive experience with modern data architecture can help reduce technical debt and support not just your current needs but also future growth. This includes leveraging tools like Redshift and adopting Infrastructure-as-Code practices through Terraform/CloudFormation, which ties directly into your nice-to-haves. I can also provide pertinent suggestions during the setup and improvement of the MWAA environment.

I am committed to delivering what I promise every time; maintaining a 100% job success rate across challenging projects for clients in varied domains is a testament to that dedication.
₹1,000 INR in 40 days
0.0
0.0

I can help you with that. I am a data engineer based in Australia, and I am happy to provide 2 hours of free service so you can gain confidence in my work.
₹1,682 INR in 40 days
0.0
0.0

Hyderabad, India
Member since Mar 31, 2026
₹750-1250 INR / hour