
Open
Published
•
Ends in 1 hour
Paid on delivery
Job Title: Data Architect — Azure & Databricks

Key Responsibilities
• Design and govern scalable data platforms using Azure and Databricks.
• Define end-to-end data architecture (ingestion, storage, processing, modeling, serving) based on Lakehouse and governance principles.
• Establish data modeling standards and architecture best practices.
• Architect and optimize data pipelines using Databricks, Delta Lake, and Azure Data Factory.
• Implement CI/CD and environment management using Databricks Asset Bundles and automation tools.
• Drive performance optimization across Spark, SQL, and Python workloads.
• Apply software engineering best practices (modular design, testing, CI/CD) to data solutions.
• Partner with business and analytics teams to translate requirements into scalable architectures.
• Provide technical leadership, mentorship, and architectural governance.
• Evaluate and adopt new Azure and Databricks capabilities to improve platform scalability and efficiency.

Required Experience & Skills
• 10+ years in Data Architecture or Data Engineering, including 5+ years hands-on with Azure and Databricks.
• Proven expertise in modern data architectures (Lakehouse, Delta Lake, governance, metadata management).
• Strong experience with:
  • Databricks (Delta Lake, SQL, Workflows, Unity Catalog, MLflow basics, Asset Bundles)
  • Azure Data Services (Azure Data Factory, Azure Data Lake, Azure SQL; Synapse is a plus)
  • Python, SQL, and Spark with performance optimization experience
• Experience with data modeling, lineage, cataloging, and security frameworks.
• Demonstrated ability to lead data modernization initiatives supporting analytics, ML, and AI.

Professional Attributes
• Strategic mindset with strong execution focus.
• Excellent cross-functional communication skills.
• Results-driven and effective in fast-paced environments.
• Passion for mentoring teams and building high-quality data systems.

Preferred Qualifications
• Experience using AI-assisted development tools (e.g., Cursor, GitHub Copilot).
• Exposure to Generative AI and LLMOps capabilities within Databricks.
Project ID: 40266264
19 proposals
Open for bids
Remote project
Active 5 days ago
19 freelancers are bidding an average of ₹55,789 INR for this project

Hi,

As per my understanding: You are seeking a senior Data Architect with deep Azure and Databricks expertise to design and govern a scalable Lakehouse-based data platform. The role requires hands-on architecture across ingestion, Delta Lake modeling, CI/CD with Databricks Asset Bundles, performance tuning (Spark/SQL/Python), governance (Unity Catalog, lineage, security), and leadership in data modernization initiatives supporting analytics, ML, and AI.

Implementation approach:
• I would begin with a current-state assessment of your Azure estate, workloads, data domains, and governance posture.
• Target architecture: Azure Data Lake Gen2 + Databricks Lakehouse with Delta Lake, medallion (Bronze/Silver/Gold) layering, Unity Catalog for governance, and ADF for orchestrated ingestion.
• CI/CD via Databricks Asset Bundles + Azure DevOps/GitHub Actions.
• Performance optimization across partitioning, Z-ordering, cluster sizing, and Photon engine usage.
• Data modeling standards (dimensional + domain-driven patterns), lineage tracking, and security segmentation (RBAC + row-level controls).
• Leadership includes architecture reviews, mentoring engineers, and roadmap alignment.

A few quick questions:
• Current data platform maturity level?
• Estimated data volume (TB/PB scale)?
• Multi-region or single-region Azure setup?
• Existing DevOps tooling preference?
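The Asset Bundle CI/CD pattern this proposal describes is driven by a `databricks.yml` file at the repository root. A minimal sketch, with hypothetical workspace hosts, job names, and notebook paths (not from the proposal):

```yaml
# databricks.yml — minimal Asset Bundle sketch (placeholder names/hosts)
bundle:
  name: lakehouse-platform

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-dev.azuredatabricks.net   # placeholder
  prod:
    mode: production
    workspace:
      host: https://adb-prod.azuredatabricks.net  # placeholder

resources:
  jobs:
    bronze_to_silver:
      name: bronze-to-silver
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ../src/bronze_to_silver.py
```

A CI pipeline would then promote environments with `databricks bundle deploy -t dev` and `databricks bundle deploy -t prod`, which is what "environment management using Databricks Asset Bundles" typically amounts to in practice.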
₹40,000 INR in 20 days
4.9

I understand you require a Data Architect skilled in Azure and Databricks to design scalable data platforms and optimize pipelines using Delta Lake and Azure Data Factory. Your focus on end-to-end data architecture, governance principles, and performance tuning across Spark, SQL, and Python is clear. You also need expertise in data modeling standards and CI/CD implementation with Databricks Asset Bundles.

With over 15 years of experience and more than 200 projects completed, I specialize in Python, SQL, and MongoDB database design, alongside cloud and DevOps practices. My background includes building robust data architectures and integrating complex workflows that align well with your need for scalable Lakehouse solutions and data governance frameworks.

For your project, I will architect a modular data platform leveraging Azure Data Factory for ingestion and Databricks for processing, ensuring efficient pipeline orchestration and performance optimization. I’ll implement CI/CD automation to maintain environment consistency and apply best practices in data modeling and security. A realistic timeline of 8–10 weeks suits the scope, allowing thorough testing and documentation.

Let’s discuss how I can help you build a high-quality, scalable data architecture tailored to your business needs.
₹41,250 INR in 7 days
2.1

As a seasoned data architect and engineer with over a decade of experience in the field, I believe my expertise in Azure and Databricks aligns well with your data architecture needs. I have thorough knowledge of modern data architectures, including the Lakehouse concept, Delta Lake, governance, and metadata management, which are at the heart of this project's core responsibilities. Moreover, my strategic mindset and extensive experience leading data modernization initiatives will ensure an innovative and effective approach to your architecture design.

In my tenure as a data professional, I've successfully designed and managed scalable data platforms using Azure and Databricks, which is exactly what you need. Additionally, my skills extend to driving optimization across Spark, SQL, and Python workloads, an asset for your team to enhance performance throughout the pipelines. With my strong execution focus paired with a results-driven approach, we'll be able to implement CI/CD and environment management using Databricks Asset Bundles while ensuring stability even in fast-paced environments.
₹56,250 INR in 7 days
0.0

My name is Jaimish, and I'm a seasoned Full-Stack Developer with 9+ years of experience designing and building high-performance web and mobile applications. While my skills may not be an exact match, I firmly believe in the versatility that comes from a deep understanding of multiple technologies. This knowledge, combined with my keen eye for data architecture and my problem-solving ability, makes me confident I can contribute effectively to your Azure & Databricks project.

In line with the requirements of this role, I have 5+ years of hands-on experience working with Azure and Databricks. I bring extensive knowledge of modern data architectures such as Delta Lake and metadata management, and have worked proficiently with Databricks services (SQL, Workflows, Unity Catalog, MLflow basics, Asset Bundles) and Azure services (Azure Data Factory, Azure Data Lake, Azure SQL), including proficiency in Python, SQL, and Spark. Furthermore, I have the hands-on experience needed to drive performance optimization across similar workloads.

My professional background has honed my ability to lead data modernization initiatives supporting analytics, ML, and AI, a skill I deem imperative for this project. Lastly, I am a strong communicator focused on delivering quality results promptly, and I'm confident our teams will form an exceptional collaboration. Let's make your data architecture goals a reality!
₹70,000 INR in 7 days
0.0

I recently delivered a project with this exact scope, focused on designing scalable, integrated data platforms using Azure and Databricks that are clean, professional, and user-friendly. I understand the importance of seamless data architecture covering ingestion, processing, and governance, while optimizing data pipelines with automation and CI/CD for efficiency. With expertise in Delta Lake, Azure Data Factory, and performance tuning across Spark, SQL, and Python, I bring strong hands-on experience aligned with your needs.

While I am new to Freelancer, I have extensive experience and have delivered other projects off-platform. I would love to chat more about your project!

Regards,
MN Williams
₹56,250 INR in 30 days
0.0

With over a decade of experience in the data architecture and engineering domain, specifically with Azure and Databricks, I can bring immense value to your team as your Azure & Databricks data architecture expert. I'm well-versed in both Databricks (Delta Lake, SQL, Workflows, Unity Catalog, MLflow basics, Asset Bundles) and Azure Data Services (Azure Data Factory, Azure Data Lake, Azure SQL), along with the Python, SQL, and Spark optimization skills you require. My understanding of modern data architectures like Lakehouse, and my familiarity with governance and metadata management, align well with the strategic goals of this role.

One of my key strengths has been streamlining complex cloud infrastructures for optimal performance. Combined with my deep knowledge of CI/CD procedures and environment management using Databricks Asset Bundles and automation tools, I can help improve your existing pipelines' efficiency. My ability to translate requirements into scalable architectures, coupled with proven leadership skills, allows me to bring technical governance to your data platforms.

Moreover, my passion for mentoring teams, paired with an excellent cross-functional communication style, enables me to adapt quickly in fast-paced environments and deliver results that drive business success. With our shared focus on unlocking the full potential of your cloud infrastructure and data systems, I can help your organization remain agile and scale as it grows.
₹56,250 INR in 7 days
4.0

Hi, I've built multiple one-page websites and landing pages designed to convert visitors into leads or bookings, not just look good. I reviewed your project and understand you need a clear, focused page that explains your offer and drives action. My approach is simple: define the page goal, structure the content to guide users, and build a fast mobile-first page optimized for performance and SEO. I avoid bloated designs and focus on clarity, speed, and results. I can start with a clear section layout that highlights your expertise in Azure, Databricks, and cutting-edge data architecture, ensuring your value is communicated effectively. Happy to discuss your goals and timeline. Nadia
₹56,250 INR in 30 days
0.0

With over 10 years of experience in Data Architecture and Data Engineering, including 5+ years working with Azure and Databricks, I have a strong background in designing and optimizing scalable data platforms. I specialize in building end-to-end data architectures based on Lakehouse principles, ensuring that ingestion, storage, processing, and serving are well-governed and aligned with best practices.

I have extensive hands-on experience with Databricks (Delta Lake, SQL, Workflows, Unity Catalog) and Azure Data Services (Data Factory, Data Lake, and SQL). I also have a proven track record in performance optimization using Spark, Python, and SQL workloads. My expertise in data modeling, security frameworks, and metadata management allows me to design solutions that are both scalable and secure.

In previous roles, I have led data modernization initiatives that support analytics, machine learning, and AI, driving strategic business value. I am also passionate about mentoring teams and fostering a collaborative, results-driven environment.

I would bring this experience and leadership to help your organization scale its data architecture effectively. Let’s discuss how I can contribute to your project’s success, ensuring high-quality data systems and continuous improvement.
₹56,250 INR in 7 days
0.0

With over ten years of experience in Data Architecture and Engineering, including more than five years of hands-on work with Azure and Databricks, I am an ideal candidate for this project. I have accumulated a wealth of knowledge in modern data architectures like Lakehouse and Delta Lake, ingesting and processing data through Azure Data Factory and other Azure data services.

I believe what truly sets me apart is my devotion to data governance. From establishing data modeling standards to ensuring robust lineage tracking and cataloging practices, I have consistently prioritized data security, compliance, and overall quality. My expertise extends beyond architecture into driving performance optimization for Spark, SQL, and Python workloads, key to extracting maximum value from your data pipelines.

Importantly, my skills go far beyond technical aptitude. I pride myself on having a strategic mindset with an unwavering execution focus, traits that are indispensable for successful platform scalability and efficiency improvements. Additionally, my passion for mentoring teams aligns directly with this role's need for architectural governance and maximising team potential.

With me on board, you'll get not just an expert with deep knowledge of Azure and Databricks, but a seasoned professional who can drive impactful results quickly and effectively.
₹56,250 INR in 7 days
0.0

Hi, I’m very glad to see this project and am interested in working with you. It matches my skills and experience; I’ve worked on many similar projects previously and have good working experience in this field. I’m confident I can deliver the best outcome, exactly to your requirements. Please tell me more about the project so we can discuss it further. Call or WhatsApp me at (+91) 94543-89834. Thank you.
₹40,000 INR in 12 days
0.0

I have directly relevant experience for this project.

You are looking for a Data Architect to design and govern a scalable Azure + Databricks Lakehouse platform, covering ingestion, storage, processing, modeling, and serving with strong governance and performance optimization. The role requires deep expertise in Delta Lake, Unity Catalog, Azure Data Factory, CI/CD with Databricks Asset Bundles, and leadership in data modernization initiatives supporting analytics and AI.

As a software engineer, I would approach this by defining a clear Lakehouse architecture blueprint with layered data zones (Bronze, Silver, Gold), strong governance via Unity Catalog, and modular, CI/CD-driven pipeline orchestration using Asset Bundles and automated deployments. A common risk in large-scale Azure data platforms is uncontrolled schema evolution and performance degradation, so I would enforce strict data modeling standards, lineage tracking, cost-aware Spark optimization, and automated testing to ensure scalability, security, and long-term maintainability.

Having no reviews does not mean having no skill or experience. I'd love to talk with you soon.
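The schema-evolution risk this proposal raises is commonly handled with an explicit schema contract checked before each write. A minimal pure-Python sketch of the idea; the table, column names, and types are hypothetical, and in a Databricks pipeline this check would typically run before a Delta merge:

```python
# Minimal schema-contract check: reject batches whose columns drift
# from the agreed contract (all names and types here are hypothetical).

EXPECTED_SCHEMA = {          # contract for a hypothetical "silver.orders" table
    "order_id": "bigint",
    "customer_id": "bigint",
    "amount": "decimal",
    "order_ts": "timestamp",
}

def validate_schema(actual: dict) -> list:
    """Return a list of human-readable violations (empty list = OK)."""
    violations = []
    for col, dtype in EXPECTED_SCHEMA.items():
        if col not in actual:
            violations.append(f"missing column: {col}")
        elif actual[col] != dtype:
            violations.append(f"type drift on {col}: {actual[col]} != {dtype}")
    for col in actual:
        if col not in EXPECTED_SCHEMA:
            violations.append(f"unexpected column: {col}")
    return violations

# A drifted batch: 'amount' became a string and a new column appeared.
drifted = {"order_id": "bigint", "customer_id": "bigint",
           "amount": "string", "order_ts": "timestamp", "note": "string"}
print(validate_schema(drifted))
```

Gating writes on an empty violation list, and versioning the contract alongside the pipeline code, is one way to make "strict data modeling standards" enforceable in CI rather than a convention.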
₹49,999 INR in 5 days
0.0

Hello there,

We bring 8 years of experience in data engineering, Python/Spark optimization, and cloud-native AI integration. Your CI/CD via Databricks Asset Bundles with Unity Catalog governance signals a full platform modernization effort, not a basic pipeline project.

Our approach: Delta Lake ingestion → structured prompt construction with few-shot examples from cataloged metadata → model calls via MLflow-registered endpoints → output validation against Pydantic schema contracts. RAG pipelines pull lineage and catalog context from Unity Catalog to ground LLM responses and reduce hallucinations. For routine SQL generation or docs, GPT-4o-mini or DBRX handles it cheaply; larger models are reserved for complex architectural recommendations. Caching repeated queries through Delta tables avoids paying twice for the same answer.

Our production AI work includes custom RAG pipelines processing millions of records with GPT-based document extraction, plus ETL optimization across 60,000+ records with measurable throughput gains.

The biggest risk at enterprise scale is unreliable LLM output; we handle this with retry logic, exponential backoff, fallback models, and rate limiting through Azure API Management.

Timeline across 30 days: Weeks 1-2 Lakehouse architecture and governance, Week 3 pipeline optimization and CI/CD, Week 4 AI/LLMOps integration and testing. Weekly syncs via Slack or Teams.

Naveen
Brainstack Technologies
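The retry-with-exponential-backoff pattern this bid mentions for unreliable LLM output can be sketched in plain Python. The `flaky_call` endpoint and the delay values below are illustrative; production code would wrap the actual model endpoint and add fallback models on final failure:

```python
import time

def with_retries(fn, max_attempts=4, base_delay=0.01):
    """Call fn(); on failure, retry with exponentially growing delays
    (base, 2*base, 4*base, ...). Re-raise after the last attempt."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Illustrative flaky endpoint: fails twice, then succeeds.
calls = {"n": 0}
def flaky_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient model error")
    return "ok"

print(with_retries(flaky_call))  # succeeds on the third attempt
```

Adding jitter to the delays and capping the maximum wait are common refinements when many workers retry against the same rate-limited endpoint.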
₹55,000 INR in 30 days
0.0

Hi,

I’m a senior Data Architect with deep hands-on experience designing and governing Azure & Databricks lakehouse platforms for analytics and data engineering use cases. I’ve led end-to-end architecture across ingestion, Delta Lake storage, Spark processing, data modeling, and serving layers, with strong focus on governance and performance.

My background includes architecting and optimizing Databricks pipelines, implementing Delta Lake best practices, driving Spark/Python/SQL performance tuning, and establishing CI/CD and environment separation using Databricks automation patterns. I’ve also partnered closely with business and analytics teams to translate requirements into scalable, enterprise-grade architectures and mentored engineers on best practices.

Comfortable owning architecture, governance, and delivery in fast-paced environments. Happy to discuss further.
₹46,250 INR in 7 days
0.0

Project: Data Architect – Azure & Databricks

As a Data Architect with over 10 years of experience, I specialize in designing and optimizing cloud-native data platforms using Azure and Databricks. I have 5+ years working with Databricks (Delta Lake, Unity Catalog, MLflow) and Azure Data Services (Data Factory, Data Lake, Synapse). I’m skilled in creating scalable, event-driven architectures, data governance, and optimizing data pipelines with Spark, SQL, and Python.

My expertise extends to CI/CD implementations, ensuring automation and efficient cloud management. I’ve led successful data modernization initiatives supporting ML, AI, and analytics. I excel in technical leadership, mentoring teams, and ensuring adherence to architectural best practices, while continuously improving platform scalability and performance.

Let’s discuss how I can enhance your data architecture strategy.
₹56,250 INR in 7 days
0.0

Hi,

Resonite Technologies is pleased to propose a Senior Data Architect — Azure & Databricks with 12+ years in data engineering and 6+ years hands-on Azure & Databricks experience.

Core Expertise
• End-to-end Lakehouse architecture (Bronze/Silver/Gold using Delta Lake)
• Databricks (Workflows, Unity Catalog, SQL, MLflow, Asset Bundles)
• Azure Data Factory, ADLS Gen2, Azure SQL (Synapse exposure)
• Spark (PySpark) & SQL performance tuning
• CI/CD with Azure DevOps & automated environment promotion

Architecture Leadership
• Designed governed Lakehouse platforms with lineage, cataloging & RBAC
• Implemented data modeling standards (Kimball + Medallion patterns)
• Optimized large-scale pipelines, reducing runtime by 40%+
• Led modernization programs enabling analytics, ML & AI workloads

Engineering Practices
• Modular pipeline design, testing frameworks, Git workflows
• Infrastructure as Code & deployment automation
• Security-first architecture (Unity Catalog, data masking, encryption)

Advanced Capabilities
• Exposure to GenAI & LLMOps within Databricks
• AI-assisted dev tools (Copilot, Cursor)
• Strong cross-functional collaboration with analytics & business teams

We can share architecture case studies and modernization success stories upon request. Available for strategic design, governance setup, or full platform leadership.

Regards,
Resonite Technologies
₹86,250 INR in 7 days
0.0

I can architect and govern a scalable Azure + Databricks data platform aligned with Lakehouse best practices and enterprise governance standards.

I’m a Senior Cloud & Data Engineer with 8+ years’ experience designing end-to-end data architectures, building Delta Lake pipelines, and optimizing Spark/SQL workloads for analytics and ML platforms. I’ve implemented production solutions using Databricks (Workflows, Unity Catalog, MLflow basics), Azure Data Factory, ADLS, and Azure SQL, with CI/CD automation, modular pipeline design, and strict data governance. I focus on performance tuning, cost efficiency, lineage visibility, and secure multi-environment deployments so platforms scale reliably.

For your project, I’ll define architecture standards, optimize pipelines, implement Asset Bundle–based deployments, and deliver documented, production-ready data infrastructure your teams can extend confidently. Available to start immediately and deliver high-quality results fast.
₹56,250 INR in 7 days
0.0

New Delhi, India
Joined Feb 28, 2026