GCP, Python, SQL, Dagster, Helm for Kubernetes, and SFTP
Strong understanding of data infrastructure, with the ability to shape a technically sound, maintainable, and scalable enterprise-grade data integration platform in the cloud.
Self-motivated and driven to improve speed and efficiency for ETL developers, data analysts, data scientists, actuaries, and operational users.
Good understanding of infrastructure scaling, data operations, system internals, security, networking, and distributed data processing.
Familiarity with data warehousing, data integration, workflow orchestration, pub/sub messaging, and SQL.
Experience with GCP and/or other cloud providers.
3+ years of proven coding and debugging experience using modern software delivery methods in Python and JVM-based languages.
Experience with Spark and Spark-based frameworks as applied to data collection and movement.
Experience with managing and pruning historical data.
Knowledge of monitoring tools and practices in a data infrastructure environment.