We have a requirement for a Data Engineer to:
• Develop scalable ETL pipelines in Airflow using SQL, Python, Bash, and other technologies
• Work with various APIs to integrate third-party apps with internal data models
• Design dimensional models that adhere to the Kimball methodology
• Create and maintain documentation for each step of the data lifecycle
• Understand and communicate data lineage to foster trust and optimize reporting
Required qualifications:
• Experience with data warehouse technical architectures, ETL/ELT, reporting/analytics tools, and scripting
• Experience with AWS services, including S3, Lambda, EMR, RDS, Glue, Data Pipeline, and other big data technologies
• Extensive data analysis and data design experience
• Experience with scripting in Python (PySpark experience is a strong plus)
• Experience with Agile engineering practices and end-to-end automation of data delivery
• Ability to plan, coordinate, deliver, and validate data engineering pipelines
• Ability to translate business and technical requirements into data engineering designs and solutions
• 5+ years with: