Growe welcomes those who are excited to:
- Design, develop, and maintain scalable data pipelines using Airflow
- Create and optimize tables in Athena, ensuring efficient query performance and cost-effectiveness
- Write and manage SQL transformations in DBT, building reusable and well-documented models
- Optimize data workflows for performance, reliability, and cost-efficiency
- Automate infrastructure provisioning using Terraform for data processing environments
- Ensure data integrity and consistency through monitoring, validation, and error handling
- Collaborate with analytics, BI, and engineering teams to understand data needs and deliver solutions
- Troubleshoot and resolve data pipeline issues in a timely manner
We need your professional experience:
- 1+ years of experience in data engineering with a focus on ELT processes
- Strong expertise in Python, SQL, data modeling, and performance optimization
- Experience with Airflow DAGs for workflow orchestration
- Experience with DBT for data transformations and modular modeling
- Understanding of AWS S3, Athena, and data lake architectures
- Familiarity with Terraform for infrastructure automation (a plus)
We appreciate it if you have these personal qualities:
- Strong analytical skills
- Problem-solving skills
- Attention to detail
We are seeking those who align with our core values:
- GROWE TOGETHER: Our team is our main asset. We work together and support each other to achieve our common goals
- DRIVE RESULT OVER PROCESS: We set ambitious, clear, measurable goals in line with our strategy, driving Growe to success
- BE READY FOR CHANGE: We see challenges as opportunities to grow and evolve. We adapt today to win tomorrow
Originally posted on Himalayas