AWS Data Engineer
Qode
Hybrid
Original Job Summary
AWS Data Engineer
Location: Gurgaon / Hyderabad
Experience: 1.6 – 6 years
About the Role
We are seeking a skilled AWS Data Engineer with strong expertise in data engineering tools and cloud technologies. The ideal candidate should have hands-on experience in Snowflake, Python, and SQL, with exposure to AWS Glue. Experience with DBT, Redshift, and Lambda is an added advantage.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL workflows on AWS.
- Work with Snowflake, Python, and SQL to process, transform, and analyze data.
- Optimize and maintain existing data architectures for high performance and cost efficiency.
- Integrate data from multiple sources into a unified data warehouse.
- Collaborate with data analysts, data scientists, and business stakeholders.
- Implement data governance, quality checks, and best practices.
- Leverage AWS services such as Redshift and Lambda for processing tasks.
- Utilize DBT for data modeling and transformation when applicable.
- Troubleshoot data pipeline issues and perform root cause analysis.
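The pipeline and quality-check responsibilities above can be sketched as a minimal extract-transform-load flow in Python. This is an illustration only: the function names, sample records, and in-memory "warehouse" are hypothetical stand-ins, not Qode's actual stack (a real pipeline here would read from S3 or Glue and write to Snowflake or Redshift).

```python
# Minimal ETL sketch: extract raw records, apply quality checks and
# transformations, then load into a target. All data is made up.

def extract():
    # Stand-in for reading from a source system (e.g. an S3 bucket or an API).
    return [
        {"order_id": 1, "amount": "120.50", "region": "north"},
        {"order_id": 2, "amount": "80.00", "region": "south"},
        {"order_id": 3, "amount": "bad", "region": "north"},  # malformed row
    ]

def transform(rows):
    # Cast amounts to float, drop rows that fail the cast (a simple quality
    # check), and normalize region names.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip rows that fail type casting
        clean.append({**row, "amount": amount, "region": row["region"].upper()})
    return clean

def load(rows, warehouse):
    # Stand-in for writing to a warehouse table (e.g. Snowflake or Redshift).
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(f"loaded {loaded} rows")  # the malformed row is filtered out
```

The same separation of extract, transform, and load steps is what tools like AWS Glue jobs and DBT models formalize at scale.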
Required Skills (Mandatory)
- Snowflake – Data warehouse development and query optimization
- Python – Scripting for automation and data processing
- SQL – Advanced query writing and optimization
- AWS Glue
Preferred / Optional Skills
- DBT – Data transformation and modeling
- AWS Redshift – Data warehousing and analytics
- AWS Lambda – Serverless data processing
Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or related field
- 1.6 – 6 years of hands-on experience in data engineering roles
- Strong problem-solving skills and ability to work in fast-paced environments
- Good communication and collaboration skills
Key Skills / Competencies
Snowflake, Python, SQL, AWS Glue, ETL, DBT, Redshift, Lambda, data pipelines, data warehousing
How to Get Hired at Qode
🎯 Tips for Getting Hired
- Research Qode's culture: Study their mission, values, and recent news.
- Customize your resume: Highlight AWS and data engineering skills.
- Showcase project experience: Detail data pipelines and ETL projects.
- Prepare for technical tests: Review Snowflake, Python, and SQL challenges.
📝 Interview Preparation Advice
Technical Preparation
- Review AWS Glue and Redshift usage.
- Practice Snowflake query optimization.
- Improve Python scripting for automation.
- Study SQL advanced query techniques.
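For the last two items, window functions are a common "advanced SQL" interview topic. The sketch below runs one against an in-memory SQLite table from Python; the table, columns, and data are invented for illustration (the same `RANK() OVER` syntax works in Snowflake and Redshift; SQLite needs version 3.25+ for window functions, which modern Python builds include).

```python
import sqlite3

# Rank orders by amount within each region using a window function --
# a typical "advanced SQL" practice exercise.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "north", 120.5), (2, "south", 80.0), (3, "north", 200.0)],
)

rows = conn.execute(
    """
    SELECT id, region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM orders
    """
).fetchall()

for row in rows:
    print(row)
```

Being able to explain the difference between `RANK()`, `DENSE_RANK()`, and `ROW_NUMBER()` is a frequent follow-up in such tests.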
Behavioral Questions
- Describe a challenging data project.
- Explain teamwork in pipeline development.
- Detail problem-solving during system failures.
- Discuss adapting to technology changes.