Senior Data Engineer @ Velotio Technologies
About the Role - Senior Data Engineer
Velotio Technologies is a product engineering company working with innovative startups and enterprises. Recognized as a Great Place to Work® in India, our team of 450+ elite engineers builds cloud-native, data engineering, B2B SaaS, IoT & Machine Learning products. We have delivered full-stack product development for 110+ startups worldwide.
Key Responsibilities
- Design, develop, and maintain robust, scalable data pipelines.
- Collaborate with business stakeholders to translate data requirements into technical solutions.
- Implement data quality checks and monitoring to ensure accuracy (a minimal sketch follows this list).
- Optimize data pipelines for performance and efficiency.
- Troubleshoot and resolve data pipeline issues.
- Stay updated with emerging technologies in data engineering.
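To illustrate the data quality work described above, here is a minimal PySpark sketch, not Velotio's actual pipeline code; the S3 path, table, and column names are hypothetical, and the check simply fails fast so an orchestrator can alert.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: validate a daily orders extract before loading it downstream.
spark = SparkSession.builder.appName("orders-quality-check").getOrCreate()

# The path and schema are assumptions for illustration only.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Basic quality metrics: row count, null keys, and duplicate order IDs.
total_rows = orders.count()
null_keys = orders.filter(F.col("order_id").isNull()).count()
duplicate_keys = (
    orders.groupBy("order_id").count().filter(F.col("count") > 1).count()
)

# Fail fast so the scheduler or orchestrator can alert and retry.
if total_rows == 0 or null_keys > 0 or duplicate_keys > 0:
    raise ValueError(
        f"Quality check failed: rows={total_rows}, "
        f"null_keys={null_keys}, duplicate_keys={duplicate_keys}"
    )
```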
Qualifications
A minimum of two years' experience in data engineering and a Bachelor's or Master's degree in Computer Science, Engineering, or a related field are required. Candidates should be proficient in SQL and Python or Java, with experience in Spark, streaming analytics, medallion architecture, data connector development, and data modeling. Familiarity with cloud-based data warehousing solutions such as Snowflake and with AWS tools (Kinesis, SNS, SQS) is essential.
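For context on the medallion architecture mentioned above, the following is a small, hypothetical PySpark sketch of a bronze-to-silver refinement step; the bucket paths and columns are assumptions, not a prescribed setup.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical bronze-to-silver step in a medallion-style lakehouse layout.
spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze: raw, append-only ingested events (path is an assumption).
bronze_events = spark.read.parquet("s3://example-bucket/bronze/events/")

# Silver: deduplicated, typed, and filtered records ready for modeling.
silver_events = (
    bronze_events.dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_ts").isNotNull())
)

silver_events.write.mode("overwrite").parquet("s3://example-bucket/silver/events/")
```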
Desired Skills & Experience
- Data pipeline architecture and ETL processes.
- Data warehousing and data modeling.
- Python and cloud computing.
Our Culture
We pride ourselves on an autonomous, empowered work culture with a flat hierarchy and a startup-oriented approach. Velotio Technologies values diversity and inclusivity, and we regularly celebrate successes in a fun, positive environment.
Key Skills & Competencies
Senior Data Engineer, data pipelines, SQL, Python, Spark, AWS, data warehousing, ETL, data modeling, cloud computing
How to Get Hired at Velotio Technologies
🎯 Tips for Getting Hired
- Research Velotio Technologies' culture: Study their mission, values, and recent success stories.
- Customize your resume: Highlight data pipeline and cloud experience.
- Prepare for technical interviews: Focus on SQL, Python, Spark, and AWS (a short warm-up example follows this list).
- Showcase past projects: Demonstrate data engineering successes clearly.
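As a warm-up for the Spark- and Python-focused rounds, a short exercise like the one below can help; the dataset and column names are made up for illustration and are not taken from Velotio's interview process.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical warm-up: daily revenue per product from a small in-memory dataset.
spark = SparkSession.builder.appName("interview-warmup").getOrCreate()

sales = spark.createDataFrame(
    [
        ("2024-01-01", "widget", 10.0),
        ("2024-01-01", "gadget", 25.0),
        ("2024-01-02", "widget", 15.0),
    ],
    ["sale_date", "product", "amount"],
)

# Group, aggregate, and order: the kind of transformation commonly asked about.
daily_revenue = (
    sales.groupBy("sale_date", "product")
    .agg(F.sum("amount").alias("revenue"))
    .orderBy("sale_date", "product")
)

daily_revenue.show()
```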