Job Responsibilities
- Develop and maintain ETL pipelines for reliable data integration across source systems.
- Build scalable data warehouses to support organizational analytics.
- Establish data quality controls and governance processes to ensure data security and accuracy.
- Collaborate with Data Scientists, Analysts, and business units to deliver actionable insights.
- Optimize workflows and infrastructure to improve performance and eliminate bottlenecks.
- Drive automation of repetitive data processes to enhance operational efficiency.
- Monitor data pipelines and swiftly resolve technical issues to minimize disruptions.
Requirements
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 2–5 years of experience in data engineering, database management, or a similar role.
- Proficiency in one or more programming languages such as Python, Java, or Scala.
- Advanced SQL skills for complex queries, data transformation, and query optimization.
- Hands-on experience with ETL tools such as Apache Airflow, Talend, or SSIS.
- Expertise in cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of data warehouse solutions such as Snowflake, Redshift, or BigQuery.
- Strong problem-solving and analytical skills.
- Effective collaboration and communication abilities.
- Strong attention to detail and effective time management.
Benefits
- Competitive salary package.
- Opportunities for professional development.
- Dynamic and friendly work environment.
- Health and wellness benefits.
How to Apply
If you are interested in this opportunity, please use the Apply button or visit our company website to submit your application.