We are looking for a motivated and experienced Data Engineer to join our team! As a Data Engineer, you will be responsible for building and maintaining internal and external data feeds and data pipelines.
Don't miss this opportunity to grow your career and upskill. Apply now!
Job Requirements & Qualification:
- Must Have Skills:
- Python and ETL (foundational), with experience in Apache Airflow, Python troubleshooting, and preparing data for dashboarding in Tableau
Job Roles & Responsibilities:
- Designs, builds, tests and maintains on-premises and cloud-based data architecture and structures such as data marts, data warehouses, data lakes and data pipelines.
- Designs, builds, tests and maintains on-premises and cloud-based data pipelines to acquire, consolidate, integrate and persist structured data streams.
- Designs and develops data lake batch management control processes and error-handling procedures.
- Designs and develops ETL processes for the data lake lifecycle (data staging, ODS data integration, EDW, and data lakes).
- Prepares and documents technical specifications; authors and executes unit test scripts/cases.
- Works with internal and external data providers and subject matter experts to understand data sources and formats.
- Keeps apprised of current and emerging data, development, and cloud-based technologies and best practices.
- Identifies opportunities for process improvement/optimization; designs and implements these improvements as directed.
- Works within the Data Analytics Team to guide feature development, testing, and deployment.
- Helps plan for and implement disaster recovery.
- Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.