Design, build, and implement efficient, scalable, and high-performance data pipelines and workflows to process high volumes of data across multiple technologies, e.g. Hadoop, NoSQL, OLAP cubes, or traditional RDBMS.
Provide documentation and knowledge transfer to customers as required.
Design, test, and execute setup and migration plans at customer sites.
Interface directly with customers on-site, working with clients’ personnel and our consultants.
Provide onsite/offsite application support in various locations.
Job Requirements:
Graduate of a reputable university, majoring in Information Technology or Computing, with a minimum GPA of 3.00.
Minimum of 2 years' experience as an ETL developer, data engineer, or data warehouse developer.
Good knowledge of various relational databases and solid SQL scripting experience.
Solid understanding of at least one ETL/data pipeline application: Informatica PowerCenter, IBM DataStage, Oracle Data Integrator, SAP Data Services, Microsoft SSIS, Airflow, etc.
Good knowledge of data modeling and data warehousing.
Fluency in both spoken and written English is a must.
Able to work independently with minimum supervision.
Able to perform in a demanding, changing, and fast-paced environment.
Good communication skills.
Team player.