Job benefits
- Team-building events
Our company simply cannot function well without teams of people working together. That's why we provide numerous team-building activities and events to help you and your team nurture meaningful relationships.
- Professional Development
Every employee is an invaluable asset to their team; that's why we want to help you grow. Level up your skills and expertise through our professional co-development programs with notable organizations. We will cover the cost.
Job description for Big Data Engineer at Fetch Technology
- Deliver big data solutions on on-premise Hadoop or cloud-based systems such as AWS; manage the Hadoop cluster.
- Design the ingestion layer for structured and unstructured data (text, voice, XML, etc.) and implement an insurance-specific data model for business and analytics use.
- Deliver ELT solutions, including data extraction, transformation, cleansing, integration, and management.
- Implement batch and near-real-time data ingestion pipelines based on reference architectures such as Lambda.
- Augment existing data with new sources, including untapped internal and external data; contribute to the establishment and maintenance of the cloud computing platform and big data services.
- Operationalize analytics models for production use with big data workflows and proper security and access control.
- Provide support for analytics tools and environments such as RServer, and debug performance issues.
- Perform analysis, design, and development of ETL processes to support project requirements.
- Develop Informatica BDM mappings, SQL/stored procedures, data maps for PowerExchange, and Unix shell scripts.
- Develop Sqoop scripts to transfer data between RDBMS and Hadoop.
- Develop CDC jobs using DES and create data ingestion pipelines using the DEI component.
- Develop Hive and HBase tables and write Impala queries.
- Develop Spark jobs (Scala/Python/Java) to stream, publish, or consume data from Hadoop.
- Use and configure Informatica EDC for metadata management of the Informatica BDM jobs created.
- Use Informatica DEQ and Informatica Analyst for data profiling and data quality scorecards.
- Perform unit testing and QA, and work with business partners to resolve any issues discovered during UAT.
- Peer-review mappings and workflows when required.
- Maintain development and test data environments by populating data based on project requirements.
- Work with production control and operations as needed to promote mappings/workflows, implement schedules, and resolve issues.
- Review ETL performance and conduct performance tuning on mappings, workflows, or SQL as required.
- Maintain all applicable documentation pertaining to specific SDLC phases.
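The Lambda reference architecture mentioned above combines a complete-but-slow batch view with a fast speed-layer view of recent events. A minimal sketch of that query-time merge, in plain Python with hypothetical key names (this is an illustration of the pattern, not the employer's actual pipeline):

```python
# Lambda-architecture sketch: the batch layer computes totals over the full
# historical dataset; the speed layer tracks increments seen since the last
# batch run. A query merges both. All names here are illustrative.

def merge_views(batch_view: dict, speed_view: dict) -> dict:
    """Combine per-key counts: batch totals plus recent speed-layer deltas."""
    merged = dict(batch_view)
    for key, delta in speed_view.items():
        merged[key] = merged.get(key, 0) + delta
    return merged

# Batch layer: counts from the full historical dataset.
batch_view = {"policy_quotes": 1200, "claims_filed": 340}
# Speed layer: increments since the last batch recomputation.
speed_view = {"claims_filed": 5, "policy_renewals": 2}

print(merge_views(batch_view, speed_view))
```

In production the batch view would typically live in Hive/Impala tables and the speed view in a streaming store fed by Kafka, but the merge logic is the same idea.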
- Minimum 4 years of experience in Big Data technology using Cloudera Hadoop and Informatica BDM 10.4.1.
- Good hands-on experience creating Informatica BDM mappings using the CDC PowerExchange component, as well as DES, DEI, and DEQ.
- Knowledge of Hadoop architectures, with hands-on experience implementing a data lake.
- Experience implementing data as a service via REST APIs for data products.
- Strong programmer in Python, Scala, or Java, with advanced knowledge of Linux.
- Master's or Bachelor's degree in Engineering, Computer Applications, or an equivalent field.
- Big Data technology: Apache Hadoop (Cloudera), Spark, Kafka, HBase, NoSQL databases, Hive, Impala, Sqoop, Flume, PySpark.
- Data integration tools: Informatica BDM 10.4.1, Informatica Axon 7.0, or any other ETL tool.
- Business acumen: Insurance or banking domain knowledge preferred.
- Technology evangelist.
- Strong analytical and problem solving skills.
- Strong understanding of ETL development best practices, database concepts, and performance tuning in SQL and Informatica.
- Strong knowledge of technology platforms and environments.
- Proven ability to work independently in a dynamic environment with multiple assigned projects and tasks.
- Outstanding ability to communicate, both verbally and in writing.
- Ability to develop complex mappings and workflows in accordance with requirements.
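The "data as a service via REST API" requirement above boils down to exposing curated datasets as versioned JSON endpoints. A small stdlib-only sketch of that request-to-payload mapping — the route, dataset name, and fields are hypothetical, not an actual Fetch Technology API:

```python
import json

# Hypothetical data products keyed by a versioned REST path.
DATASETS = {
    "/v1/datasets/claims": [
        {"claim_id": "C-001", "status": "open"},
        {"claim_id": "C-002", "status": "closed"},
    ],
}

def handle_get(path: str) -> tuple:
    """Map a GET path to (HTTP status code, JSON body) for a data product."""
    if path in DATASETS:
        return 200, json.dumps({"data": DATASETS[path]})
    return 404, json.dumps({"error": "dataset not found"})

status, body = handle_get("/v1/datasets/claims")
print(status, body)
```

In a real deployment this handler would sit behind a web framework with authentication and would query Hive/Impala or HBase rather than an in-memory dict; the sketch only shows the service contract.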
- Attractive salary and annual salary review.
- Work in a professional, friendly, well-equipped environment alongside senior, foreign, and Vietnamese colleagues.
- With extensive on-the-job training, you will always have the chance to work with emerging technologies.
- Full social insurance, health insurance & unemployment insurance according to Vietnam Labor Law.
- 12 days of annual leave per year.
- 13th month salary.
- Premium Health Care.
- Enjoy diverse activities: free snacks, monthly team-building, company trips, games (PlayStation, board games), and sports (football, badminton...).
- Working hours: 8 AM to 5 PM, Monday to Friday.
- Location: