Perks and benefits
Remote work options
Thanks to technology, being productive no longer requires being physically present at the office. Joining our company lets you work from anywhere, free of location constraints.
Team-building activities
Our company simply cannot function well without teams of people working together. For that reason, we provide numerous team-building activities and events to help you and your team nurture meaningful relationships with one another.
Career development
Ever felt stuck in your career? We don't hire you simply to fill an empty slot. Together, we will help you shape and grow your career so you can progress further and rediscover your true sense of purpose at work.
Workplace support
Need a laptop or specific devices for work? It's on us. We will provide the necessary tools so you can focus on what you do best and get the job done.
Data Engineer job description at PT Nomura Research Institute Indonesia
- Perform data exploration, data cleaning, data imputation, and feature engineering on unstructured and structured data.
- Build the infrastructure for optimal extraction, transformation, and loading (ETL) of data from a wide variety of data sources.
- Develop and maintain optimal data pipeline architecture for training statistical and machine learning models such as regression and classification.
- Develop and maintain evaluations to measure the effectiveness of training data. This includes measuring the capabilities of models on a variety of tasks and domains.
- Collaborate with data scientists and machine learning engineers to develop a comprehensive data science/machine learning solution pipeline.
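As a rough illustration of the data cleaning, imputation, and feature engineering work listed above, here is a minimal pandas sketch. All column names and values are hypothetical, chosen only to demonstrate the techniques:

```python
import pandas as pd

# Hypothetical raw records; the columns are illustrative only.
raw = pd.DataFrame({
    "age": [25, None, 42, 31],
    "city": ["Jakarta", "Tokyo", None, "Jakarta"],
    "income": [5000, 7000, 12000, 8000],
})

# Data imputation: fill missing numeric values with the median,
# and missing categoricals with a sentinel label.
clean = raw.copy()
clean["age"] = clean["age"].fillna(clean["age"].median())
clean["city"] = clean["city"].fillna("unknown")

# Feature engineering: one-hot encode the categorical column
# and derive a simple ratio feature.
features = pd.get_dummies(clean, columns=["city"])
features["income_per_age"] = features["income"] / features["age"]

print(features.shape)
```

In practice, steps like these would run inside the ETL pipeline rather than on an in-memory literal, but the operations (imputation, encoding, derived features) are the same.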
Requirements
- Bachelor's degree in computer science or related fields, or equivalent software engineering experience.
- Proficiency in Python programming language
- Experience in dataset processing and feature engineering using tools such as NumPy, pandas, and scikit-learn
- Visualization skills using tools such as Matplotlib, Seaborn, and Bokeh
- Understanding of deep learning frameworks such as PyTorch and TensorFlow
- Understanding of SQL and NoSQL
- Understanding of Hadoop, Spark, Kafka, Hive, and Presto
- Proficiency with source control tools such as Git
- Deep understanding of Object-Oriented Programming (OOP) concepts such as inheritance, delegation, and abstract classes
- Understanding of cloud platforms such as AWS, GCP, and Azure
- Experience using Docker
- Experience using AWS services such as S3, EC2, Glue, and SageMaker
- Experience with AWS Step Functions and/or AWS Lambda is a plus
- Proficiency in Scala and Java programming languages
- Enjoy iterating quickly with research prototypes and learning new technologies
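The OOP requirement above names inheritance, delegation, and abstract classes specifically. A minimal sketch of how these three ideas fit together in Python, using hypothetical class names invented for illustration:

```python
from abc import ABC, abstractmethod


class DataSource(ABC):
    """Abstract class: subclasses must implement read()."""

    @abstractmethod
    def read(self) -> list:
        ...


class ListSource(DataSource):
    # Inheritance: ListSource is-a DataSource and fulfils its contract.
    def __init__(self, rows: list):
        self.rows = rows

    def read(self) -> list:
        return self.rows


class CachedSource(DataSource):
    # Delegation: forwards read() to a wrapped source, caching the result
    # so the underlying source is only consulted once.
    def __init__(self, inner: DataSource):
        self.inner = inner
        self._cache = None

    def read(self) -> list:
        if self._cache is None:
            self._cache = self.inner.read()
        return self._cache


src = CachedSource(ListSource([1, 2, 3]))
print(src.read())
```

The delegating wrapper and the concrete source are interchangeable wherever a `DataSource` is expected, which is the practical payoff of coding against an abstract class.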
Hiring process
- Apply via Glints
- Screening interview with our in-house recruiter
- Aptitude test
- Technical test
- Interview with our Hiring Manager
- Final interview