Job benefits
- Flexible work hours
Productivity is not steady or uniform; it depends on each person's traits and preferences. At our company, as long as your team is in sync and your goals are met, you can decide when you want to work.
- Medical insurance
To support your health and wellbeing, you can choose from a range of medical plans depending on your situation and needs. From partial to full medical coverage, we've got you covered.
Job description for Data Engineer at Bithealth
- Minimum 2 years of experience working as a Data Engineer using Spark, Airflow, and Python.
- An engineering mindset for solving problems and a keenness to explore new technologies.
- Must be a natural self-starter with the ability to work autonomously.
- A team player who is open to giving and receiving feedback.
- Design, implement, and manage end-to-end data pipelines (ETL, data streaming, and data warehousing) to make data easily accessible for analysis.
- Build the infrastructure required for optimal extraction, transformation, and loading (ETL) of data from a wide variety of data sources.
- Build data quality checks and ensure the quality of data products, including timeliness, consistency, performance, and accuracy.
- Ensure data quality and integrity from data sources, identify opportunities for data acquisition, and analyze and organize raw data.
- Explore and integrate appropriate new big data technologies into the current infrastructure.
- Monitor data collection, storage, and retrieval processes. Work and collaborate with the data team (data analyst, data scientist) to successfully deliver the project.
- Strong knowledge of at least one of these programming languages: Python, Java, Scala, or Go.
- Experience with structured and unstructured data, in both SQL and NoSQL stores.
- Familiarity with data scheduling solutions such as Airflow, Luigi, or Apache NiFi.
- Good understanding of SQL.
- Familiarity with API concepts and experience designing and building RESTful APIs.
- Know how to collaborate in a Git-based workflow (GitHub, GitLab, Bitbucket).
- Experience managing server infrastructure.
- Experience managing public cloud environments (AWS, GCP, or Azure).
- Familiarity with cloud big data environments such as AWS (EMR, Glue, Athena, Redshift), GCP (BigQuery, Dataflow, Dataproc), or Azure (Azure Synapse Analytics, Databricks, ADLS Gen2).
- Comfortable working with object storage (Amazon S3, Google Cloud Storage, or Azure Blob Storage).
- Curiosity about new and evolving data engineering tools, and the ability to explore them independently.
- Strong debugging skills.
- Good teamwork and communication skills.
- Familiarity with a monitoring stack such as ELK, Grafana, or Prometheus is a plus.