Data Engineer (Health & Fitness app) | Partner
- We are looking for a Data Engineer to join our partner team.
- The primary goal of this role is to design, build, and maintain a scalable data infrastructure that supports analytics, reporting, and data-driven decision-making across the organization.
- This role focuses on developing reliable data pipelines, integrating multiple data sources, ensuring data quality, and delivering business-oriented data solutions aligned with company goals.
Remote Full-Time | Data Engineer
JOB DESCRIPTION
WHAT YOU’LL DO:
- Develop and maintain ELT processes for collecting, transforming, and loading data from various sources into BigQuery (a minimal sketch of one such step is shown after this list).
- Create and evolve a dbt project with layered data models (stage, base, intermediate, dimension, and report layers) used for analytics and reporting.
- Optimize the performance of pipelines, SQL queries, and data models.
- Validate, test, and control data quality at every stage of the pipeline.
- Set up monitoring and alerting for data pipelines and key infrastructure components.
- Integrate data from various sources (mobile application, financial and marketing data), including APIs and external systems.
- Collaborate with analysts to ensure data is well structured, easy to use, and supports the team’s analytical needs.
- Maintain up-to-date documentation covering the data architecture, ELT processes, and dbt models.
- Tools you will work with: SQL, Python, dbt, and GCP cloud services for building and maintaining data pipelines.
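To give a concrete sense of the day-to-day work, here is a minimal sketch of one ELT step of the kind described above: loading exported app events from Cloud Storage into a BigQuery staging table with the google-cloud-bigquery client, followed by a basic data quality check. All names (project, dataset, table, file path, the user_id column) are illustrative assumptions, not the actual pipeline.

```python
"""Minimal ELT load sketch: GCS (newline-delimited JSON) -> BigQuery staging table.

All identifiers below are hypothetical placeholders; the real pipeline, schemas,
and orchestration are defined by the team. Requires: pip install google-cloud-bigquery
"""
from google.cloud import bigquery

PROJECT_ID = "fitness-analytics-demo"          # hypothetical GCP project
TABLE_ID = f"{PROJECT_ID}.staging.app_events"  # hypothetical staging-layer table
SOURCE_URI = "gs://fitness-app-exports/events/2024-01-01/*.json"  # hypothetical export path


def load_events() -> None:
    client = bigquery.Client(project=PROJECT_ID)

    # Append raw app events into the staging layer; dbt models build on top of this table.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
    load_job.result()  # block until the load job finishes (raises on failure)
    print(f"Loaded {load_job.output_rows} rows into {TABLE_ID}")

    # Simple data quality gate: fail loudly if key identifiers are missing.
    check_sql = f"SELECT COUNT(*) AS bad_rows FROM `{TABLE_ID}` WHERE user_id IS NULL"
    bad_rows = next(iter(client.query(check_sql).result())).bad_rows
    if bad_rows:
        raise ValueError(f"Data quality check failed: {bad_rows} rows without user_id")


if __name__ == "__main__":
    load_events()
```

In practice a step like this would run under an orchestrator with monitoring and alerting, and the quality checks would typically live as dbt tests on the staging and downstream models rather than in the loader itself.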
WHAT WE EXPECT FROM YOU:
- 3+ years of experience in a Data Engineer position or a related role with a strong focus on building pipelines and data modeling.
- Deep knowledge of SQL: complex queries, optimization, understanding of Data Warehousing principles.
- Practical experience working with BigQuery, Snowflake, Redshift, or other analytical warehouses.
- Experience working with dbt, including testing, documentation, and CI/CD integration.
- Experience building and maintaining ELT/ETL pipelines.
- Strong proficiency in Python for integrations, automation, and data processes.
- Confident use of Git and experience working with GitHub/GitLab.
- Experience working with GCP or other cloud platforms.
- Attention to detail, strong problem-solving mindset, and ownership of results.
- Experience collaborating with analysts and understanding their data needs.
- Nice to have:
- Experience optimizing cost efficiency and query performance in cloud environments.
- Experience working with real-time / streaming data solutions.
- Ability to clearly explain technical solutions to stakeholders.
Do you want to know some details about this position? Kate will help!
YOUR JOURNEY WITH US:
- Step 1: Pre-screen.
- Step 2: Interview with Product Analyst and Data Engineer.
- Step 3: Final interview with HRD.
- Step 4: Reference check.
- Step 5: Job Offer!
WHAT WE OFFER:
- Fast learning culture — regular workshops (from hard skills to life hacks), sessions with internal leaders and external experts, plus a learning budget for courses and conferences.
- Remote with trust — core hours 10:00–18:00 (Kyiv time), then full flexibility. We value results, not hours online.
- Career boost — fast, transparent growth in roles and responsibilities, with internal candidates always prioritized.
- Wellbeing support — 2 additional recovery days per month, optional weekly meditations, 14 days of vacation, and paid sick leave.