
Middle Data Engineer
- Ukraine
- Permanent position
- Full-time
Responsibilities:
- Ingest data from RDBMS/APIs/files into Snowflake (batch/incremental; CDC when applicable).
- Build modular SQL/Python transformations; handle semi‑structured JSON; publish consumer‑read tables/views.
- Orchestrate Airflow DAGs (dependencies, retries, backfills, SLAs) with monitoring and alerting.
- Ensure idempotent re-runs/backfills (see the DAG sketch after this list); maintain runbooks and perform RCA for incidents.
- Tune performance & cost in Snowflake (warehouse sizing, pruning; clustering when justified).
- Partner with BI/Analytics to refine definitions and SLAs for delivered datasets.
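
A minimal sketch of the kind of pipeline these responsibilities describe: a daily Airflow DAG that merges one day of data into Snowflake with retries, an SLA, alerting, and catchup enabled for backfills. It assumes a recent Airflow 2.x with the apache-airflow-providers-snowflake package; the connection id, schedule, alert address, and table/column names (RAW.EVENTS_STAGE, ANALYTICS.EVENTS) are placeholders, not project details.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# MERGE keyed on event_id keeps re-runs and backfills idempotent:
# reprocessing a day updates existing rows instead of inserting duplicates.
# Table and column names are placeholders.
MERGE_SQL = """
MERGE INTO ANALYTICS.EVENTS AS t
USING (
    SELECT event_id, user_id, event_type, amount, event_ts
    FROM RAW.EVENTS_STAGE
    WHERE event_ts::date = '{{ ds }}'      -- one logical day per DAG run
) AS s
ON t.event_id = s.event_id
WHEN MATCHED THEN UPDATE SET
    t.user_id = s.user_id, t.event_type = s.event_type,
    t.amount = s.amount, t.event_ts = s.event_ts
WHEN NOT MATCHED THEN INSERT (event_id, user_id, event_type, amount, event_ts)
    VALUES (s.event_id, s.user_id, s.event_type, s.amount, s.event_ts);
"""

default_args = {
    "owner": "data-eng",
    "retries": 3,                           # automatic retries on transient failures
    "retry_delay": timedelta(minutes=10),
    "sla": timedelta(hours=2),              # SLA miss triggers Airflow's SLA alerting
    "email_on_failure": True,
    "email": ["data-alerts@example.com"],   # placeholder alert address
}

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",                   # daily at 03:00 UTC
    catchup=True,                           # allows historical backfills
    max_active_runs=1,                      # backfill runs do not overlap
    default_args=default_args,
) as dag:
    merge_events = SnowflakeOperator(
        task_id="merge_events",
        snowflake_conn_id="snowflake_default",
        sql=MERGE_SQL,
    )
```
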
Requirements:
- 2–4 years building production ETL/ELT; strong SQL (joins, window functions) + Python for data tooling.
- Snowflake hands‑on: Streams/Tasks/Time Travel; performance & cost basics; JSON handling.
- Airflow proficiency: reliable DAGs, retries/backfills, SLAs; monitoring & alert routing.
- Data warehousing/modeling (Kimball/3NF), schema evolution; API integrations (auth, pagination, rate limits, idempotency), sketched after this list.
- Git‑based CI/CD; clear written English; privacy/GDPR basics.
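
For the API-integration requirement, a minimal Python sketch of a cursor-paginated extract with bearer-token auth, backoff on rate limits, and a resumable cursor. The endpoint, query parameters, and response fields (data, next_cursor) are hypothetical; a real partner API will differ.

```python
import time

import requests

BASE_URL = "https://api.example.com/v1/transactions"   # placeholder endpoint


def fetch_all(token: str, page_size: int = 500, max_retries: int = 5):
    """Yield every record, following cursor pagination and backing off on 429/5xx."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {token}"
    cursor = None
    while True:
        params = {"page_size": page_size}
        if cursor:
            params["cursor"] = cursor           # hypothetical cursor parameter
        for attempt in range(max_retries):
            resp = session.get(BASE_URL, params=params, timeout=30)
            if resp.status_code in (429, 500, 502, 503, 504):
                # Honour Retry-After when sent, otherwise back off exponentially.
                time.sleep(int(resp.headers.get("Retry-After", 2 ** attempt)))
                continue
            resp.raise_for_status()             # fail fast on non-retryable errors
            break
        else:
            raise RuntimeError(f"{BASE_URL} still failing after {max_retries} attempts")
        payload = resp.json()
        yield from payload["data"]              # hypothetical response shape
        # Persisting the last cursor between runs makes restarts resumable.
        cursor = payload.get("next_cursor")
        if not cursor:
            return
```
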
Nice to have:
- iGaming familiarity: stakes, wins, GGR/NGR, RTP, retention/ARPDAU, funnels; RG/regulatory awareness.
- AI & automation interest/experience: Snowflake Cortex for auto-documentation, semantic search over logs/runbooks, or parsing partner PDFs (with guardrails); see the sketch after this list.
- Exposure to cloud storage (GCS/S3/ADLS), Terraform/Docker, and BI consumption patterns (Tableau/Looker/Power BI).
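
For the Snowflake Cortex item, a hedged sketch of drafting column descriptions with the SNOWFLAKE.CORTEX.COMPLETE function through the Python connector. It assumes Cortex LLM functions are enabled in the account and that a database and role are already set on the connection; the model name, schema, and table are placeholders. The output is a draft for human review rather than something written back automatically, which is the guardrail the posting mentions.

```python
import snowflake.connector

# Placeholder schema/table; the prompt asks the model for a one-sentence
# description of each column so an engineer can review and edit it.
DOC_PROMPT_SQL = """
SELECT column_name,
       SNOWFLAKE.CORTEX.COMPLETE(
           'mistral-large',
           'Write a one-sentence description of a column named ' || column_name ||
           ' of type ' || data_type || ' in a table of betting transactions.'
       ) AS draft_description
FROM information_schema.columns
WHERE table_schema = 'ANALYTICS' AND table_name = 'BETS'
ORDER BY ordinal_position;
"""


def draft_column_docs(conn_params: dict) -> list[tuple[str, str]]:
    """Return (column_name, draft_description) pairs for human review only."""
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        cur.execute(DOC_PROMPT_SQL)
        return cur.fetchall()
    finally:
        conn.close()
```
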
We offer:
- Direct involvement in an established, successful, and growing long-term project.
- Flexible work arrangements.
- 20 days of vacation.
- Truly competitive salary.
- Help and support from our caring HR team.