Data Engineer
Atto Trading Technologies
- Kyiv
- Permanent position
- Full-time
We are expanding an international, diverse team with experts in trading, statistics, engineering, and technology. Our disciplined approach, combined with rapid market feedback, allows us to quickly turn ideas into profit. Our environment of learning and collaboration allows us to solve some of the world’s hardest problems, together. As a small firm, we remain nimble and hold ourselves to the highest standards of integrity, ingenuity, and effort.

Role Highlights:
We are seeking an experienced Data Engineer to design, build, and maintain our comprehensive Data Lake for a fast-growing number of research and production datasets. This role combines hardware and platform infrastructure expertise with data engineering excellence to support our rapidly growing data assets (~200TB current, scaling ~100TB/year).

Responsibilities:
- Architect and manage high-performance, scalable on-premise data storage systems optimized for large-scale data access and analytics workloads
- Configure and maintain compute clusters for distributed data processing
- Plan capacity and scalability roadmaps to accommodate 100TB+ annual data growth
- Design and implement efficient monitoring and alerting systems to forecast growth trends and proactively react to critical states
- Design, create, automate, and maintain various data pipelines
- Enhance existing and set up new “data checks” and alerts to detect when data is “bad”
- Design and implement a comprehensive on-premise Data Lake system connected to a VAST storage solution for normalized market data across:
- US Equities, US Futures, and SIP feeds
- Other market data sources that will be further added
- Security Definition data for various markets
- Various private column data
- Build and operate end‑to‑end data pipelines and SLA/SLO monitoring to ensure data quality, completeness, and governance
- Analyze existing data models, usage patterns, and access frequencies to identify bottlenecks and optimization opportunities
- Develop metadata and catalog layers for efficient data discovery and self‑service access
- Design and deploy event‑driven architectures for near real‑time market data processing and delivery
- Orchestrate ETL/ELT data pipelines using tools like Prefect (or Airflow), ensuring robustness, observability, and clear operational ownership
- Ensure fault tolerance, scalability, and high availability across existing systems
- Partner with traders, quantitative researchers, and other stakeholders to understand use cases and continuously improve the usability, performance, and reliability of the Data Lake
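The “data checks” and SLO monitoring responsibilities above can be illustrated with a minimal sketch. The `FeedStats` structure, feed names, and 99.9% completeness threshold below are illustrative assumptions for this posting, not the firm’s actual tooling:

```python
from dataclasses import dataclass

@dataclass
class FeedStats:
    """Per-feed message counts for one trading session (hypothetical schema)."""
    feed: str
    expected_msgs: int   # forecast from historical volume
    received_msgs: int   # messages that actually landed in the Data Lake

def completeness(stats: FeedStats) -> float:
    """Fraction of expected messages that actually arrived."""
    return stats.received_msgs / stats.expected_msgs

def check_feed(stats: FeedStats, threshold: float = 0.999) -> bool:
    """SLO-style data check: flag the feed when completeness drops
    below the threshold, so an alert can fire."""
    return completeness(stats) >= threshold
```

In practice, checks like this would typically run as tasks inside a Prefect or Airflow DAG, with the threshold tuned per feed.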
Requirements:
- 5+ years of experience in data engineering or data platform roles
- Proven experience with large‑scale data infrastructure (hundreds of TBs of data, high‑throughput pipelines)
- Strong understanding of market data formats and financial data structures (e.g., trades, quotes, order books, corporate actions)
- Experience designing and modernizing data infrastructure within on-premise solutions
- Bachelor’s degree in Computer Science, Engineering, or a related field required; Master’s degree or equivalent practical experience preferred
Technical Skills:
- Data Engineering - Spark, Iceberg (or similar table formats), Trino/Presto, Parquet optimization
- ETL pipelines - Prefect/Airflow or similar DAG-oriented tools
- Infrastructure - High-performance networking and compute
- Storage Systems - High-performance distributed storage, NAS/SAN, object storage
- Networking - Low-latency networking (awareness of DPDK and kernel-bypass technologies); data center infrastructure basics
- Programming - Python (production‑grade), SQL, building APIs (e.g., FastAPI)
- Data Analysis - Advanced SQL, Tableau (or similar BI tools), data profiling tools
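As one concrete illustration of the Parquet side of this stack: a Hive-style directory layout lets engines such as Trino or Spark prune partitions from the path alone. The root path, feed names, and bucketing scheme below are hypothetical, not the firm’s actual layout:

```python
from datetime import date
from pathlib import PurePosixPath

def partition_path(root: str, feed: str, d: date, bucket: int) -> str:
    """Build a Hive-style partition path (key=value directories).
    Query engines use these path components to skip irrelevant
    partitions without reading any Parquet footers."""
    p = (PurePosixPath(root)
         / f"feed={feed}"
         / f"year={d.year}"
         / f"month={d.month:02d}"
         / f"day={d.day:02d}"
         / f"bucket={bucket:02d}")
    return str(p)
```

For example, `partition_path("/lake", "sip", date(2024, 3, 5), 7)` yields `/lake/feed=sip/year=2024/month=03/day=05/bucket=07`, which a date-filtered query can prune against directly.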
Nice to Have:
- Background in high-frequency trading (HFT), quantitative finance, or financial services
We Offer:
- Competitive compensation package
- Performance-based bonus opportunities
- Healthcare & Sports/gym budget
- Mental health support, including access to therapy
- Paid time off (25 days)
- Relocation support (where applicable)
- International team meet-ups
- Learning and development support, including courses and certifications
- Access to professional tools, software, and resources
- Fully equipped workstations with high-quality hardware
- Modern office with paid lunches