We are looking for a Senior Data Engineer to build and maintain scalable data pipelines, modernize orchestration processes and ensure high-quality, reliable data across the product ecosystem. The role focuses on ETL development, data processing, orchestration and platform stability; it does not involve deep DevOps responsibilities.
Project — a modern data platform focused on building reliable and scalable data pipelines, processing large-scale datasets and improving orchestration to support product analytics, internal tools and data-driven decision making across the organization.
Requirements:
* Strong background in data engineering with hands-on experience in ETL development, orchestration and distributed data processing.
* Solid experience with Airflow, Spark, dbt, Kubernetes-based executors and AWS services.
* Proven experience working with large-scale datasets (TB level) and complex data workflows.
* Strong Python skills for ETL development and data manipulation.
* Understanding of data modelling, data quality, storage formats and workflow optimization.
* Experience with monitoring and observability tools.
* Ability to take ownership, work independently and drive improvements.
* Excellent communication skills in a cross-functional environment.
* Upper-Intermediate English level or higher.
Responsibilities:
* Develop and support ETL pipelines using Airflow, Spark, dbt, Argo, Airbyte and AWS-based tools.
* Own data ingestion, transformation and quality processes across datasets of tens of terabytes.
* Migrate and modernize orchestration from legacy solutions to Airflow running on Kubernetes.
* Design, maintain and optimize data workflows for accuracy, performance and reliability.
* Implement data quality checks, governance rules and access controls.
* Collaborate with engineering, analytics and product teams to ensure data availability for applications, dashboards and internal tools.
* Improve scalability, observability and stability of data processing using monitoring tools such as Prometheus, Datadog and Grafana.
* Contribute to architectural decisions related to data workflows and platform evolution.
* Provide technical guidance to team members and participate in code reviews.
We offer:
* Competitive salary with regular reviews.
* Vacation (up to 20 working days).
* Paid sick leave (10 working days).
* National holidays as paid time off (11 days).
* Online English courses.
* Accountant assistance and legal support.
* Flexible working schedule: remote, office-based or hybrid.
* Direct cooperation with the customer.
* Dynamic environment with low bureaucracy and great team spirit.
* Challenging projects in diverse business domains and a variety of tech stacks.
* Communication with Top/Senior-level specialists to strengthen your hard skills.
* Online team-building events and volunteering initiatives that develop and support company culture.