Primary Responsibilities:
* Organizing and maintaining real-time data collection, processing, and analysis;
* Designing and implementing automated reporting systems for business-metric monitoring;
* Configuring and optimizing data pipelines (ETL processes);
* Working with data visualization tools (e.g., Grafana) to create clear and informative dashboards;
* Optimizing high-load analytical queries;
* Developing and maintaining predictive models and machine-learning algorithms for data analysis (if required).
Core Skills:
* Strong knowledge of SQL with experience optimizing queries over large datasets;
* Hands-on experience with data pipeline orchestration tools;
* Proficiency in data visualization tools (e.g., Grafana, Power BI, Tableau);
* Experience with real-time analytics and data warehouses;
* Expertise in big data processing and ETL optimization;
* Proficiency in data-processing programming languages (e.g., Python, Scala, or SQL);
* Experience with Databricks (preferred).
Additional Skills:
* Understanding of machine learning fundamentals and experience with libraries such as scikit-learn, TensorFlow, or PyTorch (a plus);
* Experience deploying analytical solutions on cloud platforms (AWS, GCP);
* Understanding of CI/CD processes for automating data analytics infrastructure.
Language Requirements:
* Intermediate English proficiency for working with technical documentation and communicating with external service support.
We offer:
* An interesting project with non-trivial tasks that will let you demonstrate your professional skills and creativity;
* A friendly team;
* A comfortable working schedule and conditions;
* The option to work remotely or from an office in the city centre;
* A stable, competitive salary;
* Paid vacation and sick leave;
* Opportunities for professional growth and career development;
* English classes, paid professional courses, coffee/fruit, and other perks :)