Description: |
About the Company
We’re a forward-thinking technology company on a mission to turn data into actionable insights that fuel real-world success. From HR and facility monitoring to retail analytics, marketing, and learning support systems, we harness AI to deliver innovative solutions that help businesses and professionals thrive.
Responsibilities
* Design, build, and maintain scalable data pipelines and ETL processes.
* Collaborate with cross-functional teams (data scientists, engineers, and researchers) to create and deploy large language model (LLM) solutions.
* Work with cloud platforms (preferably GCP) to manage, optimize, and secure data lakes and data warehouses.
* Integrate data from various sources, ensuring data quality and reliability for analytics and reporting.
* Set up and maintain systems for pipeline orchestration (e.g., Airflow, Kubeflow) and containerization (e.g., Docker, Kubernetes).
* Contribute to prompt optimization and quality improvements in AI-driven applications.
* Perform data analysis and reporting using BI tools (Google Data Studio, Power BI, Tableau, etc.).
Nice-to-Have Skills
* Proven track record in building and optimizing ETL/ELT processes for large datasets.
* Exposure to NoSQL databases.
* Hands-on work with Kubernetes, Kubeflow, or Docker.
* Familiarity with additional cloud services (AWS, Azure).
* Experience integrating the ChatGPT API or similar AI services.
If you’re excited to drive innovation through data and AI, we’d love to hear from you. We look forward to exploring how you can help us build the future of AI-driven insights.