Founded in 1996, Sinteks Group of Companies is one of the largest and most successful retail companies in the Caucasus and Central Asia. Sinteks operates over 100 fashion, jewelry, perfumery & cosmetics, and homeware stores, as well as nearly 10 cafe branches. The company represents international brands across luxury, premium and mass-market segments.
Important notice for applicants: Please review the requirements carefully and ensure your CV clearly reflects the relevant knowledge and skills.
Role Overview:
This role focuses on creating robust, automated data pipelines and integrating with advanced AI platforms to solve key business challenges. You will establish a scalable data architecture that ensures data quality, enabling real-time business intelligence and strategic decision-making to drive efficiency and value throughout the organization.
Key Responsibilities:
Solution Development.
- Design, develop, test, and deploy end-to-end automation workflows and applications
- Write clean, efficient, and maintainable code, primarily in Python
Data Engineering & Integration.
- Create and manage efficient data pipelines, primarily using SQL, to extract and prepare data for use in automation workflows
- Integrate with third-party LLM APIs (e.g., OpenAI, Anthropic) to perform complex tasks like text analysis and data extraction
- Build and consume REST APIs to connect various systems
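As a sketch of the kind of LLM integration work this involves (the payload shape follows the common chat-completions convention; the model name and extraction fields are illustrative assumptions, not a prescribed design), a pipeline step might separate request construction from response parsing:

```python
import json

def build_extraction_request(document_text, model="gpt-4o-mini"):
    """Build a chat-style request payload asking an LLM to extract
    structured fields from free text. The model name and system prompt
    here are purely illustrative."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Extract the supplier name and invoice total as JSON."},
            {"role": "user", "content": document_text},
        ],
        "temperature": 0,  # deterministic output suits data extraction
    }

def parse_extraction_response(response_body):
    """Pull the JSON content out of a chat-completions-style response dict."""
    content = response_body["choices"][0]["message"]["content"]
    return json.loads(content)

# Parsing a sample response dict (no network call is made here):
sample = {"choices": [{"message": {"content": '{"supplier": "Acme", "total": 1250.0}'}}]}
print(parse_extraction_response(sample)["supplier"])  # Acme
```

Keeping payload construction and response parsing as pure functions makes each step unit-testable without hitting the API.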
Collaboration & Infrastructure Management.
- Work in a tight-knit partnership with the Product Manager to understand requirements and deliver effective technical solutions
- Manage the technical infrastructure for projects, including database connections, API credentials, and cloud hosting
- Monitor the performance and reliability of deployed automations
Data Infrastructure Development.
- Design and implement automated data extraction pipelines from multiple ERP systems including 1C
- Develop and maintain a centralized data warehouse architecture
- Establish data quality monitoring and validation processes
- Create technical infrastructure to support advanced analytics and reporting automation
- Ensure system reliability and availability for critical reporting functions
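To illustrate the data quality monitoring mentioned above (a minimal sketch; the field names and rules are hypothetical examples, not the company's actual checks), a pipeline might flag bad rows before loading them into the warehouse:

```python
def validate_rows(rows, required_fields, key_field):
    """Flag rows with missing required fields or duplicate keys.
    Returns a list of (row_index, issue_description) pairs."""
    issues = []
    seen_keys = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        key = row.get(key_field)
        if key in seen_keys:
            issues.append((i, f"duplicate {key_field}: {key}"))
        seen_keys.add(key)
    return issues

# Illustrative invoice-style records with one duplicate and one null:
rows = [
    {"invoice_id": "A1", "amount": 100.0},
    {"invoice_id": "A1", "amount": 55.0},   # duplicate key
    {"invoice_id": "A2", "amount": None},   # missing amount
]
print(validate_rows(rows, ["invoice_id", "amount"], "invoice_id"))
# → [(1, 'duplicate invoice_id: A1'), (2, 'missing amount')]
```

Emitting issue records rather than raising immediately lets a monitoring job log and report all problems in a batch at once.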
Technical Standards and Documentation.
- Define and implement data engineering best practices
- Maintain comprehensive technical documentation for all systems and processes
- Evaluate and recommend new technologies for data infrastructure improvement
- Provide technical guidance as the team expands
Professional Knowledge:
- Python and SQL programming languages - Expert level
- ETL/ELT pipeline development - Expert level
- Cloud platforms (Azure, AWS, or GCP) - Advanced level
- Software engineering fundamentals (version control with Git, testing) - Advanced level
- Data modeling and database architecture - Advanced level
- Azerbaijani language - Native or fluent
- English language - Advanced level
- MS Office Suite - Proficient level
Education:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or related technical field
Work Experience:
- 3-5 years of experience in a software engineering, data engineering, or backend development role
- Demonstrated experience building production-level data pipelines
- Experience with financial or ERP data systems preferred
- Experience with 1C or similar enterprise systems advantageous
- Proven, hands-on experience building data-intensive applications
Professional Skills and Competencies:
- Strong analytical and problem-solving abilities
- Attention to detail and commitment to data accuracy
- Ability to work independently and manage multiple priorities
- Effective communication skills for technical and non-technical audiences
- Project management and deadline adherence capabilities
- Continuous learning mindset for emerging technologies
Interested candidates are invited to send their CVs to the e-mail address shown under the Apply for job button, indicating “Data Engineer” in the subject line.