Data Engineer

  • Deadline: 5 April 2025

As a Data Engineer, you’ll develop, build, maintain, and manage data pipelines. This requires working with large datasets, databases,
and the software used to analyze them.
Job responsibilities:

  • Design, develop, and optimize ETL/ELT workflows using tools such as DBT and Apache NiFi, moving data from various sources to big data platforms.
  • Design, develop, and optimize ETL/ELT workflows using SAP SLT and SAP BW for data replication from SAP ERP systems to SAP HANA and big data platforms.
  • Implement data integration solutions using technologies like Hadoop, Spark, Kafka, or similar.
  • Develop data models for big data storage and processing.
  • Develop dimensional data models for data warehouse and OLAP storage.
  • Develop data marts as data products for data science and business intelligence workloads.
  • Write and optimize code in programming languages such as Python, Java, or Scala to process large datasets.
  • Automate data processing tasks and develop custom data solutions.
  • Develop real-time and batch data ingestion pipelines using Apache NiFi and other data ingestion/mediation tools.
  • Develop streaming ETL/ELT data pipelines using Kafka, Apache Spark, and Apache Flink.
  • Develop and fine-tune existing CI/CD pipelines using GitLab.
  • Develop and fine-tune process orchestration using Apache Airflow.
  • Develop and manage ETL/ELT processes to ensure efficient data flow.
  • Monitor and improve the performance of data extraction, loading, and transformation processes.
  • Implement data quality checks and validation procedures.
  • Ensure compliance with data governance policies, including data security and privacy standards.
  • Work with cross-functional teams, including data scientists, analysts, and business stakeholders, to meet data requirements.
  • Provide technical support and troubleshoot data-related issues.

Job requirements:

  • Bachelor's degree in Computer Science, Information Systems, or related field; Master's degree is a plus.
  • 3-5 years of experience in data engineering.
  • Experience with SAP reporting stack (SAP BW, SAP HANA, SAP BO Universe design).
  • Experience with big data technologies (e.g., Hadoop, Spark).
  • Basic experience with data streaming technologies (Kafka and Spark Streaming).
  • Strong programming skills in Python, Java, or Scala.
  • Experience with Apache Airflow.
  • Solid understanding of ETL/ELT processes and data warehousing concepts.
  • Solid understanding of SQL and NoSQL databases (MS SQL, PostgreSQL, SAP HANA, Cassandra).
  • Experience with containerization (Docker; Kubernetes from a developer perspective).
  • Basic understanding of oil and gas industry processes and data requirements is a plus.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and teamwork abilities.

We offer:

  • Five-day work week, 8:00–17:00 or 9:00–18:00;
  • Meal allowance;
  • Annual performance and project bonuses;
  • Corporate health program: VIP voluntary insurance and special discounts for gyms;
  • Access to Digital Learning Platforms.

Note: Only candidates who meet the requirements of the vacancy will be contacted for the next stage.
