Client: IT Services Center
Office Location: Bucharest
Contract Duration: At least 12 months
Project No.: 001120925

General

We are looking for Senior Data Engineers to join our client’s Data Team and contribute to the design, development, and optimization of scalable data pipelines. The ideal candidate will have expertise in big data technologies, AWS cloud services, and data engineering best practices, enabling efficient data transformation, warehousing, and analytics.


Responsibilities/Activities

  • Design, develop, and maintain scalable, high-performance data pipelines using Python and PySpark
  • Build and optimize ETL/ELT pipelines and data transformation processes using SQL and ETL frameworks, ensuring performance, scalability, and cost-efficiency while supporting data warehousing solutions
  • Work with big data processing technologies such as Apache Spark on AWS Glue and EMR
  • Utilize Amazon S3, AWS Glue, and AWS Lake Formation for scalable data storage and analytics
  • Design and implement data models to support Data Warehousing (Redshift) and Business Intelligence (BI) needs
  • Ensure data quality, validation, and optimization to maintain reliability and consistency
  • Leverage AWS cloud-native services (Lambda, Step Functions, Kinesis) to build scalable data solutions
  • Collaborate with BI and analytics teams to ensure seamless data accessibility and insights delivery
  • Implement CI/CD pipelines for automated deployment, monitoring, and management of data solutions

Requirements

Technical

  • At least 5 years of experience in Data Engineering
  • Strong proficiency in Python and PySpark
  • Experience with workflow orchestration (Apache Airflow, AWS Step Functions)
  • Expertise in AWS Data Lake & Warehouse stack (S3, Glue, Redshift, EMR, Athena)
  • Experience with BI tools (QuickSight preferred)
  • Advanced SQL skills for data manipulation, validation, and optimization
  • Strong knowledge of ETL best practices, Data Modeling, and Data Warehousing
  • Proven ability to scale ETL pipelines
  • Solid understanding of CI/CD pipelines and DevOps principles

Education

  • University degree in Computer Science, Mathematics, or a related field

Others

  • Good level of English (oral and written)
  • Strong analytical and problem-solving skills
  • Ability to quickly learn and adapt to new technologies
  • Ability to work in a fast-paced, agile environment

Nice-to-have requirements

  • Experience with streaming data solutions (Kafka, Pub/Sub)
  • Experience with Snowflake
  • Knowledge of Terraform and/or CloudFormation
  • Experience in the financial-banking industry

