Client: IT Services Center
Office Location: Bucharest
Contract Duration: At least 12 months
Project No.: 001190925

General

We are looking for Senior Data Engineers to join our client’s Data Team, contributing to the design, development, and optimization of scalable data pipelines. The ideal candidate will have expertise in big data technologies, Google Cloud Platform (GCP) services, and data engineering best practices that enable efficient data transformation, warehousing, and analytics.

 

Responsibilities/Activities

  • Design, develop, and optimize ETL/ELT pipelines and data transformation workflows using SQL and GCP-native ETL tools
  • Ensure high performance, scalability, and cost efficiency while supporting data warehousing initiatives
  • Build, optimize, and maintain scalable data pipelines leveraging Python with PySpark or Apache Beam
  • Build and manage big data processing solutions leveraging Apache Spark on GCP Dataproc and Dataflow for batch and streaming workloads
  • Utilize Cloud Storage, BigQuery, and other GCP data lake services to enable scalable and efficient storage, retrieval, and analytics of large datasets
  • Develop and implement robust data models to support Data Warehousing (BigQuery) and Business Intelligence reporting (Looker, Data Studio)
  • Ensure data quality, validation, and optimization to maintain consistent, reliable, and trustworthy datasets
  • Leverage GCP cloud-native services (including Pub/Sub, Cloud Functions, Cloud Run, and Workflows) to design and deliver scalable, event-driven data solutions
  • Collaborate closely with BI and analytics teams to facilitate seamless access to data and support actionable insights
  • Implement and maintain CI/CD pipelines for automated deployment, monitoring, and operational management of data solutions

Requirements

Technical

  • At least 5 years of experience in Data Engineering
  • Strong proficiency in Python and SQL
  • Experience with PySpark or Apache Beam
  • Experience with workflow orchestration (Apache Airflow / Cloud Composer, GCP Workflows)
  • Expertise in GCP Data Lake & Warehouse stack (Cloud Storage, BigQuery, Dataproc, Dataflow, Pub/Sub)
  • Experience with BI tools (Looker, Data Studio preferred)
  • Strong knowledge of ETL best practices, Data Modeling, and Data Warehousing
  • Solid understanding of CI/CD pipelines and DevOps principles

Education

  • University degree in Computer Science, Mathematics, or a related field

Others

  • Good level of English (oral and written)
  • Strong analytical and problem-solving skills
  • Ability to quickly learn and adapt to new technologies
  • Ability to work in a fast-paced, agile environment

Nice-to-have requirements

  • Experience with Snowflake or multi-cloud architectures
  • Knowledge of Terraform and/or Deployment Manager
  • Experience in the financial-banking industry

Apply for this position

Allowed file type(s): .pdf, .doc, .docx
