
Senior Data Engineer

Location: Prague, Czech Republic.
Company Description:

We Dream. We Do. We Deliver.

As a full-service, data-driven customer experience transformation agency, we partner with top-500 companies in the DACH region and in Eastern Europe. Originally from Switzerland, Merkle DACH was created through a merger of Namics and Isobar - two leading full-service digital agencies.

Our 1200+ digital enthusiasts are innovating the way brands are built by providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS. We are part of the global Merkle brand, the largest brand within the dentsu group, which shares with us a network of over 66,000 passionate individuals in 146 countries.

To a large extent, our business involves integrating and adapting standard software as part of our customer projects. To meet varied requirements, we rely on long-term partnerships with leading technology providers.

We are looking for a savvy Data Engineer to join our team of data heroes. You will be responsible for designing and building Big Data architecture pipelines for data lakehouses in the cloud, as well as optimizing and productionizing Machine Learning and predictive models. The ideal candidate is an experienced software engineer and data wrangler who enjoys building complex platforms from the ground up using the latest cloud technologies. You will cooperate with data architects and data scientists on large data projects for the biggest international brands, and build an internal platform framework to ensure consistent, optimal delivery. You should be a versatile self-starter eager to roll out next-gen data architectures, comfortable supporting multiple technologies, teams, solutions, and clients, and a great team player able to work within our international team with a positive, startup-minded attitude.

Job Description:

What you will do

  • Design & implement data ingestion and processing of various data sources using cloud (MS Azure, AWS, Google Cloud) ‘big data’ technologies like Spark, Databricks, Glue, Airflow, Kafka, Data Factory, NoSQL DBs, SageMaker, ML Studio
  • Create and maintain data tools that help data scientist / analyst teams build, optimize, and productionize data algorithms such as AI / Machine Learning models
  • Assemble large, complex data sets that meet functional / non-functional business requirements for data lakehouse
  • Develop data pipelines that provide actionable insights into marketing automation, customer acquisition, and other key business areas
  • Deploy DevOps automation of continuous development / test / deployment processes
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs, like optimizing existing data delivery, re-designing infrastructure for greater scalability, etc.
  • Support pre-sales by proposing technical solutions and accurate effort estimates

Required Skills

  • Experience in building and productionizing big data architectures, pipelines and data sets.
  • Understanding of (big) data concepts and patterns (data lake, lambda architecture, stream processing, DWH, BI & reporting)
  • 2+ years of experience in a Data Engineer role, including experience with the following software/tools:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with object-oriented / functional programming languages: Python, Scala, Java, R, C++, etc.
    • Experience with MS Azure (Databricks, Data Factory, Data Lake, Cosmos DB, Event Hub, Power BI) or AWS (Glue, EC2, EMR, RDS, Redshift, SageMaker) cloud services
    • Experience implementing large-scale data/event-oriented pipelines and workflows using ETL tools
    • Extensive working experience with relational (MS SQL, Oracle, PostgreSQL, Snowflake, Redshift, …) and NoSQL (Cassandra, MongoDB, Elasticsearch, Redis, …) databases
  • Strong analytical skills related to working with (un)structured datasets.
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
  • Experience in setting up and using CI/CD automation tools
  • Strong project management and organizational skills.

Preferred Skills

  • Deep hands-on development experience in MS Azure or AWS environments
  • Past experience delivering business intelligence projects using tools like Power BI, Tableau, Qlik Sense, Keboola, …
  • Working knowledge of message queuing, stream processing, and highly scalable real-time data processing using technologies like Storm, Spark Streaming, etc.
  • Experience with data pipeline / workflow management tools like Airflow, NiFi, StreamSets, Glue, Azure Data Factory etc.

Additional Information:

Why should you work with us?

  • We cooperate with clients of all types and sizes from all around the world, which lets us offer our people a very diverse portfolio of projects and technologies to choose from.
  • You will be part of a startup-minded branch of the company, backed by the very strong and stable Swiss/German headquarters.
  • Your contributions will have a distinct impact on our clients.

With us, you will become part of: 

  • An international team, where you can gain new/relevant experience.
  • A dynamic environment where you will never fall into routine work.
  • Astounding career-path growth: our clients grow, our team grows, you grow!
  • Start-up agile atmosphere.
  • A friendly international team of creative minds.

 And we offer even more!

  • Brand new offices in Prague with great accessibility.
  • Laptop and mobile phone with an international tariff, also for your private use.
  • Cafeteria of benefits to choose from – life insurance, pension insurance, Edenred Cafeteria, and more to come.
  • Meal vouchers.
  • 5 weeks of paid vacation (25 days)
  • Sick Days
  • Medical advisory system – ulekare.cz
  • Soulmio – well-being benefit.
  • We value self-education and learning new technologies, so we support all our team members in obtaining new certifications, attending learning tutorials and conferences etc.
  • Flexible working hours and home-office.
  • Opportunity to travel.
  • We have employee breakfasts regularly. We also enjoy beer or wine and create plenty of opportunities to get together for those who enjoy life outside of work as well.
  • Relocation support for candidates outside of the Czech Republic if needed.

If you are interested in this rewarding opportunity, please contact me as soon as possible to discuss it in more detail.


More Information:

Graduate Opportunities: Whether you're still studying, recently graduated or are already working and fancy a career hop, we could have a perfect opportunity for you.
Experienced Hires: Leverage your expertise, challenge the status quo and grow your career at Merkle.
