Data Engineer

We are looking for a Data Engineer to join the data team at Matrixian Group. You will help design and develop the pipelines and algorithms that build and extend features in the Matrixian Platform, working with fellow data engineers to maintain and improve our infrastructure for collecting, structuring, improving and sharing data with downstream and client-facing applications for Customer Intelligence.

Full-time · Amsterdam

About the job role

As a data engineer you'll work on data collection, data pipelines, ETL, databases, and APIs, and help further develop our cloud- and data-center-hosted infrastructure. Your work is instrumental in keeping our complex, distributed platform resilient to data-integrity issues and service disruptions, and free from vulnerabilities. You will structure our data so it flows efficiently and securely through the pipelines, meeting client SLAs and security requirements. You are passionate about working with the latest technologies that make handling large amounts of data easy, proactively keep yourself up to date, and are always looking for ways to apply new technologies to our benefit. You will work closely and collaboratively in an Agile environment with our data scientists/analysts, platform development engineers and product teams to take Matrixian data services from conception to production deployment.

Job elements

  • Collaborate with domain experts, analysts and data scientists to solve data challenges
  • Apply data modelling methodologies and contribute to a robust data platform
  • Design, develop, and test new data-related features for our platform
  • Design, implement, and maintain scalable data pipelines
  • Monitor established data pipelines
  • Create high-quality code that is scalable, reliable and reusable
  • Optimize data pipelines and workflows to maximize operational performance and efficiency
  • Give input on how to improve the scalability and security of our data platform
  • Enable and run data migrations across different databases and different servers
  • Manipulate, process and extract value from large disconnected datasets
  • Manage individual project priorities, deadlines, and deliverables


Requirements

  • Strong programming skills in Python
  • In-depth knowledge of data manipulation and transformation
  • Hands-on experience building complex ETL data pipelines
  • Experience setting up and maintaining both SQL and NoSQL databases
  • Skilled in the use of deployment and provisioning automation tools, e.g. Docker
  • Strong expertise/background with Linux
  • Knowledge of security, authentication and authorization, e.g. LDAP
  • Knowledge of data modeling and data warehousing concepts
  • Experience with version control/Git
  • Strong written and verbal English communication skills
  • Experience in Agile/Scrum
  • Team player
  • Responsible
  • Flexible

Must-have skills

  • MongoDB
  • ElasticSearch
  • MySQL
  • Python 3.7 & 3.8
  • Experience with at least two of these Python packages: FastAPI, Flask, Requests, BeautifulSoup, Selenium
  • Experience with at least two of these Python libraries: asyncio, concurrent.futures, multiprocessing, threading
  • Git / Version control
  • Docker
  • CI/CD
  • Bash
  • Linux
  • Apache Airflow
  • Web scraping

What we offer

  • A good salary
  • 25 days of holiday
  • 8% holiday allowance
  • Lots of responsibility, independence and room for initiatives and possibilities to make your mark


If the shoe fits, apply!