As a Data Engineer you turn the daily reach of millions of readers and viewers on our online platforms into a competitive advantage by collaborating closely with analysts and data scientists across all departments of the organisation. You build cloud-based data pipelines, both batch and streaming, and their underlying infrastructure.
As a Data Engineer you ensure the reliability and stability of our Group Data Platform and its data pipelines: you implement the required monitoring, you understand how to balance the cost of running cloud processes against the benefits of reliable, fast delivery, and you continually improve production systems. In short: you live up to our principle ‘You Build It, You Run It’.
How will you make a difference?
One day you build an ETL pipeline to feed our reporting system; the next you build a real-time customer profile by aggregating a reader’s online behaviour and use that profile to recommend other articles on our online platforms.
Jointly develop and operate the cloud-based data pipelines and platform, from inception and design through deployment, operation and refinement
Collaborate closely with data scientists and analysts in daily work, on colocation days, and in our data engineering guild and communities of practice
Play a leading role in harmonising our data landscape across countries, departments and acquisitions
Your profile
Experience implementing highly available and scalable big data platforms
In-depth knowledge of at least one cloud provider, preferably AWS
Experience developing in at least one of the following languages in the context of data engineering: Scala or Python
Understanding of modern software engineering best practices such as design patterns, microservices and chaos engineering
DevOps experience (setting up CI/CD pipelines, setting up systems ...)
You see the value of a team and enjoy working with others, including through techniques such as pair programming
You either have an AWS certification or are willing to achieve AWS certification within 6 months (minimum AWS Certified Associate)
Nice-to-have knowledge:
Experience building highly automated infrastructure using Terraform
Expertise in designing, analysing and troubleshooting large-scale distributed systems
Experience defining and deploying monitoring, metrics and logging systems, as well as automated security controls
Understanding of modern database concepts and best practices such as ETL and data warehousing
Systematic problem-solving approach, coupled with strong communication skills
Snowplow data knowledge
Good SQL knowledge
Experience with Snowflake and dbt tooling
Offer
You will be working at a leading media company bustling with fun colleagues.
Like you, they are passionate about digital and offline media and are continuously learning new things from each other and from the best in the trade. You will set out on a journey where every week is different from the last, and where you are encouraged daily to take things to the next level. As a cherry on top, we offer you an attractive salary package and corresponding benefits (company car, group and health insurance, 32 days of paid leave, and a renowned company restaurant).