Principal Engineer, Data Platforms

Ready for a challenge? Then Just Eat might be the place for you. We're a leading global online delivery platform, and our vision is to empower everyday convenience. Whether it's a Friday-night feast, a post-gym poke bowl, or grabbing some groceries, our tech platform connects tens of millions of customers with hundreds of thousands of restaurant, grocery and convenience partners across the globe.

About this role

We are seeking a highly accomplished and visionary Principal Engineer to join our Data Platforms leadership team. You will serve as the top technical authority, reporting directly to the Director of Engineering and pairing closely with the Head of Engineering. This role defines the technical strategy and architecture that enable our organisation of 50+ engineers to build and operate robust, scalable, high-performance data systems. We value your ability to lead on strategy, deliver technical excellence, and care for the continuous growth of our engineering team.

Location: Hybrid - 3 days a week from our London or Amsterdam office, 2 days working from home
Reporting to: Director of Engineering

These are some of the key components of the position:
- Define the long-term technical vision and roadmap for our modern data platform, Mobius (including data warehousing, data lake, streaming, and governance tooling).
- Design and govern the reference architecture for core data infrastructure, ensuring scalability, reliability, and security.
- Act as a hands-on contributor, tackling the most complex technical challenges and providing code-level guidance for critical components.
- Establish and enforce technical standards for code quality, observability, and Infrastructure as Code (IaC) across all data platform teams.
- Act as a technical mentor and coach for Senior and Staff Engineers, raising the technical bar across the organisation.
- Focus on platform engineering principles to improve the developer experience, velocity, and efficiency of all data engineering teams.
- Lead the evaluation, prototyping, and adoption of new data technologies, balancing industry best practices with business needs.

What will you bring to the team?
- Extensive experience in software and data engineering, with a proven record of operating at a Principal, Staff, or equivalent level.
- Deep proficiency across the modern data stack (Snowflake, BigQuery, Delta Lake, Iceberg, Kafka, Flink).
- Proven track record designing scalable, self-service data platforms using cloud-native services (AWS/GCP) and infrastructure automation (Terraform, Ansible).
- Expert proficiency in Python, Scala, or Go, and extensive experience with data transformation and orchestration frameworks (Airflow, dbt).
- Exceptional ability to synthesise complex requirements into simple, elegant, and maintainable architectural designs.
- Strong communication skills with the ability to influence and align engineering, product, and executive stakeholders.
- Results-driven mindset with the ability to execute quickly, adapt to change, and thrive in high-growth, fast-paced environments.

Desired Skills
- Experience in a rapidly scaling organisation focused on building distributed systems.
- Familiarity with data governance, lineage, and observability tools (e.g., Datadog, Prometheus, OpenTelemetry).
- Strong understanding of Machine Learning Operations (MLOps) and how data platforms support the full ML lifecycle.

At JET, this is on the menu:
Our teams forge connections internally and work with some of the best-known brands on the planet, giving us truly international impact in a dynamic environment. Fun, fast-paced and supportive, the JET culture is about movement, growth and celebrating every aspect of our JETers.
Inclusion, Diversity & Belonging

No matter who you are, what you look like, who you love, or where you are from, you can find your place at Just Eat. We're committed to creating an inclusive culture, encouraging diversity of people and thinking, in which all employees feel they truly belong and can bring their most colourful selves to work every day.

What else is cooking?

Want to know more about our JETers, culture or company? Have a look at our career site, where you can find people's stories, blogs, podcasts and more JET morsels.

Are you ready to take your seat? Apply now!
Jan 17, 2026
Full time
Snowflake Senior Developer

Location: London (Hybrid)
Employment Type: Full-time
Seniority: Senior Individual Contributor
Reports to: Data Engineering Lead / Manager

Summary of role
We are seeking a Snowflake Senior Developer to design, develop, and optimise data solutions on our cloud data platform. You will work closely with data engineers, analysts, and architects to deliver high-quality, scalable data pipelines and models. Strong expertise in Snowflake, ETL/ELT, data modelling, and data warehousing is essential.

Responsibilities
- Snowflake Development: Build and optimise Snowflake objects (databases, schemas, tables, views, tasks, streams, resource monitors).
- ETL/ELT Pipelines: Develop and maintain robust data pipelines using tools such as dbt, Airflow, Azure Data Factory, or similar.
- Data Modelling: Implement dimensional models (star/snowflake schemas), handle slowly changing dimensions (SCDs), and design efficient structures for analytics.
- Performance Tuning: Optimise queries and manage clustering, caching, and warehouse sizing for cost and speed.
- Data Quality: Implement testing frameworks (dbt tests, Great Expectations) and ensure data accuracy and freshness.
- Security & Governance: Apply role-based access control (RBAC) and masking policies, and comply with data governance standards.
- Collaboration: Work with BI teams to ensure semantic alignment and support self-service analytics.
- Documentation: Maintain clear technical documentation for pipelines, models, and processes.

Qualifications
- Matric and a degree in IT.
- Strong SQL skills (complex queries, performance tuning) and proficiency in Python for data processing.
- Experience with ETL/ELT tools (dbt, Airflow, ADF, Informatica, Matillion).
- Solid understanding of data warehousing concepts (Kimball, Data Vault, normalisation).
- Familiarity with cloud platforms (Azure preferred; AWS/GCP acceptable).
- Knowledge of data governance, security, and compliance (GDPR).
- Excellent problem-solving and communication skills.

Skills
- Experience with Snowpark, UDFs, dynamic tables, and external tables.
- Exposure to streaming/CDC (Kafka, Fivetran, Debezium).
- BI tool integration (Power BI, Tableau, Looker).
- Certifications: SnowPro Core or Advanced.
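The data-modelling responsibility above mentions handling slowly changing dimensions (SCDs). As an illustrative sketch only (every name here is hypothetical, not part of the role), the row-versioning logic of a Type 2 SCD can be expressed in plain Python:

```python
from datetime import date

def scd2_apply(dim_rows, incoming, today=None):
    """Apply a Type 2 slowly-changing-dimension update.

    dim_rows: list of dicts with keys 'key', 'attrs', 'valid_from',
              'valid_to' (valid_to=None marks the current version).
    incoming: dict mapping business key -> latest attribute dict.
    Changed keys get their current row closed out and a fresh current
    row appended; unchanged and historical rows pass through as-is.
    """
    today = today or date.today()
    out, seen = [], set()
    for row in dim_rows:
        key = row["key"]
        if row["valid_to"] is None and key in incoming:
            seen.add(key)
            if row["attrs"] != incoming[key]:
                # Close the old version and open a new current one.
                out.append({**row, "valid_to": today})
                out.append({"key": key, "attrs": incoming[key],
                            "valid_from": today, "valid_to": None})
                continue
        out.append(row)
    # Brand-new keys become current rows with no prior history.
    for key, attrs in incoming.items():
        if key not in seen and not any(r["key"] == key for r in dim_rows):
            out.append({"key": key, "attrs": attrs,
                        "valid_from": today, "valid_to": None})
    return out
```

In Snowflake itself this pattern is normally implemented with a MERGE statement or a streams-plus-tasks pipeline rather than in application code; the Python version only makes the versioning logic explicit.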
Jan 15, 2026
Contractor
Job Title: Senior Systems Developer

DETAILS
We are seeking a highly skilled Senior Systems Developer with extensive experience in data architecture, system design, and enterprise-level application development. The successful candidate will be responsible for building scalable systems, designing robust data models, and guiding the technical direction of backend and data-driven solutions across the organisation.

DUTIES & RESPONSIBILITIES
- Design, develop, and maintain sophisticated backend systems, APIs, and services.
- Lead architectural decisions to ensure systems are scalable, secure, and high-performing.
- Implement best practices for software engineering and cloud-native development.
- Collaborate with cross-functional teams (Data Engineering, DevOps, Product, QA) to conceptualise and deliver high-quality solutions.
- Define and implement enterprise data models, data flows, and database schemas.
- Architect and maintain data pipelines, data lakes, and data warehouses.
- Optimise data storage, retrieval, partitioning, and indexing strategies for performance and scalability.
- Ensure data quality, governance, lineage, and compliance with security standards.
- Develop integrations between internal and external systems using APIs, ETL tools, and messaging systems.
- Automate workflows, monitoring, and deployment processes.
- Drive platform modernisation initiatives and migrations to the cloud.
- Participate in code reviews, architecture meetings, and technical strategy discussions.
- Provide expert guidance on system performance, scalability, and troubleshooting.

SKILLS, EXPERIENCE & QUALIFICATIONS
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related discipline.
- Minimum 8 years of experience in systems development, with at least 2 years dedicated to data architecture.
- Demonstrated success in delivering enterprise-grade systems and data platforms.
- Strong programming expertise in Python, plus AI/ML skills.
- Deep understanding of system architecture, design patterns, and microservices.
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
- Expertise in SQL and NoSQL database technologies.
- Knowledge of ETL/ELT frameworks, data modelling, and data governance.
- Familiarity with containerisation and orchestration tools such as Docker and Kubernetes.
- Awareness of security frameworks, including authentication and authorisation protocols.
- Analytical and problem-solving capabilities.
- Excellent communication and documentation skills.
- Ability to work independently and lead cross-functional teams.
- Adaptability to rapidly evolving technological environments.

PREFERRED SKILLS
- Airflow, dbt, Spark, Kafka, RabbitMQ, Redis.
- Git, CI/CD pipelines.
- Experience with data warehousing solutions such as Snowflake, Redshift, BigQuery, or Synapse.
- Exposure to AI/ML workflows and model deployment.
- Experience with streaming systems and real-time architecture.
- Knowledge of event-driven and serverless architectural patterns.

Salary: £42,500 - £45,500 DOE
Type: Permanent
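The duties above mention partitioning strategies for storage and retrieval. As a minimal, stdlib-only sketch (hypothetical function names, not taken from the role), the key property a partitioning scheme must provide is that the same key always lands in the same bucket across runs:

```python
import hashlib

def partition_of(key, num_partitions):
    """Map a record key to a stable partition number.

    Uses MD5 rather than Python's built-in hash(), which is salted
    per process and therefore not stable across runs.
    """
    digest = hashlib.md5(str(key).encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions

def partition_records(records, key_field, num_partitions):
    """Group records (dicts) into buckets by hashing key_field."""
    buckets = {i: [] for i in range(num_partitions)}
    for rec in records:
        buckets[partition_of(rec[key_field], num_partitions)].append(rec)
    return buckets
```

Real systems (Snowflake micro-partitions, Kafka's default partitioner) use their own schemes; this sketch only demonstrates the deterministic key-to-bucket mapping that makes partitioned reads and writes predictable.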
Jan 09, 2026
Full time