Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

17 jobs found

Current search: senior lead software engineer python and databricks
Senior Data Scientist
Grosvenor Casinos Limited
Job Description

We want to expand our Data Science function within our well-established, data-driven Centralised Analytics department. Our Data Science mission is to build machine learning models in the production environment for Marketing, Customer Insights, and Safer Gambling, and to establish a strong culture of data-driven decision-making in our organisation's strategy. We are looking for an established Data Scientist who wants new challenges. As a Senior Data Scientist, you will use data engineering, statistical, and ML/AI approaches to uncover data patterns and build models. We use the Microsoft tech stack, including Azure Databricks (PySpark, Python), and we are expanding our data science capabilities.

To be successful in the role, you will need extensive experience in data science projects and the professional judgement to recognise when an approach is not working, pause, and change course. The Data Science department is currently a small team with an ambition to grow, made up of a mix of Data Scientists and ML engineers, so it is an excellent opportunity to grow, contribute and challenge yourself. We are not an isolated function: expect to work closely with business stakeholders, data engineers, marketing analysts and BI analysts to improve our existing models, create new models, and share our expertise.

Core Responsibilities
• Apply advanced statistical techniques and ML/AI models in development and production environments
• Collaborate with team members and stakeholders to build data science products that enable others to make business decisions

Qualifications
• Postgraduate degree in a relevant discipline (e.g. STEM, Maths, Statistics, Physics) or equivalent experience
• Good data modelling and software engineering knowledge, and strong knowledge of statistical, mathematical and ML modelling
• Skilful in writing well-engineered code
• Proven experience working with ML engineers and production systems (including cloud platforms)
• Proven ability to analyse large data sets, and experience building ML/AI models in production with the ability to translate them into insights and actionable business recommendations
• Strong technical and commercial communication and collaboration skills, including presentation skills
• Passion for learning and keeping abreast of new technologies and data models

Additional Information

Join us to unlock benefits and opportunities that will boost your career journey in a vibrant, inclusive and fulfilling work environment. From hybrid working and colleague support networks to menopause support and weekly PepTalks, we're here for you. We'll also invest in your growth by providing development opportunities, leadership training and cutting-edge industry certifications, so you have the tools and resources to help you work, win and grow with us. Immerse yourself in new cultures and gain international exposure through our global business, and collaborate with colleagues from around the globe. From pensions to bonus schemes, and private medical insurance to life insurance, we've got you covered. Our benefits vary by brand and/or location; please have a chat with your local Talent Acquisition specialist to find out what's in place in your location. The Rank Group is committed to being an inclusive employer, ensuring that we better understand and meet the needs and requirements of our candidates and customers. We aim to do this by facilitating fair and equal access to our services. If you require a reasonable adjustment to be made, please reach out to let us know ahead of your interview.
Jul 03, 2025
Full time
Principal Data Engineer
Landmark Information Group
Remote (UK only)

The Opportunity

Are you ready to shape the future of data engineering at scale? We're looking for a Principal Data Engineer to join our high-performing Data Engineering team: a role ideal for experienced, hands-on professionals who thrive on technical leadership, innovation, and delivery. As a 100% data-driven company, we pride ourselves on engineering excellence and delivering impactful solutions to our clients.

Reporting directly to the Head of Data Engineering, you will play a crucial role in driving the team's vision and objectives to completion. You will provide technical leadership, own the solution, ensure the reliability of data products, and collaborate closely with your team and customers to optimise data solutions. This is a unique opportunity for a highly skilled, energetic and motivated senior lead data engineer with deep hands-on expertise in data engineering and architecture, a strong coding background, and a strategic mindset: someone who can balance technical depth with delivery focus and analytical leadership.

In this role, you will:
• Technical Leadership: Assist the Head of Data Engineering in overseeing the design, development, and optimisation of data software, data infrastructure and pipelines.
• Team Leadership: Lead and mentor a team of talented data experts, both permanent staff and contractors, to deliver cutting-edge solutions, ensuring that best practices in data engineering and software development are followed. Lead by example and be hands-on.
• Hands-On Delivery: Contribute directly to technical challenges, write high-quality code, and guide architectural decisions.
• Data Strategy, Solutions & Ownership: Own the technical roadmap, aligning engineering efforts with business goals and ensuring timely delivery, quality control, and innovation. Inspire the team by providing a clear vision for technical excellence and innovation in the data engineering strategy.
• Cloud: Optimise cloud-based data solutions, storage and processing systems, drawing on hands-on experience with AWS and on-premises services.
• Technical Excellence: Champion best practices, automation and optimisation in coding, architecture, and performance. Foster a team culture focused on continuous improvement, where learning is encouraged.
• Collaboration: Work closely with customers, the PMO, and business stakeholders to deliver high-impact, cost-effective solutions.
• Assemble Large, Complex Data Sets: Craft and manage data sets that meet both functional and non-functional business requirements, ensuring high data quality and integrity.
• HMLR Long-Term Programme: As your first major engagement, you will contribute to the HM Land Registry (HMLR) programme, expected to run through to the end of 2028. This is one of the UK Government's largest and most ambitious digital data transformation initiatives, aiming to consolidate all Local Authority Land Charge registers across England and Wales into a single, centralised Land Registry-maintained system.

You will empower the engineering team to deliver innovative solutions while fostering a collaborative and inclusive environment. As a mentor, you will support Data Engineers and Data Analysts in overcoming technical challenges and ensuring timely, high-quality delivery.

About You

We're looking for a passionate, technically strong leader who can inspire and elevate those around them. You'll bring:
• Depth of expertise: seasoned hands-on experience in data engineering, with a track record of leading complex data engineering initiatives at scale. Extensive experience in designing, implementing, and optimizing data solutions, supported by a history of successfully managing technical teams and delivering data projects. Exceptional coding skills.
• Degree in Computer Science, Software Engineering, or similar, applied to data or with a data specialisation.
• Extensive experience in data engineering and data analytics.
• Expert knowledge of data technologies and data transformation solutions and tools.
• Strong analytical and problem-solving abilities.
• Good understanding of quality and information security principles.
• Effective communication, with the ability to explain technical concepts to a range of audiences.
• Able to provide coaching and training to less experienced members of the team.

Essential skills:
• Programming languages such as Spark, Java, Python, PySpark, Scala, etc. (minimum of 2)
• Extensive hands-on data engineering and data analytics experience (coding, configuration, automation, monitoring, security, etc.)
• FME
• Advanced database and SQL skills

Certifications: AWS or FME certifications are a plus.

Nice-to-have skills:
• Experience with ETL tools such as AWS Glue, Azure Data Factory, Databricks, etc.

Join us and lead the charge in transforming the data landscape at Landmark, while advancing your career in a dynamic and forward-thinking environment.

What it's like to work at Landmark:

At Landmark, you'll find a friendly, dynamic, and supportive team that values bold ideas, big dreams, and active curiosity. We foster a culture of innovation, encouraging everyone to contribute to the development and direction of our products and services, while continuously seeking new and efficient ways to work. Collaboration and sociability are at the heart of what we do, and we take pride in coming together to achieve great things.

We offer a range of benefits to support your well-being and career growth, including:
• Generous holiday allowance: 25 days' holiday plus bank holidays, with the option of adding up to 5 additional unpaid leave days per year
• Annual lifestyle allowance: £300 to spend on an activity of your choice
• Pension scheme: matched up to 6% for the first 3 years, and up to 10% thereafter
• Private health insurance: provided by Vitality
• Group income protection scheme
• Charitable fundraising: matched funding for your efforts
• Cycle to Work and Gym Flex schemes
• Internal coaching and mentoring: available throughout your time with us
• Training and career progression: a strong focus on your development
• Family-friendly policies
• Free parking

Join us at Landmark and be part of a team that supports your ambitions and growth, both personally and professionally.

About Us

Landmark Information Group holds a wide portfolio of market-leading PropTech (property technology) businesses that span an incredible range of markets and technology platforms across the sector. We are at the forefront of innovation and thought leadership in the property industry, and a supplier of national property-related data. We deliver award-winning solutions to estate agency, conveyancing, surveying, lender valuation, land asset management, environmental consultancy, and Government markets. This is a chance to join the organisation as we make major steps forward in leveraging the latest cloud and large-scale technologies to bring the entire market together on a unified platform. We are proud to be an equal opportunities employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Jul 01, 2025
Full time
Data Scientist
Hirebridge
RLDatix is on a mission to transform care delivery worldwide, ensuring every patient receives the safest, highest-quality care. Through our innovative Healthcare Operations Platform, we're connecting data to unlock trusted insights that enable improved decision-making and help deliver safer healthcare for all. At RLDatix we're making healthcare safer, together. Our shared passion for meaningful work drives us, while a supportive, respectful culture makes it all possible. As a team, we collaborate globally to reach our ultimate goal: helping people.

We're searching for a UK-based Data Scientist to join our Data Platform team, so that we can build and scale innovative data science solutions that power critical decisions across our organization. The Senior Data Scientist will lead the development of machine learning models and intelligent data systems to support RLDatix's mission of enabling safer, more efficient healthcare worldwide.

How You'll Spend Your Time
• Designing and developing machine learning models, large language models, and algorithms to deliver meaningful insights from large datasets.
• Establishing and maintaining ETL workflows and data pipelines to ensure data consistency, quality, and usability across the platform.
• Optimizing machine learning models and AI systems to maximize performance, scalability, and real-world accuracy.
• Explaining complex data science concepts to engage stakeholders and promote data-driven decision-making across departments.
• Collaborating with engineers, QA, and product teams to align models and infrastructure with business goals.

What Kind of Things We're Most Interested in You Having
• A bachelor's or master's degree in Computer Science, Data Science, AI, Software Engineering, or a related field (preferred).
• 3+ years of experience in a data science-related role, with hands-on experience in data science engineering.
• Proficiency in programming languages such as Python and SQL.
• Experience with data processing and analysis tools and technologies such as TensorFlow, Keras, and scikit-learn.
• Strong understanding of large language models (LLMs) and machine learning (ML) algorithms and techniques.
• Familiarity with data visualization tools such as Power BI.
• Solid understanding of statistical analysis and hypothesis testing.
• Ability to work with structured and unstructured data sources.
• Knowledge of cloud computing platforms (AWS, Azure) and data security measures, including compliance with data governance standards, and of big data technologies.
• Experience with Databricks and Mosaic AI is a plus.

By enabling flexibility in how we work and prioritizing employee wellness, we empower our team to do and be their best. RLDatix is an equal opportunity employer, and our employment decisions are made without regard to race, color, religion, age, gender, sexual identity, national origin, disability, handicap, marital status or any other status or condition protected by UK law. As part of RLDatix's commitment to the inclusion of all qualified individuals, we ensure that persons with disabilities are provided reasonable accommodation in the job application and interview process. If reasonable accommodation is needed to participate in either step, please don't hesitate to send a note.
Jun 30, 2025
Full time
Senior Data Scientist
Chambers and Partners
Overview We're seeking a Senior Data Scientist to lead the development of advanced analytics and AI/ML solutions that unlock real value across our business. This is a 6-month contract role in which you'll work with proprietary and B2B research datasets to design, deliver, and scale data-driven products. Collaborating closely with teams in Product, Research, and Technology, you'll help turn strategic ideas into working MVPs, ensuring high standards of methodology, quality, and business relevance throughout. You'll also help shape the data science environment by working alongside our tech teams to support a robust and flexible infrastructure, including sandbox environments for onboarding and evaluating new data sources. This is a great opportunity for a self-driven, impact-oriented data scientist who thrives in a fast-paced, cross-functional setting and is eager to deliver meaningful results in a short time frame. Main Duties and Responsibilities Spearhead and execute complex data science projects using a combination of open-source and cloud tools, driving innovation and delivering actionable insights. Develop and deploy advanced machine learning models using cloud-based platforms. Collaborate with product managers and designers to ensure the feasibility of product extensions and new products based on existing proprietary, quantitative, and qualitative datasets. Work with outputs from Research and historical data to identify consistent and inconsistent product features and document precise requirements for improved consistency. Collaborate with designers, Tech colleagues, and expert users to come up with engaging ways to visualize data and outliers/exceptions for non-technical audiences. Design and develop novel ways to showcase and highlight key analysis from complex datasets, including joining across datasets that do not perfectly match. 
Collaborate with Product, Tech, Research, and other stakeholders to understand and define a new, marketable product from existing data. Create and present progress reports and ad-hoc reviews to key stakeholders and teams. Constantly think about and explain to stakeholders how analytics "products" could be refined and productionized in the future. Work with Tech colleagues to improve the Data Science workspace, including providing requirements for Data Lake, Data Pipeline, and Data Engineering teams. Expand on the tools and techniques already developed. Help us understand our customers (both internal and external) better so we can provide the right solutions to the right people, including proactively suggesting solutions for nebulous problems. Be responsible for the end-to-end Data Science lifecycle: investigation of data, from data cleaning to extracting insights and recommending production approaches. Responsible for demonstrating value addition to stakeholders. Coach, guide, and nurture talent within the data science team, fostering growth and skill development. Skills and Experience Delivering significant and valuable analytics projects/assets in industry and/or professional services. Proficiency in programming languages such as Python or R, with extensive experience with LLMs, ML algorithms, and models. Experience with cloud services like Azure ML Studio, Azure Functions, Azure Pipelines, MLflow, Azure Databricks, etc., is a plus. Experience working in Azure/Microsoft environments is considered a real plus. Proven understanding of data science methods for analyzing and making sense of research data outputs and survey datasets. Fluency in advanced statistics, ideally through both education and experience. Person Specification Bachelor's, Master's, or PhD in Data Science, Computer Science, Statistics, or a related field. 
Comfortable working with uncertainty and ambiguity, from initial concepts through iterations and experiments to find the right products/services to launch. Excellent problem-solving and strong analytical skills. Proven aptitude to learn new tools, technologies, and methodologies. Understanding of requirements for software engineering and data governance in data science. Proven ability to manage and mentor data science teams. Evidence of taking a company or department on a journey from Analytics to Data Science to AI and ML deployed at scale. Ability to translate complex analysis findings into clear narratives and actionable insights. Excellent communication skills, with the ability to listen and collaborate with non-technical and non-quantitative stakeholders. Experience working with client-facing and Tech teams to ensure proper data collection, quality, and reporting formats. Experience presenting investigations and insights to audiences with varying skill sets and backgrounds. Nice to have: experience working with market research methods and datasets. Nice to have: experience in the professional services or legal sector. B2B market research experience would be a significant plus.
Jun 30, 2025
Full time
Senior Data Engineer
Sandtech
Sand Technologies is a fast-growing enterprise AI company that solves real-world problems for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that have a real impact on the world. For example, our AI systems help to manage the water supply for the entire city of London. We created the AI algorithms that enabled the 7th largest telecommunications company in the world to plan its network in 300 cities in record time. And we built a digital healthcare system that enables 30m people in a country to get world-class healthcare despite a shortage of doctors. We've grown our revenues by over 500% in the last 12 months while winning prestigious scientific and industry awards for our cutting-edge technology. We're underpinned by over 300 engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data engineers create pipelines that support our data scientists and power our front-end applications. This means we do data-intensive work for both OLTP and OLAP use cases. Our environments are primarily cloud-native, spanning AWS, Azure and GCP, but we also work on systems running self-hosted open-source services exclusively. We strive towards a strong code-first, data-as-a-product mindset at all times, where testing and reliability, with a keen eye on performance, are non-negotiable. 
JOB SUMMARY A Senior Data Engineer has the primary role of designing, building, and maintaining scalable data pipelines and infrastructure to support data-intensive applications and analytics solutions. In this role, you will be responsible for not only developing data pipelines but also designing data architectures and overseeing data engineering projects. You will work closely with cross-functional teams and contribute to the strategic direction of our data initiatives. RESPONSIBILITIES Data Pipeline Development: Lead the design, implementation, and maintenance of scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources using tools such as Databricks, Python and PySpark. Data Architecture: Architect scalable and efficient data solutions using the appropriate architecture design, opting for modern architectures where possible. Data Modeling: Design and optimize data models and schemas for efficient storage, retrieval, and analysis of structured and unstructured data. ETL Processes: Develop, optimize and automate ETL workflows to extract data from diverse sources, transform it into usable formats, and load it into data warehouses, data lakes or lakehouses. Big Data Technologies: Utilize big data technologies such as Spark, Kafka, and Flink for distributed data processing and analytics. Cloud Platforms: Deploy and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), leveraging cloud-native services for data storage, processing, and analytics. Data Quality and Governance: Implement and oversee data governance, quality, and security measures. Monitoring, Optimization and Troubleshooting: Monitor data pipelines and infrastructure performance, identify bottlenecks and optimize for scalability, reliability, and cost-efficiency. Troubleshoot and fix data-related issues. DevOps: Build and maintain basic CI/CD pipelines, commit code to version control and deploy data solutions. 
Collaboration: Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand requirements, define data architectures, and deliver data-driven solutions. Documentation: Create and maintain technical documentation, including data architecture diagrams, ETL workflows, and system documentation, to facilitate understanding and maintainability of data solutions. Best Practices: Stay current with emerging technologies and best practices in data engineering, cloud architecture, and DevOps. Mentoring: Mentor and guide junior and mid-level data engineers. Technology Selection: Evaluate and recommend technologies, frameworks, and tools that best suit project requirements and architecture goals. Performance Optimization: Optimize software performance, scalability, and efficiency through architectural design decisions and performance tuning. QUALIFICATIONS Proven experience as a Senior Data Engineer, or in a similar role, with hands-on experience building and optimizing data pipelines and infrastructure, and designing data architectures. Proven experience working with Big Data and the tools used to process it. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. Excellent understanding of data engineering principles and practices. Excellent communication and collaboration skills to work effectively in cross-functional teams and communicate technical concepts to non-technical stakeholders. Ability to adapt to new technologies, tools, and methodologies in a dynamic and fast-paced environment. Ability to write clean, scalable, robust code using Python or similar programming languages. Background in software engineering is a plus. Knowledge of data governance frameworks and practices. Understanding of machine learning workflows and how to support them with robust data pipelines. 
DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc) Would you like to join us as we work hard, have fun and make history?
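As a rough illustration of the extract-transform-load pattern this role centres on, here is a minimal, framework-free Python sketch; the record fields and cleaning rules are illustrative assumptions, not details from the listing:

```python
def extract(rows):
    """Simulate ingesting raw records from a source system."""
    return [dict(r) for r in rows]

def transform(records):
    """Normalise names and drop records missing a customer id."""
    cleaned = []
    for r in records:
        if not r.get("customer_id"):
            continue  # data-quality rule: reject incomplete rows
        r["name"] = r["name"].strip().title()
        cleaned.append(r)
    return cleaned

def load(records, warehouse):
    """Append cleaned records to an in-memory 'warehouse' table."""
    warehouse.extend(records)
    return len(records)

raw = [
    {"customer_id": 1, "name": "  ada lovelace "},
    {"customer_id": None, "name": "unknown"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse[0]["name"])  # 1 Ada Lovelace
```

In practice the same three stages would run on Databricks with PySpark DataFrames rather than Python lists, but the shape of the pipeline is the same.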
Jun 28, 2025
Full time
Senior Data Platform Engineer
EasyJet
Company When it comes to innovation and achievement there are few organisations with a better track record. Join us and you'll be able to play a big part in the success of our highly successful, fast-paced business that opens up Europe so that people can exercise their get-up-and-go. With almost 300 aircraft flying over 1,000 routes to more than 32 countries, we're the UK's largest airline, the fourth largest in Europe and the tenth largest in the world. Set to fly more than 90 million passengers this year, we employ over 10,000 people. It's big-scale stuff and we're still growing. Job Purpose With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use data in a fast-paced organisation. You will join as a Senior Platform Data Engineer providing technical leadership to the Data Engineering team. You will work closely with our Data Scientists and business stakeholders to ensure value is delivered through our solutions. Job Accountabilities Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. Demonstrate highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, the Spark API, Python, SQL Server and Scala. Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning models and algorithms aimed at addressing specific business challenges and opportunities. Coach and mentor the team (including contractors) to improve development standards. Work with Business Analysts to deliver against requirements and realise business benefits. Build a documentation library and data catalogue for developed code/products. Oversight of project deliverables and code quality going into each release. 
Key Skills Required Technical Ability: has a high level of current, technical competence in relevant technologies, and is able to independently learn new technologies and techniques as our stack changes. Clear communication: can communicate effectively in both written and verbal forms with technical and non-technical audiences alike. Complex problem-solving ability: structured, organised, process-driven and outcome-oriented. Able to use historical experiences to help with future innovations. Passionate about data: enjoys being hands-on and learning about new technologies, particularly in the data field. Self-directed and independent: able to take general guidance and the overarching data strategy and identify practical steps to take. Technical Skills Required Significant experience designing and building data solutions on a cloud-based, big data distributed system. Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD), and software deployment automation with GitHub Actions or Azure DevOps. Experience in testing automation of data transformation pipelines, using frameworks like Pytest or dbt unit tests. Comfortable writing efficient SQL and debugging. Data warehouse operations and tuning experience in schema evolution, indexing, and partitioning. Hands-on IaC development experience with Terraform or CloudFormation. Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez, and data drift detection and alerting. 
Understanding of Data Management principles (security and data privacy) and how they can be applied to Data Engineering processes/solutions (e.g. access management, data privacy, handling of sensitive data subject to GDPR). Desirable Skills Experience in event-driven architecture, ingesting data in real time in a commercial production environment with Spark Streaming, Kafka, DLT or Beam. Understanding of the challenges faced in the design and development of a streaming data pipeline and the different options for processing unbounded data (pub/sub, message queues, event streaming, etc.). Understanding of the most commonly used Data Science and Machine Learning models, libraries and frameworks. Knowledge of the development lifecycle of analytical solutions using visualisation tools (e.g. Tableau, PowerBI, ThoughtSpot). Hands-on development experience in an airline, e-commerce or retail industry. Worked within the AWS cloud ecosystem. Experience of building a data transformation framework with dbt. What you'll get in return Competitive base salary Up to 20% bonus 25 days holiday BAYE, SAYE & Performance share schemes 7% pension Life Insurance Work Away Scheme Flexible benefits package Excellent staff travel benefits About easyJet At easyJet our aim is to make low-cost travel easy - connecting people to what they value using Europe's best airline network, great value fares, and friendly service. It takes a real team effort to carry over 90 million passengers a year across 35 countries. Whether you're working as part of our front-line operations or in our corporate functions, you'll find people that are positive, inclusive, ready to take on a challenge, and that have your back. We call that our 'Orange Spirit', and we hope you'll share that too. Apply Complete your application on our careers site. We encourage individuality, empower our people to seize the initiative, and never stop learning. 
We see people first and foremost for their performance and potential and we are committed to building a diverse and inclusive organisation that supports the needs of all. As such we will make reasonable adjustments at interview through to employment for our candidates.
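The pipeline test automation this listing mentions (e.g. with Pytest) might look something like the following minimal sketch; the transformation, field names, and test data are hypothetical, not taken from easyJet's stack:

```python
def deduplicate_bookings(bookings):
    """Keep only the latest booking per reference.

    Assumes the input list is ordered oldest-to-newest, so a later
    record for the same reference overwrites an earlier one.
    """
    latest = {}
    for b in bookings:
        latest[b["ref"]] = b
    return list(latest.values())

def test_deduplicate_keeps_latest():
    """Pytest-style unit test: duplicates collapse to the newest record."""
    bookings = [
        {"ref": "A1", "status": "pending"},
        {"ref": "A1", "status": "confirmed"},
        {"ref": "B2", "status": "confirmed"},
    ]
    result = deduplicate_bookings(bookings)
    assert len(result) == 2
    assert [b["status"] for b in result if b["ref"] == "A1"] == ["confirmed"]
```

Run with `pytest` and it discovers any `test_`-prefixed function automatically; the same assert-based style scales up to testing Spark or dbt transformations against small fixture datasets.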
Jun 26, 2025
Full time
Company When it comes to innovation and achievement there are few organisations with a better track record. Join us and you'll be able to play a big part in the success of our highly successful, fast-paced business that opens up Europe so that people can exercise their get-up-and-go. With almost 300 aircraft flying over 1,000 routes to more than 32 countries, we're the UK's largest airline, the fourth largest in Europe and the tenth largest in the world. Set to fly more than 90 million passengers this year, we employ over 10,000 people. Its big-scale stuff and we're still growing. Job Purpose With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use data in a fast paced organisation. You will join as a Senior Platform Data Engineer providing technical leadership to the Data Engineering team. You will work closely with our Data Scientists and business stakeholders to ensure value is delivered through our solutions. Job Accountabilities Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala. Work with data scientists, machine learning engineers and DevOps engineers to develop, develop and deploy machine learning models and algorithms aimed at addressing specific business challenges and opportunities. Coach and mentor the team (including contractors) to improve development standards. Work with Business Analysts to deliver against requirements and realise business benefits. Build a documentation library and data catalogue for developed code/products. Oversight of project deliverables and code quality going into each release. 
Key Skills Required Technical Ability: has a high level of current, technical competence in relevant technologies, and is able to independently learn new technologies and techniques as our stack changes. Clear communication: can communicate effectively in both written and verbal forms with technical and non-technical audiences alike. Complex problem-solving ability: structured, organised, process-driven and outcome-oriented. Able to use historical experiences to help with future innovations. Passionate about data: enjoys being hands-on and learning about new technologies, particularly in the data field. Self-directed and independent: able to take general guidance and the overarching data strategy and identify practical steps to take. Technical Skills Required Significant experience designing and building data solutions on a cloud-based, big data distributed system. Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD), and software deployment automation with GitHub Actions or Azure DevOps. Experience in testing automation of data transformation pipelines, using frameworks like Pytest or dbt Unit Test. Comfortable writing efficient SQL and debugging. Data warehouse operations and tuning experience in schema evolution, indexing, partitioning. Hands-on IaC development experience with Terraform or CloudFormation. Understanding of ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez, and data drift detection and alerting. 
Understanding of Data Management principles (security and data privacy) and how they can be applied to Data Engineering processes/solutions (e.g. access management, data privacy, handling of sensitive data (e.g. GDPR)). Desirable Skills Experience in event-driven architecture, ingesting data in real time in a commercial production environment with Spark Streaming, Kafka, DLT or Beam. Understanding of the challenges faced in the design and development of a streaming data pipeline and the different options for processing unbounded data (pub/sub, message queues, event streaming etc). Understanding of the most commonly used Data Science and Machine Learning models, libraries and frameworks. Knowledge of the development lifecycle of analytical solutions using visualisation tools (e.g. Tableau, PowerBI, ThoughtSpot). Hands-on development experience in an airline, e-commerce or retail industry. Worked within the AWS cloud ecosystem. Experience of building a data transformation framework with dbt. What you'll get in return Competitive base salary Up to 20% bonus 25 days holiday BAYE, SAYE & Performance share schemes 7% pension Life Insurance Work Away Scheme Flexible benefits package Excellent staff travel benefits About easyJet At easyJet our aim is to make low-cost travel easy - connecting people to what they value using Europe's best airline network, great value fares, and friendly service. It takes a real team effort to carry over 90 million passengers a year across 35 countries. Whether you're working as part of our front-line operations or in our corporate functions, you'll find people that are positive, inclusive, ready to take on a challenge, and that have your back. We call that our 'Orange Spirit', and we hope you'll share that too. Apply Complete your application on our careers site. We encourage individuality, empower our people to seize the initiative, and never stop learning. 
Intec Select Ltd
Senior Software Engineer Technical Lead
Intec Select Ltd City, Wolverhampton
Senior Software Engineer Technical Lead A leading Bank is hiring a Senior Software Engineer / Technical Lead to drive the development and design of several greenfield retail banking platforms as our client rebuilds its brand to stay ahead of the competition. We are looking for a Senior Software Engineer Technical Lead with a background in Java, Kafka, and Azure who can provide technical leadership and contribute to the vision and strategy as our client continues through its modernisation campaign. Our client is paying a basic salary of £100,000 + 25% bonus + benefits to be based in Wolverhampton on a hybrid basis. Our client is seeking experienced engineers with recent retail / digital banking experience who can design new product roadmaps, focus on architectural challenges, and provide hands-on technical leadership to a team of engineers. Your responsibilities will include: Lead the development and implementation of a modern cloud foundation and data platform that is robust, scalable, fully automated, secure, and can support the growth of the business. Build Scalable Architectures: Leverage modern technologies to design and implement scalable, secure, and high-performing cloud-native solutions. Data Engineering and Analytics: Work closely with data teams to define robust data pipelines and scalable cloud-based data platforms using tools like Apache Kafka, Snowflake, or Databricks. Core skill set for this position: Strong experience building and scaling banking systems (Lending, Payments, or Mortgages) with a focus on security compliance and performance is a must. Experience leading on architectural challenges, system scalability, and guidance of engineering teams is a must. A background in Java, C#, Python, or React development with experience providing hands-on technical leadership is a must. 
New Product Ramping (approach to ramping up new products with less-experienced teams, providing clear strategies for facilitating MVP products in market and enabling teams to perform at scale) is a must. Digital transformation experience, moving from on-premise to modern cloud services using Azure, is a must. Benefits: £100,000 / 25% bonus / 28 days holiday / Holiday Purchase Scheme / Occasional travel / Health Insurance / 13% pension / plus much more. Senior Software Engineer Technical Lead
Mar 08, 2025
Full time
Intec Select Ltd
Senior Software Engineer Technical Lead
Intec Select Ltd Chatham, Kent
Senior Software Engineer Technical Lead Eligible To Provide Sponsorship A leading Bank is hiring a Senior Software Engineer / Technical Lead to drive the development and design of several greenfield retail banking platforms as our client rebuilds its brand to stay ahead of the competition. We are looking for a Senior Software Engineer Technical Lead with a background in Java, Kafka, and Azure who can provide technical leadership and contribute to the vision and strategy as our client continues through its modernisation campaign. Our client is paying a basic salary of £100,000 + 25% bonus + benefits to be based in Wolverhampton on a hybrid basis. Your responsibilities will include: Lead the development and implementation of a modern cloud foundation and data platform that is robust, scalable, fully automated, secure, and can support the growth of the business. Build Scalable Architectures: Leverage modern technologies to design and implement scalable, secure, and high-performing cloud-native solutions. Data Engineering and Analytics: Work closely with data teams to define robust data pipelines and scalable cloud-based data platforms using tools like Apache Kafka, Snowflake, or Databricks. Core skill set for this position: Strong experience building and scaling banking systems (Lending, Payments, or Mortgages) with a focus on security compliance and performance is a must. Experience leading on architectural challenges, system scalability, and guidance of engineering teams is a must. A background in Java, C#, Python, or React development with experience providing hands-on technical leadership is a must. New Product Ramping (approach to ramping up new products with less-experienced teams, providing clear strategies for facilitating MVP products in market and enabling teams to perform at scale) is a must. Digital transformation experience, moving from on-premise to modern cloud services using Azure, is a must. 
Benefits: £100,000 / 25% bonus / 28 days holiday / Holiday Purchase Scheme / Hybrid / Health Insurance / 13% pension / plus much more. Senior Software Engineer Technical Lead
Mar 08, 2025
Full time
Senior Data Engineer
Tbwa Chiat/Day Inc
Sand Technologies is a fast-growing enterprise AI company that solves real-world problems for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that have a real impact on the world. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark. Our data engineers create pipelines that support our data scientists and power our front-end applications. This means we do data-intensive work for both OLTP and OLAP use cases. JOB SUMMARY A Senior Data Engineer has the primary role of designing, building, and maintaining scalable data pipelines and infrastructure to support data-intensive applications and analytics solutions. In this role, you will be responsible for developing data pipelines, designing data architectures, and overseeing data engineering projects. RESPONSIBILITIES Lead the design, implementation, and maintenance of scalable data pipelines for ingesting, processing, and transforming large volumes of data. Architect scalable and efficient data solutions using appropriate architecture design. Design and optimize data models and schemas for efficient storage, retrieval, and analysis of structured and unstructured data. Develop, optimize and automate ETL workflows to extract data from diverse sources. Utilize big data technologies such as Spark, Kafka, and Flink for distributed data processing and analytics. Deploy and manage data solutions on cloud platforms such as AWS, Azure, or GCP. Implement and oversee data governance, quality, and security measures. Monitor data pipelines and infrastructure performance, identify bottlenecks and optimize for scalability. Build and maintain basic CI/CD pipelines and deploy data solutions. 
Collaborate with cross-functional teams to understand requirements and deliver data-driven solutions. Create and maintain technical documentation for data solutions. Stay current with emerging technologies and best practices in data engineering. Mentor and guide junior and mid-level data engineers. Evaluate and recommend technologies that best suit project requirements. Optimize software performance and efficiency through architectural design decisions. QUALIFICATIONS Proven experience as a Senior Data Engineer or in a similar role. Proven experience working with Big Data and tools used to process Big Data. Strong problem-solving and analytical skills. Excellent understanding of data engineering principles and practices. Excellent communication and collaboration skills. Ability to write clean, scalable code using Python or similar programming languages. Knowledge of data governance frameworks and practices. Understanding of machine learning workflows. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL. Strong understanding of data modelling concepts. Experience in big data technologies such as Databricks, Spark, Kafka, and Flink. Experience with modern data architectures, such as lakehouse. Experience with CI/CD pipelines and containerization (e.g., Docker). Experience with ETL tools and technologies. Strong understanding of data governance and best practices. Experience with cloud platforms such as AWS, Azure, or GCP. Would you like to join us as we work hard, have fun and make history? Apply for this job
Feb 20, 2025
Full time
Senior Azure Data Engineer (Databricks)
Capco
Senior Azure Data Engineer (Databricks) Joining Capco means joining an organisation that is committed to an inclusive working environment where you're encouraged to . We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. Why Join Capco? Capco is a global technology and business consultancy, focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry. You will work on engaging projects with some of the largest banks in the world, on projects that will transform the financial services industry. We are/have: Experts across the Capital Markets, Insurance, Payments, Retail Banking and Wealth & Asset Management domains. Deep knowledge in various financial services offerings including Finance, Risk and Compliance, Financial Crime, Core Banking etc. Committed to growing our business and hiring the best talent to help us get there. Focused on maintaining our nimble, agile, and entrepreneurial culture. We offer: A work culture focused on innovation and building lasting value for our clients and employees. Ongoing learning opportunities to help you acquire new skills or deepen existing expertise. A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients. A diverse, inclusive, meritocratic culture. Enhanced and competitive family friendly benefits, including maternity / adoption / shared parental leave and paid leave for sickness, pregnancy loss, fertility treatment, menopause, and bereavement. Data Engineering at Capco You will work on engaging projects with some of the largest banks in the world, on projects that will transform the financial services industry. You'll be part of a digital engineering team that develop new and enhance existing financial and data solutions. You'll be involved in digital and data transformation processes through a continuous delivery model. 
You will work on automating and optimising data engineering processes, and develop robust, fault-tolerant data solutions for both cloud and on-premise deployments. You'll be able to work across different data, cloud and messaging technology stacks. You'll have an opportunity to learn and work with specialised data and cloud technologies to widen your skill set. A day in the life of a Data Engineer at Capco Working alongside clients to interpret requirements and define industry-leading solutions. Designing and developing robust, well tested data pipelines. Demonstrating and helping clients adhere to best practices in engineering and SDLC. Building event-driven, loosely coupled distributed applications. Developing both on-premise and cloud-based solutions. Supporting internal Capco capabilities by sharing insight, experience and credentials. About you Capco is looking for hardworking, innovative, and creative people to join our Digital Engineering team. We'd also like to see: Practical experience of engineering best practices, while being obsessed with continuous improvement. Deep technical knowledge of two or more technologies and a curiosity for learning other parts of the stack. Experience delivering software/technology projects leveraging Agile methodologies. You have personally made valuable contributions to products, solutions and teams and can articulate the value to customers. You have played a role in the delivery of critical business applications and ideally customer facing applications. You can communicate complex ideas to non-experts with eloquence and confidence. You bring an awareness and understanding of new technologies being used in finance and other industries and love to experiment. A passion for being part of the engineering team that is forming the future of finance. Skills & Expertise: You will have experience working with some of the following Methodologies/Technologies. 
Excellent experience in the Data Engineering Lifecycle: you will have created data pipelines which take data through all layers from generation, ingestion, transformation and serving. Experience of modern Software Engineering principles and experience of creating well tested, clean applications. Enthusiasm and ability to pick up new technologies as needed to solve problems. Hands-on working experience of the Databricks platform; must have experience of delivering projects which use Delta Lake, Orchestration, Unity Catalog, Spark Structured Streaming on Databricks. Experience with Data Lakehouse architecture and data warehousing principles, experience with Data Modelling, Schema design and using semi-structured and structured data. Extensive experience using Python, PySpark and the Python ecosystem with good exposure to Python libraries. Proficient in SQL. Experience developing in other languages e.g. Scala/Java. Experience with Big Data technologies and Distributed Systems such as Hadoop, HDFS, HIVE, Spark, Databricks, Cloudera. Experience developing near real time event streaming pipelines with tools such as Kafka, Spark Streaming, Azure Event Hubs. Good understanding of the differences and trade-offs between SQL and NoSQL, ETL and ELT. Proven experience in DevOps, building robust production data pipelines and CI/CD pipelines on e.g. Azure DevOps, Jenkins, CircleCI, GitHub Actions etc. Exposure to working with PII and Sensitive Data, and understands data regulations such as GDPR.
Feb 18, 2025
Full time
Intec Select Ltd
Senior Software Engineer Technical Lead
Intec Select Ltd
Senior Software Engineer Technical Lead A leading Bank is hiring a Senior Software Engineer / Technical Lead to drive the development and design of several greenfield retail banking platforms as our client rebuilds its brand to stay ahead of the competition. We are looking for a Senior Engineer with a background in Java, Kafka, and Azure who can provide technical leadership and contribute to the vision and strategy as our client continues through its modernisation campaign. Our client is paying a basic salary of £100,000 + 25% bonus + benefits to be based in London with occasional travel to Kent. Our client is seeking experienced engineers with recent retail / digital banking experience who can design new product roadmaps, focus on architectural challenges, and provide hands-on technical leadership to a team of engineers. Your responsibilities will include: Lead the development and implementation of a modern cloud foundation and data platform that is robust, scalable, fully automated, secure, and can support the growth of the business. Build Scalable Architectures: Leverage modern technologies to design and implement scalable, secure, and high-performing cloud-native solutions. API Development and Integration: Design and build secure RESTful and GraphQL APIs, ensuring seamless integration with core banking systems (e.g., Mambu) and external services like Open Banking platforms. Data Engineering and Analytics: Work closely with data teams to define robust data pipelines and scalable cloud-based data platforms using tools like Apache Kafka, Snowflake, or Databricks. Monitoring and Performance Tuning: Implement advanced monitoring and observability solutions using tools like Prometheus, Grafana, or Datadog to proactively identify and resolve performance bottlenecks. Code and System Optimisation: Proactively analyse and optimise existing systems for improved performance, scalability, and maintainability. 
Core skill set for this position:
• Strong experience building and scaling banking systems (Lending, Payments, or Mortgages) with a focus on security, compliance, and performance is a must.
• Experience leading on architectural challenges, system scalability, and guidance of engineering teams is a must.
• A background in Java, C#, Python, or React development with experience providing hands-on technical leadership is a must.
• New Product Ramping (an approach to ramping up new products with less-experienced teams, providing clear strategies for bringing MVP products to market and enabling teams to perform at scale) is a must.
• Digital transformation experience, moving from on-premise to modern cloud services using Azure, is a must.
Benefits: £100,000 / 25% bonus / 28 days holiday / Holiday Purchase Scheme / Occasional travel / Health Insurance / 13% pension / plus much more.
Feb 18, 2025
Full time
Xcede Recruitment Solutions
Senior GenAI Engineer
AI Engineer - Up to £95k + Bonus
Location: London - 3 days a week in office
My client is a leading financial services company that is growing their GenAI team. They are looking for an AI Engineer who will be an individual technical contributor helping to build their NLP projects, focusing on large language models. This role covers the full life cycle, from proof of concept to fine-tuning and deployment.
Responsibilities:
• Identify where GenAI can be used to support the business and deliver measurable advantages.
• Collaborate with the Data Science and AI team to implement scalable AI applications.
• Improve and optimize the infrastructure by enhancing MLOps capabilities, CI/CD pipelines, user interfaces, and AI Python libraries.
• Develop effective prompts for AI models while fine-tuning them.
• Conduct thorough evaluations of third-party GenAI and LLM technologies.
• Take ownership of projects from initial research and development all the way through production deployment.
• Write production-ready code.
What You Need to Succeed:
• A minimum of 3 years working as a Data Scientist or Machine Learning Engineer on machine learning projects.
• At least 2 years' experience using LLMs.
• Strong background in Python.
• Exposure to MLOps, including CI/CD, Docker, and Kubernetes.
• Familiarity with software best practices.
• Experience working with non-technical stakeholders.
• Confident using AWS.
• Bonus if you have used Kafka, Databricks, and RAG.
Unfortunately, sponsorship is not provided for this role. If you are interested, please apply here or reach out to .
Feb 17, 2025
Full time
Data Scientist
InvestCloud, Inc.
IC London England - 77 Shaftesbury Ave, Soho, 5th Floor, London, LND W1D 5DU, GBR
About InvestCloud
InvestCloud is at the forefront of wealth technology, offering innovative solutions that redefine how the financial services industry operates. With a global presence and a client-first approach, we specialize in digital transformations powered by our flexible, modular technology.
About the Team
You will be joining the newly formed AI, Data & Analytics team as a Data Scientist, leading various projects within the small AI team. The new team is focused on driving increased value from the data InvestCloud captures to enable a smarter financial future for our clients, in particular focused on "enhanced intelligence". Ensuring we have fit-for-purpose modern capabilities is a key goal for the team. We are seeking a Senior Data Scientist / Machine Learning Engineer with a background in Data Science, Machine Learning, and Generative AI models. The ideal candidate should have a proven track record of delivering business impact and delighting clients by developing and deploying ML and AI models in production, along with excellent problem-solving skills. In this role you will integrate AI and ML solutions into the InvestCloud product suite.
Key Responsibilities
• Implement applications powered by Generative AI and Machine Learning models and deploy them in production
• Develop and maintain datasets and data pipelines to support Machine Learning model training and deployment
• Interpret results from Machine Learning models and communicate findings to both technical and non-technical stakeholders
• Stay updated with the latest advancements in Machine Learning, natural language processing, and generative AI
• Analyse large datasets to identify patterns, trends, and insights that can inform business decisions
• Work with 3rd-party providers of AI products to evaluate and implement solutions achieving InvestCloud's business objectives
Required Skills
• MSc degree in Mathematics, Statistics, Computer Science, Data Science, Machine Learning, or a related technical field, or equivalent practical experience
• At least four years of professional experience in Data Science, Machine Learning and AI
• Outstanding communication skills in English
• Proficiency in programming in Python
• Knowledge of Machine Learning frameworks (e.g. Scikit-learn) and LLM frameworks (e.g. LangChain)
• Knowledge of data preprocessing, feature engineering and model evaluation metrics
• Experience using large language models, generative AI and agentic frameworks
• Experience working with Snowflake and/or Databricks or similar tools
• Working experience developing and deploying Machine Learning models in production
• Working experience with Git and Docker
• Working proficiency in English
• Strong communication skills to engage with non-technical stakeholders
• Ability to work in a fast-paced environment across multiple projects simultaneously
• Ability to collaborate effectively as a team player, fostering a culture of open communication and mutual respect
Preferred Skills
• Working experience with vector database technologies
• Experience with cloud platforms such as AWS, GCP, or Azure
What do we offer
Join our diverse and international cross-functional team, comprising data scientists, product managers, business analysts and software engineers. As a key member of our team, you will have the opportunity to implement cutting-edge technology to create a next-generation advisor and client experience.
Location and Travel
The ideal candidate will be expected to work from the office on a regular basis (3 days minimum per week). Occasional travel may be required.
Compensation
The salary range will be determined based on experience, skills, and geographic location. Please note Visa sponsorship is not available for this role.
Equal Opportunity Employer InvestCloud is committed to fostering an inclusive workplace and welcomes applicants from all backgrounds.
Feb 08, 2025
Full time
Harnham
DATA ENGINEERING MANAGER
DATA ENGINEERING MANAGER
LONDON - HYBRID, 2 DAYS A WEEK IN OFFICE
UP TO £75K + BONUS (10-20%)
Harnham are hiring a Data Engineering Manager for a large insurance company based in London. This role involves leading a group of 5 or more Data Engineers through a significant data transformation. This is a great opportunity for a Data Leader to get significant exposure to Senior Stakeholders in this business.
THE COMPANY: This role is for a large insurance company in London that is going through a significant digital transformation, which means they now need someone to lead one of their Data Engineering teams. The company has significant growth plans for the next 2 years, so this is a great time to join the team.
THE ROLE: As a Data Engineering Manager, you will help lead a group of Data Engineers to implement the long-term data strategy. The role will consist of 80% managing 5 Data Engineers and 20% hands-on coding. The teams at this company are cross-functional, so you will be working alongside Software Engineers and Data Scientists. In this role, you will report directly to the Head of Engineering.
YOUR SKILLS AND EXPERIENCE: The ideal Data Engineering Manager will have experience in:
• Leading a team of Data Engineers to take a project from inception to deployment
• Strong communication and stakeholder management skills
• Implementation of a Snowflake data warehouse
• Commercial experience with Databricks
• Querying datasets using SQL
• Excellent Python programming and coding experience
• Working in an Azure cloud environment
HOW TO APPLY: Please register your interest by sending your CV to Ewan Heyworth via the apply link on this page.
Dec 07, 2022
Full time
Confidential
Senior Cloud Data Security Engineer - Azure
Job Profile Summary
bp are looking for a Senior Cloud Data Security Engineer with a strong background in Azure, with experience in designing, engineering, and deploying security across Big Data cloud platforms. Keep reading if you are an engineer with a passion for Big Data and hands-on security development experience, as you would be a great fit.
WHAT YOU'LL BE DOING
· You'll have direct responsibility to design, engineer and deploy security features and processes responding to data platform development demands.
· You'll be leading the ongoing development of Security Operations within the data services, drafting procedures and runbooks.
· You'll be joining bp's Digital Security function, providing security engineering capability supporting products and services across the bp business.
· You'll be working closely with product & service owners and their teams who operate bp's data platforms.
· You'll lead threat modelling, engineering planning workshops and security design reviews with engineering teams, providing subject matter expertise in resolving complex security problems.
· You'll work in an agile environment, mastering continuous improvement deliveries.
· As a senior team member, you will have the opportunity to mentor others and draft their learning paths.
WHO ARE WE LOOKING FOR?
· Excellent cloud / data lake / data platform and associated technologies knowledge.
· Experience working in digital security across cloud platforms, with a specific emphasis on big data / data analytics platforms.
· Strong demonstrated 'hands-on' technical proficiency (see 'Technical competencies').
· Good grasp of the Software Development Lifecycle (SDLC)
· Experience working across information lifecycle management
· A good understanding and application of technology to support the regulatory environment for information management - particularly related to personal and high-value data.
· A strong communicator, influencer and team player.
· A passion for continuous learning, professional development and knowledge sharing.
This role would suit a mid-level cloud data security engineer looking to advance into a senior position, taking responsibility for the team's deliverables and performance.
TECHNICAL COMPETENCIES
A range of core technical proficiency / knowledge across the following domains:
GENERAL
· Azure DevOps Pipelines or GitHub Actions
· Infrastructure as Code tools
PROGRAMMING
· .NET, Java, Scala
· Python 3
· PowerShell / Bash
TOOLS
· ETL tools (Azure Data Factory)
· Logging capabilities (Log Analytics, App Insights)
· Programming hosts (Azure Functions, Azure Batch, AKS, VMSS)
· Data (Databricks, Synapse, SQL DB, ADX, Cosmos DB, ADLS)
· Messaging (Event Hub, Event Grid, Service Bus, Azure Queues)
· ARM, Bicep
· AAD
PROFICIENCY / KNOWLEDGE ACROSS SOME OF THE FOLLOWING DOMAINS WOULD ALSO BE PREFERABLE:
· Azure Databricks
· Microsoft / Azure Certified
· CISSP, CCSP, CIPT
At bp, we provide a phenomenal environment and benefits such as an open and inclusive culture, a great work-life balance, tremendous learning and development opportunities to craft your career path, life and health insurance, a medical care package and many others! Diversity sits at the heart of our company and as an equal opportunity employer, we stay true to our mission by ensuring that our place can be anyone's place. We do not discriminate based on race, religion, color, national origin, gender and gender identity, sexual orientation, age, marital status, veteran status or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application and interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Reinvent your career as you help our business to meet the challenges of the future. Apply now!
#bpInformationSecurity
Entity: Innovation & Engineering
Job Family Group: IT&S Group
Relocation available: No
Travel required: Negligible travel
Time Type: Full time
Country: United Kingdom
About bp - Innovation & Engineering
Join us in creating, growing, and delivering innovation at pace, enabling us to thrive while transitioning to a net zero world. All without compromising our operational risk management. Working with us, you can do this by:
• deploying our integrated capability and standards in service of our net zero and safety ambitions
• driving our digital transformation and pioneering new business models
• collaborating to deliver competitive customer-focused energy solutions
• originating, scaling and commercialising innovative ideas, and creating ground-breaking new businesses from them
• protecting us by assuring management of our greatest physical and digital risks
Because together we are:
• Originators, builders, guardians and disruptors
• Engineers, technologists, scientists and entrepreneurs
• Empathetic, curious, creative and inclusive
Dec 07, 2021
Full time
Confidential
Senior Cloud Data Security Engineer - AWS
Job Profile Summary
bp are looking for a Senior Cloud Data Security Engineer with a strong background in AWS, with experience in designing, engineering, and deploying security across Big Data cloud platforms. Keep reading if you are an engineer with a passion for Big Data and hands-on security development experience, as you would be a great fit.
WHAT YOU'LL BE DOING
· You'll have direct responsibility to design, engineer and deploy security features and processes responding to data platform development demands.
· You'll be leading the ongoing development of Security Operations within the data services, drafting procedures and runbooks.
· You'll be joining bp's Digital Security function, providing security engineering capability supporting products and services across the bp business.
· You'll be working closely with product & service owners and their teams who operate bp's data platforms.
· You'll lead threat modelling, engineering planning workshops and security design reviews with engineering teams, providing subject matter expertise in resolving complex security problems.
· You'll work in an agile environment, mastering continuous improvement deliveries.
· As a senior team member, you will have the opportunity to mentor others and draft their learning paths.
WHO ARE WE LOOKING FOR?
· Excellent cloud / data lake / data platform and associated technologies knowledge.
· Experience working in digital security across cloud platforms, with a specific emphasis on big data / data analytics platforms.
· Strong demonstrated 'hands-on' technical proficiency (see 'Technical competencies').
· Good grasp of the Software Development Lifecycle (SDLC)
· Experience working across information lifecycle management
· A good understanding and application of technology to support the regulatory environment for information management - particularly related to personal and high-value data.
· A strong communicator, influencer and team player.
· A passion for continuous learning, professional development and knowledge sharing.
This role would suit a mid-level cloud data security engineer looking to advance into a senior position, taking responsibility for the team's deliverables and performance.
TECHNICAL COMPETENCIES
A range of core technical proficiency / knowledge across the following domains:
GENERAL
· Azure DevOps Pipelines or GitHub Actions
· Infrastructure as Code tools
PROGRAMMING
· .NET, Java, Scala
· Python 3
· PowerShell / Bash
TOOLS
· ETL tools (AWS Glue, Apache Airflow)
· Logging capabilities (CloudTrail, CloudWatch)
· Programming hosts (Lambda, EC2, EKS)
· Data (Databricks, EMR, Redshift, DynamoDB, RDS, S3, Athena, Lake Formation)
· Messaging (Kafka, SNS, SQS, ActiveMQ)
· CloudFormation, Terraform
· IAM, Config, SSM, Security Hub
· Step Functions
PROFICIENCY / KNOWLEDGE ACROSS SOME OF THE FOLLOWING DOMAINS WOULD ALSO BE PREFERABLE:
· AWS Databricks
· Microsoft / AWS Certified
· CISSP, CCSP, CIPT
At bp, we provide a phenomenal environment and benefits such as an open and inclusive culture, a great work-life balance, tremendous learning and development opportunities to craft your career path, life and health insurance, a medical care package and many others! Diversity sits at the heart of our company and as an equal opportunity employer, we stay true to our mission by ensuring that our place can be anyone's place. We do not discriminate based on race, religion, color, national origin, gender and gender identity, sexual orientation, age, marital status, veteran status or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application and interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Reinvent your career as you help our business to meet the challenges of the future. Apply now!
#bpInformationSecurity

Entity: Innovation & Engineering
Job Family Group: IT&S Group
Relocation available: No
Travel required: Negligible travel
Time Type: Full time
Country: United Kingdom

ABOUT BP: INNOVATION & ENGINEERING
Join us in creating, growing, and delivering innovation at pace, enabling us to thrive while transitioning to a net zero world, all without compromising our operational risk management. Working with us, you can do this by:
• deploying our integrated capability and standards in service of our net zero and safety ambitions
• driving our digital transformation and pioneering new business models
• collaborating to deliver competitive customer-focused energy solutions
• originating, scaling and commercialising innovative ideas, and creating ground-breaking new businesses from them
• protecting us by assuring management of our greatest physical and digital risks
Because together we are:
• Originators, builders, guardians and disruptors
• Engineers, technologists, scientists and entrepreneurs
• Empathetic, curious, creative and inclusive
Dec 05, 2021
Full time
Pertemps
Lead Data Scientist
Pertemps Reading, Berkshire
Who are we?
Thames Water is the UK's largest water and wastewater company. We make a daily difference to our 15 million customers by supplying 2.6 billion litres of water through 32,000 km of pipes, keeping taps flowing and toilets flushing. We are an essential service and have operated throughout the Covid-19 pandemic. At Thames Water, every one of our actions, big and small, matters every day. Water is essential to daily life, and that means our business is always open.

What you will be doing
Data Science Innovation delivers ground-breaking algorithms, digital twins and modelling to solve business and industry problems, and is the centre of excellence for data science in the organisation. Leading a team of data scientists, the successful candidate will join this data science innovation function, working alongside data engineers, delivery leads and other disciplines on leading-edge platforms and tooling. They will work on stimulating technical challenges, with data stored in a secure and documented manner to protect customer data and company reputation. Our ML ops process is considered ahead of the curve in the water space and will offer growing challenges at all levels. The benefits realisation will support the transformation to a digital-first, data-driven business. A key responsibility of this role is leading data scientists in exploring innovative applications of data science to create business opportunities. You will then take successful algorithms and models and collaborate with our product teams to scale results into production, directly into the hands of our front-line staff. The right candidate will help us build our data science capability across the organisation by supporting our strategic and technical direction, governance, and recruitment pipelines. They will also work with external partners and colleagues across the industry to open the door to future collaborative innovation.
As a leader, the successful candidate will develop and foster a close-knit and skilled team. This includes building individual development roadmaps for each team member, identifying external resource requirements and negotiating resourcing in the most cost-effective way. They will also advocate for data science across departments and teams, bringing awareness of data science opportunities and capabilities to those we work with, to fill the innovation pipeline of the future.

As Lead Data Scientist, your responsibilities will include:
Leading multiple projects across the business
Directly and indirectly growing and managing Data Science teams
Negotiating with partners and monitoring the technical aspects of Data Science work
Assisting in the design of system architecture, data engineering and the presentation layer
Owning and communicating the Data Science strategy for the organisation

To be successful you will have the following skills and experience:
Proven experience in leading and growing data science teams
Very strong academics, ideally a Master's in a highly quantitative degree with research projects
Strong commercial experience as a data scientist working on a wide variety of projects
Experience in supervising machine learning and data engineering projects
Prior creation and implementation of best practices for statistical data modelling, analysis and machine learning; data exploration; and processes that allow the data team to work efficiently
Evidence of strategic thinking and a vision for the use of data in a commercial setting
Expertise in Python and in open-source frameworks and machine learning packages (e.g. scikit-learn, pandas, SciPy)
Awareness of other data science languages (R, Scala)
A good understanding of Machine Learning and Deep Learning techniques and graph theory
Strong stakeholder management and engagement skills
Strong communication skills tailored to the audience (exec team, senior stakeholders, engineers)
Experience working with large amounts of raw data: preparing, cleansing and processing
Experience with NLP / Text Mining / Sentiment Analysis
Experience with collaborative software development or Agile delivery methodologies
Preferably accustomed to local and Azure cloud environments and their data services (Azure Data Factory, Azure Data Lake Storage, VMs, Databricks)

What's in it for you?
Our competitive salary and package includes a generous bonus scheme, car allowance, an excellent contributory pension, 26 days' holiday per year increasing to 30 with length of service, and a wider benefits scheme. We're proud of the positive ways of working we have adopted during the pandemic. We want to create a more flexible and dynamic environment so all our colleagues can thrive. For our office-based roles we are moving to a hybrid approach where we will provide options around working from our offices, our operational sites and home, dependent on role/team/individual. This will be discussed during the assessment process. Thames Water is a unique, rewarding and diverse place to work. If you join our team, you'll enjoy fast-tracked career opportunities, flexible working arrangements and unparalleled benefits. We're also proud to be an equal opportunity employer, a Stonewall Diversity Champion and a Disability Confident Leader, and we are a Times Top 50 Employer for Women. You can find out more in the working for us section of our website.
Dec 03, 2021
Full time
