• Home
  • Search Jobs
  • Register CV
  • Post a Job
  • Employer Pricing
  • Contact Us
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

13 jobs found

Current search: data engineer snowflake sql python
DataOps Engineer - Data Science Operations
Castleton Commodities International, LLC
Location: London, UK | Time type: Full time | Posted: Yesterday | Job requisition ID: R1227

At Castleton Commodities International (CCI), we are redefining how data and technology shape the future of energy trading. Our Data Science & Technology team is at the forefront of this transformation, developing systems and innovative tools that empower our front-office teams to better understand market dynamics, forecast prices, and manage risk. Our Data Science Operations team is integral to our mission, ensuring the seamless ingestion and management of data across our platforms. The DataOps Engineer will assist with all aspects of data, from data architecture design to ongoing data management, and will have significant exposure to our commercial investing teams globally. This position will play an integral role as the firm continues to expand its use of data for advanced analytics and other commercial purposes.

Responsibilities:
• Execute data architecture and data management projects for existing data sources, ensuring alignment with business and technical requirements.
• Design, enhance, and maintain market data platforms using Python, optimizing for scalability and performance.
• Manage the end-to-end data ingestion process, including extraction, transformation, loading (ETL), and data publishing for investment and commercial teams.
• Own and continuously improve the process of mapping, standardizing, and normalizing fundamental analytics data to support consistent usage across business functions.
• Implement automated workflows for error handling and develop data quality analysis to proactively identify and address systemic issues.
• Prioritize and resolve critical market data issues based on business impact and user-reported concerns.
• Coordinate with technology and business stakeholders to align goals, timelines, and deliverables for strategic big data initiatives.
• Serve as a liaison with commercial (trading) teams to translate their needs regarding data flow, architecture, and the investment process into functional data requirements.
• Provide operational support for market data platforms, handling performance tuning, incident response, and user assistance.
• Ensure platform stability and resilience by developing and maintaining operational runbooks, standard operating procedures (SOPs), and incident response protocols.

Qualifications:
• Bachelor's degree in Computer Science, Mathematics, Physics, Engineering, or a related field of study.
• 5+ years' experience in a data operations production environment, ideally in financial services or energy commodities.
• Proficiency in Python programming and its libraries (Pandas, NumPy, etc.).
• Experience with front-end technologies such as HTML, CSS, and JavaScript, and modern front-end frameworks (e.g., React, Angular), is a plus.
• Prior experience with the Snowflake columnar database and the ability to design and optimize complex SQL queries.
• Proficiency with version control systems such as Git to manage code repositories effectively.
• Strong analytical and problem-solving skills to tackle complex technical challenges.
• Proficiency in debugging and performance-optimization techniques.
• Understanding of the software development lifecycle, from requirements analysis to testing and deployment.
• Ability to work effectively in a fast-paced, dynamic, and high-intensity environment (including an open-floor plan where applicable to the position), with timely responsiveness and the ability to work beyond normal business hours when required.

Employee Programs & Benefits: CCI offers competitive benefits and programs to support our employees, their families, and local communities. These include:
• Competitive, comprehensive medical, dental, retirement, and life insurance benefits
• Employee assistance & wellness programs
• Parental and family leave policies
• CCI in the Community: each office has a Charity Committee, and as part of this program employees are allocated 2 days annually to volunteer at the selected charities
• Charitable contribution match program
• Tuition assistance & reimbursement
• Quarterly Innovation & Collaboration Awards
• Employee discount program, including access to fitness facilities
• Competitive paid time off
• Continued learning opportunities

Castleton Commodities International is a leading global energy commodities merchant and infrastructure asset investor. As a trader, CCI deploys capital on a proprietary basis in the physical and financial commodity markets, providing the Company with market insights and access. As a strategic investor and developer, CCI leverages its market expertise, operations capabilities, and industry knowledge to invest in, and develop, select commodity infrastructure assets. This fully integrated platform has generated strong risk-adjusted returns for our investors since our formation.
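The responsibilities above centre on ETL, symbol normalization, and automated data-quality checks. As a purely illustrative sketch of that shape of work (all names, data, and the sqlite stand-in warehouse are hypothetical, not taken from CCI's stack), a minimal Python pipeline might look like:

```python
import csv
import io
import sqlite3

# Hypothetical raw market-data feed; a real extraction step would pull from a vendor API.
RAW_FEED = """symbol,price,ts
brent , 82.41,2026-03-16
WTI,78.90,2026-03-16
Brent,82.55,2026-03-17
"""

def extract(feed: str) -> list[dict]:
    """Parse the raw CSV feed into row dicts."""
    return list(csv.DictReader(io.StringIO(feed)))

def transform(rows: list[dict]) -> list[tuple]:
    """Normalize symbols (trim whitespace, upper-case) and cast prices to float."""
    return [(r["symbol"].strip().upper(), float(r["price"]), r["ts"]) for r in rows]

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Load into a stand-in warehouse table and run a basic data-quality check."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE prices (symbol TEXT, price REAL, ts TEXT)")
    con.executemany("INSERT INTO prices VALUES (?, ?, ?)", rows)
    # Data-quality gate: flag non-positive prices for triage before publishing.
    bad = con.execute("SELECT COUNT(*) FROM prices WHERE price <= 0").fetchone()[0]
    assert bad == 0, f"{bad} rows failed the price sanity check"
    return con

con = load(transform(extract(RAW_FEED)))
print(con.execute("SELECT COUNT(DISTINCT symbol) FROM prices").fetchone()[0])  # → 2
```

The normalization step collapses `brent ` and `Brent` into one canonical symbol, which is the kind of mapping/standardizing work the posting describes; in practice the load target would be Snowflake rather than sqlite.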
Mar 16, 2026
Full time
VC Talent
Senior Data Engineer
VC Talent
A successful FTSE-listed organisation with a friendly, close-knit culture is looking for a Senior Data Engineer to help shape and deliver its evolving data platform. This is a hands-on, high-impact role in a smaller organisation where you will act as both a senior engineer and a strategic data partner, designing solutions that support both operational and long-term business goals. You will play an important role in the company's migration to Microsoft Fabric, while continuing to build and optimise its existing SQL Server-based data warehouse environment. While primarily a backend engineering role, you will also support reporting via SSRS/Microsoft Power BI, with an increasing focus on AI-driven capabilities over time. As the Senior Data Engineer, you will work closely with the Solutions Architect and collaborate with teams across the business, with occasional travel to France and Germany. This opportunity suits someone with in-depth SQL experience who is looking to work with modern tools like Microsoft Fabric.

Essential Skills:
• Strong SQL Server experience (T-SQL, SSIS, SSRS, stored procedures, functions, triggers)
• Data warehouse architecture, build, and maintenance
• Excellent communication skills and the ability to gather requirements from non-technical stakeholders at all levels
• Degree in Computer Science or a STEM subject
• Comfortable working 4 days per week onsite in Vauxhall

Desirable:
• Microsoft Fabric, Azure Synapse, Azure Data Factory, Snowflake, Databricks
• Python or any other relevant language
• Experience with tools such as Power BI, Qlik, or Tableau

The role offers a package of £64k+, plus a bonus of up to 15%, along with a generous pension, 28 days' holiday, private medical insurance, permanent health insurance, study support, and a modern office with an onsite gym. If your experience aligns, apply with an up-to-date CV as soon as possible, as this is expected to be a popular opportunity.
Mar 12, 2026
Full time
Rackspace
Solution Director (Analytics & AI/ML)
Rackspace
Rackspace Technology is a leading provider of expertise and managed services across all the major public and private cloud technologies. We've evolved Fanatical Support to encompass the entire customer journey, providing Fanatical Experience from first consultation to daily operations. Our passionate experts combine the power of proactive, always-on service and expertise with best-in-class tools and automation to deliver technology when and how our customers need it.

We are seeking a highly accomplished Solution Director (Analytics & AI/ML) to lead the design and sales of two critical solution portfolios: Generative AI/LLM solutions and Data modernization/Lakehouse architectures on AWS. This pivotal role requires mastery of both domains: leveraging generative AI capabilities (Amazon Q, Amazon Bedrock, QuickSight) to drive executive conversations and opportunity creation, while delivering enterprise data modernization through Lakehouse architectures using AWS-native services (Glue, SageMaker Unified Studio) and leading platforms (Databricks on AWS, Snowflake on AWS). This is a presales role that demands cross-functional experience, with a proven ability to engage C-level stakeholders, drive top-of-funnel opportunity creation, and maintain comprehensive account ownership across the entire customer lifecycle. The ideal candidate will excel at both selling the vision of generative AI transformation and delivering the reality of enterprise data modernization, combining deep technical expertise with exceptional business acumen and executive presence.

Responsibilities

Strategic Leadership & Opportunity Development
• Drive top-of-funnel opportunity creation through two parallel tracks: engaging C-level stakeholders with generative AI demonstrations (Amazon Q, Amazon Bedrock) and identifying data modernization needs for Lakehouse transformations.
• Lead the design and architecture of dual solution portfolios. Generative AI solutions: Amazon Bedrock implementations, Amazon Q deployments, QuickSight with Q capabilities, RAG architectures, and custom LLM solutions. Data modernization: enterprise Lakehouse architectures using AWS Glue, SageMaker Unified Studio, Databricks on AWS, and Snowflake on AWS.
• Act as the trusted advisor, positioning generative AI as the transformational vision while grounding delivery in robust data platform modernization.
• Develop compelling business cases that connect AI aspirations with practical data foundation requirements, demonstrating ROI across both portfolios.
• Stay current with advancements in generative AI (foundation models, LLMs) and modern data architectures (Lakehouse patterns, data mesh, unified analytics).
• Contribute to Rackspace's intellectual property through reference architectures covering both generative AI implementations and Lakehouse design patterns.
• Mentor and provide leadership to Solution Architects by guiding technical development and fostering skill growth across both generative AI and data modernization solution areas.

Customer Engagement & Solution Delivery
• Serve as the primary technical lead orchestrating both generative AI discussions and data modernization programs for strategic accounts.
• Build strategic relationships using two engagement models. Executive level: Amazon Q demonstrations, QuickSight analytics with generative BI, art-of-the-possible sessions. Technical level: Lakehouse architecture workshops, platform assessments (Databricks vs Snowflake vs AWS native), migration planning.
• Lead comprehensive consultative engagements that begin with the generative AI vision (Amazon Q, Bedrock) and translate it into concrete data modernization roadmaps.
• Develop proposals that balance innovative AI capabilities with foundational data platform requirements.
• Guide customers through parallel journeys: generative AI adoption (POCs to production) and data platform modernization (legacy to Lakehouse).
• Collaborate with sales teams to position both solution portfolios strategically based on customer maturity and needs.

Technical Excellence & Market Awareness
• Maintain deep expertise across both solution domains: 1) Generative AI: Amazon Bedrock, Amazon Q, QuickSight Q, SageMaker JumpStart, prompt engineering, RAG architectures, vector databases. 2) Data platforms: AWS Glue, SageMaker Unified Studio, Databricks on AWS, Snowflake on AWS, Redshift, EMR, Apache Iceberg, Delta Lake.
• Position AWS solutions effectively against other cloud platforms' offerings in both generative AI (Azure OpenAI, Vertex AI) and data platforms (Azure Synapse, BigQuery).
• Guide architectural decisions on build vs. buy for both AI capabilities and data platform components.

Required Experience
• Deep experience with generative AI technologies: Amazon Bedrock, Amazon Q, LLM architectures, RAG implementations.
• Proven track record delivering data modernization: Lakehouse architectures, Databricks and/or Snowflake implementations, AWS Glue/EMR deployments.
• A bachelor's degree in Computer Science, Data Science, Engineering, Mathematics, or a related technical field. At the manager's discretion, additional relevant experience may substitute for the degree requirement.
• A minimum of 15 years of enterprise solution architecture experience.
• A minimum of 8 years of public cloud experience.
• A minimum of 5 years as a senior-level architect or solutions leader with hands-on experience in both AI/ML and data platform modernization.
• Proven presales/sales engineering experience.
• Demonstrated success in engaging C-level executives using generative AI demonstrations while delivering complex data platform transformations.
• Strong understanding across the full spectrum. AI/ML: generative AI, foundation models, LLMs, traditional ML, prompt engineering, fine-tuning. Data platforms: Lakehouse architectures, data mesh, ETL/ELT, streaming, data governance, data quality.
• Proficiency in Python, SQL, and Spark, with hands-on experience in: Generative AI: LangChain, vector databases, embedding models. Data engineering: PySpark, Apache Iceberg/Delta Lake, orchestration tools.
• A proven ability to articulate both visionary AI possibilities and practical data platform requirements to diverse audiences.

Preferred Qualifications
• An advanced degree (Master's or PhD) in a relevant field.
• Experience with AWS professional services or the AWS partner ecosystem across both AI and data domains.
• Multiple Lakehouse platforms: Databricks, Snowflake, AWS native (Glue + Athena + Redshift).
• Multiple AI platforms: AWS Bedrock, Azure OpenAI, Google Vertex AI.
• Industry certifications: AWS Solutions Architect Professional, Machine Learning Specialty, Data Analytics Specialty; platform-specific: Databricks Certified, Snowflake SnowPro.
• Experience with regulated industries requiring governance for both AI and data platforms.
• A track record of building practices that deliver both generative AI solutions and data modernization programs.
• Published thought leadership in generative AI applications and/or modern data architectures.
Mar 11, 2026
Full time
Lorien
Tech Lead (Asset Management Data Platform)
Lorien City, London
Tech Lead - Asset Management Data Platform (Pricing & Quant Data)
Permanent | London (Hybrid - 3 days/week onsite) | Up to £140,000 + bonus + excellent benefits

Lorien is partnering with a leading UK investment and asset management organisation to hire a hands-on Tech Lead to help build and evolve a modern cloud data platform supporting front-office investment activity. This is a high-impact role in a London-based team that works closely with the business, shaping how critical pricing, hedging and investment datasets are produced, governed and shared. The role involves leading technical design and delivery from the front, mentoring engineers, and setting engineering standards (without formal line management). You'll be trusted to operate with high autonomy and drive delivery day-to-day.

The opportunity (what you'll be working on)
• Building and running a greenfield/early-stage Snowflake data platform that acts as a data hub for the asset management function
• Delivering high-quality, timely data for pricing workflows (ensuring the right data is available for pricing, and distributed correctly once prices are produced)
• Supporting asset & liability matching and hedging MI use-cases by providing trusted datasets to investment stakeholders and quant teams
• Modernising how data is served to consumers, including reworking APIs/data services to use more timely inputs and improve pricing accuracy
• Helping enable expansion of in-house capability to trade across multiple asset classes, by strengthening data foundations and delivery

What you'll do
• Lead technical design and implementation across data services, APIs, and platform components
• Set and uphold engineering standards (patterns, code quality, reviews, testing approach)
• Drive modern delivery practices: CI/CD, test automation, DevOps ways of working
• Ensure solutions are secure, scalable, and production-ready in a regulated environment
• Mentor and support engineers through pairing, coaching, workshops and code reviews
• Work directly with front-office / investment stakeholders to translate outcomes into technical delivery
• Operate independently, owning your technical backlog and delivery plan with minimal day-to-day oversight

What they're looking for (candidate profile)

Essential experience
• Strong hands-on technical leadership in a front-office investment/trading environment
• Background in Investment Banking, Asset Management, or Wealth Management, ideally where the organisation trades and manages its own funds
• Proven ability to deliver data-intensive platforms or services used by front-office stakeholders (trading, portfolio, risk, quants)
• Strong engineering capability across Python, SQL, Snowflake, and AWS (modern cloud patterns; serverless experience is a strong advantage)
• Comfortable owning solution design, making decisions, and driving delivery with high autonomy
• Excellent stakeholder communication skills (can influence, align, and keep delivery moving)

Desirable
• Experience with pricing / market data, ALM concepts, hedging MI, or data supplied to quant/model users
• Strong solution design skills (end-to-end data/service design)
• Experience working across multiple teams / scaled delivery environments
• Exposure to event-driven / serverless architectures and modern data engineering patterns

Why this role stands out
• Modern stack: Snowflake + AWS (serverless), building on a new platform with real scope to shape engineering direction
• Autonomy and trust-based culture: strong ownership, ability to implement what you think is best
• Front-and-centre London team build-out: a key hire helping establish and grow a new capability close to the business
• Meaningful domain challenges: pricing, hedging, investment datasets - real-world impact and complexity

Working pattern & package
• London hybrid: typically 3 days per week onsite
• Salary up to £140,000 (plus bonus)
• Strong benefits package (details shared during process)
• Open to flexible working discussions (including part-time/job share in principle)

Guidant, Carbon60, Lorien & SRG - The Impellam Group Portfolio are acting as an Employment Business in relation to this vacancy.
Mar 11, 2026
Full time
Manager - Data Science
Moorhouse
A total cash package of up to £110,000, comprising a base salary of £82,005.

We are a dynamic consulting firm, focused on delivering sustainable change. We ensure our clients succeed in their long-term goals by helping them turn their strategy into action through exceptional delivery and a commitment to establishing a culture of change. Clients like what we do and how we work, and we are looking for people to join the Moorhouse team. We pride ourselves on being proactive, collaborative, and straightforward team players. We work efficiently and collaboratively as a team, and both honesty and integrity are key to this. In return you will be part of a supportive and high-performing team that shares the workload, looks after each other and celebrates success together. You can be assured of an exciting opportunity that will help you grow your skills through meaningful challenges and equip you with skills, experience and knowledge that can help organisations respond to the turbulence, change and opportunity that will define the future of work. We encourage behaviours that promote transparency, collaboration and achievement of shared goals.

Data Science at Moorhouse
Our Data Science capability is a growing and strategically significant part of Moorhouse. We help organisations unlock the value of their data by combining deep technical expertise with the consulting skills needed to drive real-world change. Our team works across the full data and analytics lifecycle, from defining strategy and enabling data-driven cultures, to building advanced analytical models and delivering digital products that embed predictive insight into everyday decision-making. We partner with clients across multiple sectors, including healthcare, energy and utilities, life sciences, financial services and TMT. Much of our work today focuses on developing and deploying analytical tools and web-based products that deliver forecasting, optimisation, and automated insight. Our projects span machine learning, statistical modelling, data engineering, data visualisation and MLOps, always with a focus on driving tangible business outcomes. As demand for digital and AI-enabled transformation continues to grow, our Data Science team is expanding and contributing to some of the most impactful programmes across the firm.

Responsibilities

Why join the Data Science team at Moorhouse?
We are seeking a hands-on, technically focused Manager to join the Data Science function, who will lead the delivery of complex data science and digital product projects. While there will be opportunities to contribute to business growth and propositions, the key component of this role is strong technical delivery: architecting solutions, guiding teams, and ensuring the highest-quality output. This role complements another Manager in the team who is focused on proposition development and commercial growth. You will bring the deep engineering and delivery expertise necessary to ensure our solutions are robust, scalable and production-ready.

In this role, you will have the opportunity to:

Lead complex end-to-end delivery of data science and digital product solutions
- Lead the delivery of high-impact projects involving machine learning, forecasting, optimisation and data-driven digital products.
- Design and architect analytical solutions, including modelling approaches, data pipelines, integration patterns and deployment frameworks.
- Promote delivery excellence, engineering discipline and best-practice coding standards across teams.

Be the technical lead on DevOps/MLOps
- Implement and embed CI/CD pipelines, automated testing frameworks, containerisation and monitoring.
- Guide cloud architecture and infrastructure-as-code approaches (AWS, Azure, GCP).
- Encourage best practices for operationalising machine learning models at scale.

Lead multidisciplinary teams with confidence
- Coach, mentor and support junior team members, ensuring they deliver high-quality technical work.
- Foster engineering maturity and support the development of technical capability across the team.

Engage with senior stakeholders
- Translate technical detail into actionable insights and recommendations for clients.
- Build trusted relationships with leaders across multiple sectors.

Support commercial and proposition development
- Provide technical input into proposals and bids.
- Contribute to refining our Data Science and AI propositions.

Essential skills

What are we looking for?
We are seeking candidates with 6-8+ years of experience in data science, data engineering, analytics or AI product delivery, with a strong emphasis on hands-on technical leadership.

Deep Technical Delivery Expertise
- Advanced proficiency in Python and SQL; experience designing and delivering production-grade solutions.
- Operationalisation of machine learning models, forecasting tools, optimisation algorithms or analytical applications.
- Experience designing data pipelines, ETL workflows, APIs and data integration approaches.
- Automated testing (unit, integration, data validation, ML testing)
- Infrastructure-as-code (Terraform, ARM, CloudFormation)
- Monitoring and observability
- Experience with cloud ecosystems (Azure, AWS, GCP, Snowflake, Databricks).
- Strong understanding of software engineering fundamentals and modern development practices.

Leadership & Consulting Capability
- Proven ability to lead delivery of complex technical programmes.
- Strong people leadership and mentoring capabilities.
- Excellent communication skills and ability to engage senior stakeholders.
- Collaborative mindset and commitment to high-quality client delivery.

What we can offer you:
- A total cash package of up to £110,000, comprising a base salary of £82,005 and a combination of personal and company bonuses that are paid every six months.
- 25 days annual leave, increasing by one day for every full year of service to a maximum of 30 days, with the option to buy or sell up to five days of annual leave per year.
- Life Assurance, Private Medical Insurance, Group Personal Pension Scheme and a range of discounted lifestyle and well-being benefits through Perkbox.
- A culture where you will not need to compete with others because of promotion quotas or the typical distribution curves that govern performance management in other organisations. We recognise and reward performance consistently and transparently across the firm so that everyone knows where they stand.
- Flexible working arrangements, with our offices near Liverpool Street, although you can expect to spend some time as part of a team on UK client sites. We support flexibility wherever possible.

Moorhouse is proud to be an equal opportunities employer, and our values underpin a working environment that is inclusive for all those who work for us. We encourage people to bring their whole selves to work, contribute ideas, take the initiative and be responsible for their impact on others internally and externally.
Mar 08, 2026
Full time
Senior Sales Engineer
Ataccama
We are Ataccama, and we are on a mission to power a better future with data. Our product enables both technical and less technical 'data people' across their organizations to create high-quality, governed, safe, and reusable data products. It's what made us a Leader in the Gartner Magic Quadrant for Data Quality Solutions, and what inspired Bain Capital Tech Opportunities to invest in our future growth. Our vision is to be the leading AI-powered cloud data management company, and to do that, we're making Ataccama a great place to work and grow. Our people are located across the globe. They succeed by collaborating as a team and thrive in our company culture defined by these core values: Challenging Fun, ONE Team, Customer Centric, Candid and Caring, Aim High.

Senior Sales Engineer - Your Challenge
As a Senior Sales Engineer, you will work closely with Account Executives to drive sales engineering activities across the entire deal lifecycle. This role requires adaptability, deep technical expertise, and the ability to create customized solutions that resonate with clients across diverse industries.

- Sales Engineering Lifecycle Management: Oversee and execute all sales engineering activities throughout the deal lifecycle, from initial engagement to deal closure.
- Customized Client Solutions: Develop presentations and configure demonstrations to meet the specific needs of the audience, tailoring them to each prospect's industry and unique needs.
- Spearhead Proof of Concepts (POCs): Lead and execute POCs, demonstrating the platform's effectiveness in addressing client-specific business pains and customizing the platform to technical requirements.
- Advanced Client Engagement: Lead in-depth technical discussions with clients, effectively aligning Ataccama's solutions with their needs.
- AI Feature Champion: Actively showcase and articulate the value of Ataccama's expanding AI capabilities (e.g., for data quality, governance, and master data management) in all presentations and demonstrations.
- Industry and Product Expertise: Maintain a thorough understanding of relevant technologies, competitors, business cases and industry specifics to effectively align solutions.
- Collaboration with Account Executives: Work closely with Account Executives, providing technical insights and support to ensure a cohesive sales strategy within the given territory.
- Professional Development: Maintain a commitment to continuous learning and development, staying ahead of industry trends and Ataccama product advancements.

Is This You?
- 5+ years' experience in a client-facing technical role (Sales Engineering, Solutions Consulting, etc.)
- Experience working with Data Quality, Data Observability, Data Governance or Master Data Management tools
- Proficient in SQL, data pipelines, Databricks/Snowflake, and APIs; programming experience, ideally in Python, is preferred
- Proven success working with large enterprises with complex technical environments
- Experience leveraging Python for AI/ML prototyping or data science tasks is a strong plus
- Strong problem-solving and creative thinking skills

Perks & Benefits
- Long-Term Incentive Program
- 5 sick days and 25 days of vacation, with the option to request additional Enhanced Time-Off days when needed
- The Global Family Support Program - a paid leave program to help all parents focus on the new addition to their family
- Pension plan
- "Bring Your Friend" referral program
- Flexible working hours & hybrid work setup
- Health insurance provided by Vitality
- Online courses & company access to Udemy to hone your skills
- Conference tickets to the best industry events of the year
- Cycle to work scheme
- Work equipment: company laptop and company mobile phone

At Ataccama, our core values are Candid & Caring, so we are upfront about our process and details that are important to you. We sometimes use AI tools to help us with things like reviewing applications, taking notes from screening conversations, scheduling interviews, or supporting assessments. These tools make the process smoother and fairer - but don't worry, they never make the final decision. Every hiring decision is made by our Talent Acquisition Partners and Hiring Managers, with AI only acting as a helpful assistant. We believe technology should support the process, not replace the human touch. We currently use AI-assisted tools - Metaview for interview notes and Lever Talent Fit to help highlight key experience.

While we highly value cooperation with all our business partners, we don't accept unsolicited resumes from any sources other than directly from a candidate. We reserve the right not to pay any fee for sending an unsolicited offer containing the details or resume of a job candidate, even if the relevant candidate is employed by our company.
Mar 06, 2026
Full time
Data Science Manager
Huron Consulting Group Inc.
Data Science Manager page is loaded Data Science Managerremote type: Hybridlocations: Belfast - 20 Adelaide Streetposted on: Posted Todayjob requisition id: JR-Huron is a global consultancy that collaborates with clients to drive strategic growth, ignite innovation and navigate constant change. Through a combination of strategy, expertise and creativity, we help clients accelerate operational, digital and cultural transformation, enabling the change they need to own their future. Join our team as the expert you are now and create your future. Data Science Manager We're seeking a Data Science Manager to join the Data Science & Machine Learning team in our Commercial Digital practice, where you'll lead advanced analytics initiatives that transform how Fortune 500 companies make decisions across Financial Services, Manufacturing, Energy & Utilities, and other commercial industries.Managers play a vibrant, integral role at Huron. Their invaluable knowledge reflects in the projects they manage and the teams they lead. Known for building long-standing partnerships with clients, they collaborate with colleagues to solve their most important challenges. Our Managers also spend significant time mentoring junior staff on the engagement team-sharing expertise, feedback, and encouragement. This promotes a culture of respect, unity, collaboration, and personal achievement.This isn't a reporting role or a dashboard factory-you'll own the full analytics lifecycle from hypothesis formulation through insight delivery, while leading and developing a team of data scientists and analysts. You'll work on problems that matter: experimental designs that validate multi-million-dollar strategies, predictive models that surface hidden patterns in complex data, and deep learning pipelines that extract signal from unstructured text, images, and time-series. 
Our clients are Fortune 500 companies looking for partners who can find the signal in the noise and tell the story that drives action. The variety is real. In your first year, you might lead a customer segmentation and lifetime value analysis for a financial services firm, design and analyze a pricing experiment for a global manufacturer, and build an agentic anomaly detection system for a utility company's operational data, all while developing the next generation of data science talent at Huron. If you thrive on rigorous analysis, clear communication of complex findings, and building high-performing teams, this role is for you.

# What You'll Do

- Lead and mentor junior data scientists and analysts: provide technical guidance, review analytical approaches and code, and support professional development. Foster a culture of intellectual curiosity, rigorous methodology, and clear communication within the team.
- Manage complex multi-workstream analytics projects: oversee project planning, resource allocation, and delivery timelines. Ensure analyses meet quality standards and client expectations while maintaining methodological rigor.
- Design and execute end-to-end data science workflows: from problem framing and hypothesis development through exploratory analysis, modeling, validation, and insight delivery. Own the analytical approach and ensure conclusions are defensible.
- Lead development of both traditional statistical and modern AI-powered analyses: regression, classification, clustering, causal inference, A/B testing, and modern deep learning approaches using embeddings, transformer architectures, and foundation models for text, time-series, and multimodal analysis.
- Build predictive and prescriptive models that drive business decisions: customer segmentation, churn prediction, demand forecasting, pricing optimization, risk scoring, and operational efficiency analysis for commercial enterprises.
- Translate complex analytical findings into actionable insights: create compelling data narratives, develop executive-ready presentations, and communicate technical results to non-technical stakeholders in ways that drive decisions.
- Serve as a trusted advisor to clients: build long-standing partnerships, deeply understand business problems, formulate the right analytical questions, and deliver insights that create measurable value.
- Contribute to practice development: participate in business development activities, develop reusable analytical frameworks and methodologies, and help shape the technical direction of Huron's DSML capabilities.

# Required Qualifications

- 5+ years of hands-on experience conducting data science and advanced analytics: not just ad-hoc analysis, but structured analytical projects that drove business decisions. You've framed problems, developed hypotheses, analyzed data, and delivered insights that created measurable impact.
- Experience leading and developing technical teams, including coaching, mentorship, methodology review, and performance management. Demonstrated ability to build high-performing teams and develop junior talent.
- Strong Python and SQL programming skills with deep experience in the data science ecosystem (Pandas, NumPy, Scikit-learn, statsmodels, visualization libraries). Comfortable writing production-quality code, not just notebooks.
- Solid foundation in statistics and machine learning: hypothesis testing, regression analysis, classification, clustering, experimental design, causal inference, and an understanding of when different approaches are appropriate for different questions.
- Experience with deep learning and modern neural architectures: understanding of transformer models, embeddings, transfer learning, and how to leverage foundation models for analytical tasks. You know when ML approaches add value over classical methods, and how to integrate them into rigorous analytical workflows.
- Proficiency with data platforms: Microsoft Fabric, Snowflake, Databricks, or similar cloud analytics environments. You're comfortable working with large datasets and can optimize queries for performance.
- Exceptional communication and data storytelling skills: ability to distill complex analyses into clear narratives, create compelling visualizations, lead client meetings, and build trusted relationships with executive audiences. This is non-negotiable.
- Bachelor's degree in Statistics, Mathematics, Economics, Computer Science, or a related quantitative field (or equivalent practical experience).
- Flexibility to work in a hybrid model with periodic travel to client sites as needed.

# Preferred Qualifications

- Experience in Financial Services, Manufacturing, or Energy & Utilities industries.
- Background in experimental design, A/B testing, and causal inference methodologies, including propensity score matching, difference-in-differences, or instrumental variables.
- Hands-on experience with deep learning frameworks (PyTorch, TensorFlow) and neural architectures, including transformers, attention mechanisms, and fine-tuning pretrained models for NLP, time-series, or tabular data applications.
- Experience building AI-assisted analytical workflows: leveraging foundation model APIs, vector databases, and retrieval systems to accelerate insight extraction from unstructured data.
- Experience with Bayesian methods, probabilistic programming (PyMC, NumPyro, etc.), or uncertainty quantification in business contexts.
- Strong visualization and data interface design and development skills using programmatic visualization libraries (Plotly, Altair, D3).
- Proficiency with AI-assisted rapid data application development using Cursor, Lovable, v0, etc.
- Experience with time-series analysis, forecasting methods (ARIMA, Prophet, neural forecasting), and demand planning applications.
- Cloud certifications (Azure Data Scientist, Databricks ML Associate, AWS ML Specialty).
- Consulting experience or demonstrated ability to work across multiple domains and adapt quickly to new
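The experimental-design skills this role asks for (A/B testing, pricing experiments) often reduce to a two-sample comparison of conversion rates. As a minimal illustration, not anything from the posting itself, here is a two-proportion z-test in plain Python; all counts are made up:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates,
    using the pooled standard error as in a standard A/B test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts from a pricing experiment (illustrative only):
z = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level (two-sided)
```

In practice a library routine (e.g. `statsmodels.stats.proportion.proportions_ztest`) would be used, but the arithmetic above is what it computes.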
Feb 28, 2026
Full time
Data Science Manager

Remote type: Hybrid. Location: Belfast - 20 Adelaide Street. Posted Today. Job requisition id: JR-

Huron is a global consultancy that collaborates with clients to drive strategic growth, ignite innovation and navigate constant change. Through a combination of strategy, expertise and creativity, we help clients accelerate operational, digital and cultural transformation, enabling the change they need to own their future. Join our team as the expert you are now and create your future.

We're seeking a Data Science Manager to join the Data Science & Machine Learning team in our Commercial Digital practice, where you'll lead advanced analytics initiatives that transform how Fortune 500 companies make decisions across Financial Services, Manufacturing, Energy & Utilities, and other commercial industries. Managers play a vibrant, integral role at Huron. Their invaluable knowledge is reflected in the projects they manage and the teams they lead. Known for building long-standing partnerships with clients, they collaborate with colleagues to solve their most important challenges. Our Managers also spend significant time mentoring junior staff on the engagement team, sharing expertise, feedback, and encouragement. This promotes a culture of respect, unity, collaboration, and personal achievement. This isn't a reporting role or a dashboard factory: you'll own the full analytics lifecycle from hypothesis formulation through insight delivery, while leading and developing a team of data scientists and analysts. You'll work on problems that matter: experimental designs that validate multi-million-dollar strategies, predictive models that surface hidden patterns in complex data, and deep learning pipelines that extract signal from unstructured text, images, and time-series.
Vice President, Private Credit Investment Analytics
Pantheon
Vice President, Private Credit Investment Analytics

Pantheon has been at the forefront of private markets investing for more than 40 years, earning a reputation for an innovative approach to investing in secondaries, co-investments, and primary fund investments, as well as capital formation across commingled funds, evergreen vehicles and customized solutions. Pantheon currently manages approximately $82.3 billion in AUM across all its strategies, serving more than 750 institutional and 638 private wealth clients worldwide.

The Vice President of Investment Analytics (Private Credit) will lead the development of Pantheon's private credit data and intelligence capability, transforming investment data into structured, decision-ready insight for Portfolio Management teams. This role sits at the intersection of Investment, Data & Operations, and will focus on strengthening the firm's ability to generate consistent, comparable, and actionable intelligence from credit portfolio data across funds, GPs, structures, and geographies. The VP will architect and operate scalable analytical frameworks that standardize how private credit data is defined, sourced, governed, and translated into insight. The role will partner closely with Investment Data Operations and Data Engineering teams to embed credit analytics into Pantheon's evolving data platforms, reducing reliance on bespoke spreadsheets and increasing transparency, comparability, and confidence. This is a high-impact opportunity to shape Pantheon's private credit intelligence capability at a time of significant growth and platform transformation.

Key Responsibilities
- Build private credit investment analytics across Pantheon to help evaluate and monitor investments.
- Act as a senior analytics partner to Investment and Portfolio Management teams, translating complex investment needs into structured data requirements and analytical frameworks.
- Own the definition, sourcing, and validation of investment-relevant datasets, ensuring analytical outputs are complete, comparable, and decision-ready.
- Deliver differentiated, investment-grade insights by combining internal investment operations data with GP-reported and external market data, moving beyond descriptive reporting to comparative, diagnostic, and thematic analysis.
- Serve as the primary data and analytics interface with GPs and external data/technology providers, shaping data standards, templates, and requirements to improve transparency, coverage, and usability.
- Operate Investment Analytics as a business-facing function with a product mindset, creating a roadmap that prioritises initiatives based on investment impact, adoption, and measurable decision enablement.
- Lead, develop, and mentor a high-performing Investment Analytics team, fostering disciplined delivery and strong stakeholder relationships.

Knowledge & Experience Required
- Significant experience in private markets, with strong exposure to private credit, portfolio data, analytics or investment operations.
- Deep understanding of private credit structures, performance drivers, and portfolio monitoring.
- Strong analytical mindset with experience supporting investment or portfolio decision making.
- Strong understanding of data governance, comparability challenges, and portfolio transparency.
- Demonstrated ability to operate at both strategic and hands-on levels.
- Proven ability to partner effectively with investment professionals and influence senior stakeholders.
- Strong written and verbal communication skills; able to articulate complex information clearly and concisely.
- Track record of delivering high-quality outputs under tight timelines.
- Experience leading and developing teams in a collaborative environment.
- Experience working with third-party providers, data vendors and offshore teams.
- Professional qualification (CFA, ACA, ACCA, IMC) or equivalent experience preferred but not essential.
- Familiarity with modern data environments and analytics tools (e.g., Power BI, SQL, Python, Databricks, Snowflake).
- Experience systemizing investment analytics within centralized data platforms.
- Exposure to automation or advanced analytical techniques to enhance portfolio monitoring.
- Demonstrated comfort integrating AI tools into daily analytical workflows (e.g., structured research, synthesis, data interrogation).
- Ability to evaluate AI outputs critically, refine prompts, and build repeatable AI-assisted workflows.

This job description is not to be construed as an exhaustive statement of duties, responsibilities, or requirements. You may be required to perform other job-related duties as reasonably requested by your manager. Pantheon is an Equal Opportunities employer; we are committed to building a diverse and inclusive workforce, so if you're excited about this role but your past experience doesn't perfectly align, we'd still encourage you to apply.
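The "consistent, comparable" data theme in this role typically starts with mapping each GP's reporting template onto one shared schema. A deliberately small Python sketch of that idea follows; the field names and scaling rule are hypothetical, not Pantheon's actual data model:

```python
def normalize_gp_record(raw, field_map, nav_scale=1.0):
    """Map one GP's reporting template onto a shared schema.

    field_map is {standard_field: source_field}; nav_scale converts a GP
    that reports NAV in thousands back to base currency units.
    All names here are invented for illustration.
    """
    out = {std: raw.get(src) for std, src in field_map.items()}
    if out.get("nav") is not None:
        out["nav"] = float(out["nav"]) * nav_scale
    return out

# One GP reports NAV in thousands under its own column headings:
gp_a = {"Fund NAV (000s)": "1250", "Fund Name": "Credit Fund I"}
rec = normalize_gp_record(
    gp_a, {"nav": "Fund NAV (000s)", "name": "Fund Name"}, nav_scale=1000
)
```

Once every GP's template is expressed as a `field_map` plus scaling rules, comparative and thematic analysis can run over one schema instead of bespoke spreadsheets per fund.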
Feb 28, 2026
Full time
Square One Resources
Snowflake Data Engineer
Square One Resources City, London
Job Title: Snowflake Data Engineer
Location: London (2 days on-site per week)
Salary/Rate: £550 - £600 per day inside IR35
Start Date: March
Job Type: Initial 3-6 month contract

Company Introduction
We have an exciting opportunity now available with one of our sector-leading consultancy clients! They are currently looking for a skilled Snowflake Data Engineer to help on their cloud migration project.

Job Responsibilities/Objectives
You will be responsible for designing and building scalable data pipelines, Data Vault/dimensional models, and Snowflake/dbt workloads for cloud migration projects.
- Implement Data Vault 2.0 (Hubs, Links, Satellites) or dimensional models on Snowflake.
- Build ELT pipelines using Snowflake, dbt, and Python/PySpark.
- Develop ingestion from APIs, databases, and streams.
- Optimize Snowflake warehouses, cost, and performance.
- Collaborate with architects, analysts, and DevOps.
- Maintain documentation, lineage, and governance standards.

Required Skills/Experience
The ideal candidate will have the following:
- Strong SQL; Snowflake ELT; dbt experience.
- Python/PySpark, ETL/ELT design.
- Data Vault 2.0 or dimensional modeling.
- AWS services (S3, Glue, Lambda, Redshift) or GCP equivalents.
- Experience with CI/CD for data pipelines.

Good-to-have Skills
Although not essential, the following skills are desired by the client:
- Kafka/Kinesis, Airflow, CodePipeline.
- BI tools (Power BI/Tableau).
- Docker/OpenShift; metadata-driven pipelines.
- 3-8+ years of data engineering experience.
- Cloud data engineering and hands-on Snowflake/dbt exposure.

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.
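The Data Vault 2.0 modelling named in this listing hinges on hash-keyed Hubs, Links, and Satellites, and the hub hash key is conventionally an MD5 of the normalized business key. A stdlib-only Python sketch of that convention, with the table and column names invented for illustration:

```python
import hashlib

def hash_key(*business_keys, delimiter="||"):
    """DV2.0-style hash key: MD5 of trimmed, upper-cased business keys
    joined with a fixed delimiter, so 'c-1001' and ' C-1001 ' land
    deliberately on the same hub row."""
    normalized = delimiter.join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def hub_customer_row(customer_id, record_source):
    """One row for a hypothetical HUB_CUSTOMER table: hash key,
    business key, and load metadata."""
    return {
        "hub_customer_hk": hash_key(customer_id),
        "customer_id": str(customer_id).strip(),
        "record_source": record_source,
    }
```

In a dbt project the same normalization would typically live in a staging model, with the hash computed in Snowflake SQL (e.g. its `MD5` function) so keys match across loads regardless of which pipeline produced them.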
Feb 27, 2026
Contractor
Spectrum IT Recruitment
Solution Architect
Spectrum IT Recruitment
We are seeking an experienced Solution Architect to play a key role in an excellent client's large-scale platform migration programme, focused on consolidating multiple payment and card-processing systems into a unified global ecosystem. The successful candidate will bring expertise in system migrations, ETL/data transformation pipelines, and cross-functional collaboration. You will work with engineering, product, operations, compliance, and partner teams to define and govern architecture that delivers secure, compliant, resilient, and high-performing outcomes. This can be either a mainly remote or a hybrid role, depending on what the successful candidate prefers. As well as a competitive salary, our client offers a comprehensive benefits package.

Key Responsibilities
- Lead solution architecture for the migration and consolidation of regional card-processing and payments platforms into a modern, unified global environment.
- Design and own large-scale data migration and ETL processes, including data mapping, transformation logic, orchestration, validation, lineage, and implementation.
- Translate complex business requirements into: data models and data flow diagrams; sequence diagrams; migration runbooks; integration specifications and architectural patterns.
- Provide hands-on technical guidance to engineering teams during solution buildout, ensuring alignment to architectural standards.
- Collaborate with compliance functions to ensure architectural designs meet regulatory obligations (e.g., PCI-DSS, PSD2, AML, GDPR, electronic money regulations).
- Promote security-by-design across all components, focusing on card data handling, encryption, tokenisation, and access controls.
- Facilitate technical workshops and architecture design reviews with internal teams and external partners.

Key Skills & Experience
- Experience as a Solution Architect in fintech, payments, card issuing/acquiring, or financial services.
- Demonstrated experience with large-scale platform, system, and data migrations.
- Hands-on expertise in designing ETL and data integration pipelines (Python, SQL-based orchestration, cloud-native ETL tools, messaging ingestion).
- Solid understanding of data engineering concepts with practical experience in SQL Server, MongoDB, Synapse, Fabric, and Snowflake.
- Familiarity with card-processing systems, scheme integrations, authorisation flows, transaction life cycles, and settlement processes; knowledge of cryptographic key migrations (e.g., EMV, PEK) is advantageous.
- Experience architecting microservices, REST APIs, event-driven architectures, and secure cloud services (Azure).
- Exceptional communication and collaboration skills, capable of working with senior stakeholders, external partners, and technical teams.
- Ability to balance strategic thinking with hands-on technical problem solving.

Spectrum IT Recruitment (South) Limited is acting as an Employment Agency in relation to this vacancy.
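Since this role owns data mapping, transformation logic, and validation for the migration, the core pattern can be sketched in a few lines of Python. The legacy and target field names below are invented, not the client's schema:

```python
def map_transaction(src):
    """Map a legacy card-transaction record onto a target schema.
    Field names on both sides are hypothetical."""
    return {
        "txn_id": src["TransactionId"],
        "amount_minor": round(float(src["Amount"]) * 100),  # store pence, not pounds
        "currency": src["Currency"].strip().upper(),
    }

def validate_transaction(rec):
    """Post-transformation checks; returns a list of error strings,
    empty when the record is fit to migrate."""
    errors = []
    if not rec["txn_id"]:
        errors.append("missing txn_id")
    if rec["amount_minor"] < 0:
        errors.append("negative amount")
    if len(rec["currency"]) != 3:
        errors.append("currency is not ISO 4217 alpha-3")
    return errors

# A record passes migration only when validation returns no errors:
rec = map_transaction({"TransactionId": "T-1", "Amount": "12.34", "Currency": " gbp "})
assert validate_transaction(rec) == []
```

At real scale the mapping and checks would be driven by configuration and run inside an orchestrated pipeline, but the map-then-validate split, with validation producing explainable errors rather than silent drops, is the part that matters for lineage and reconciliation.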
Feb 27, 2026
Full time
Data Engineer
Youngs Employment Services
Data Engineer
London + 2 or 3 days work from home
Circa £60,000 - £70,000 + Excellent Benefits Package

A fantastic opportunity is available for a Data Engineer who enjoys working in a fast-paced, collaborative, team-oriented environment. Our client has been expanding at a remarkable pace and has transformed its technical landscape with leading-edge solutions. Having implemented a new MS Fabric-based data platform, the need is now to scale up and deliver data-driven insights and strategies right across the business globally. The Data Engineer will be joining a close-knit team that is the hub of our client's global data & analytics operation. Previous experience with MS Fabric would be beneficial but is by no means essential. Interested candidates must have experience in a similar role with MS Azure data platforms, Synapse, Databricks, or other cloud platforms such as AWS, GCP, Snowflake, etc.

Key Responsibilities will include:
- Design, implement, and optimize end-to-end solutions using Fabric components:
  - Data Factory (pipelines, orchestration)
  - Data Engineering (Lakehouse, notebooks, Apache Spark)
  - Data Warehouse (SQL endpoints, schemas, MPP performance tuning)
  - Real-Time Analytics (KQL databases, event ingestion)
- Manage and enhance OneLake architecture, delta lake tables, security policies, and data governance within Fabric.
- Build scalable, reusable data assets and engineering patterns that support analytics, reporting, and machine learning workloads.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Troubleshoot and resolve data-related issues in a timely manner.

Key Experience, Skills and Knowledge
- Proven 2+ years' experience as a Data Engineer or similar role, with a strong focus on PySpark and SQL; Microsoft Azure data platforms and Power BI an advantage.
- Proficiency in development languages suitable for intermediate-level data engineers, such as:
  - Python / PySpark: widely used for data manipulation, analysis, and scripting.
  - SQL: essential for querying and managing relational databases.
- Understanding of D365 F&O data structures is highly desirable.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

This is a hybrid role based in Central / West London with the flexibility to work from home 2 or 3 days per week. Salary will be dependent on experience and is expected to be in the region of £60,000 - £70,000 plus an attractive benefits package including a bonus scheme. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd. YES is operating as both a Recruitment Agency and a Recruitment Business.
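The "troubleshoot and resolve data-related issues" duty above usually starts with cheap batch checks before anything reaches the warehouse. A stdlib-only Python sketch of such a check, with the column names invented for illustration:

```python
def quality_report(rows, key_field):
    """Per-column null counts plus duplicated keys for a batch of records,
    the kind of pre-load check a Lakehouse pipeline might run."""
    null_counts = {}
    seen, duplicates = set(), set()
    for row in rows:
        for col, val in row.items():
            if val is None:
                null_counts[col] = null_counts.get(col, 0) + 1
        key = row[key_field]
        if key in seen:
            duplicates.add(key)
        seen.add(key)
    return {"null_counts": null_counts, "duplicate_keys": sorted(duplicates)}

batch = [
    {"order_id": 1, "customer": "A"},
    {"order_id": 1, "customer": None},   # duplicate key and a null column
    {"order_id": 2, "customer": "B"},
]
report = quality_report(batch, key_field="order_id")
```

In a Fabric or Spark setting the equivalent checks would be expressed over DataFrames, but failing the batch (or routing bad rows to a quarantine table) on a non-empty report is the same design either way.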
Feb 27, 2026
Full time
Software Engineer III- Data Engineer, Java/Python
JPMorgan Chase & Co.
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Corporate Risk Technology, you serve as a seasoned member of an Agile Engineering & Architecture team, designing and delivering trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
  • Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
  • Creates secure, high-quality production code and maintains algorithms that run synchronously with appropriate systems.
  • Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
  • Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
  • Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.
  • Contributes to software engineering communities of practice and events that explore new and emerging technologies.
  • Adds to a team culture of diversity, opportunity, inclusion, and respect.

Required qualifications, capabilities, and skills:
  • Formal training or certification in software engineering concepts and applied experience.
  • Proficiency in data engineering and architecture and AI/ML, with hands-on experience designing, implementing, testing, and ensuring the operational stability of large-scale enterprise platforms and solutions.
  • Advanced in one or more programming languages, e.g. Java, Python, C/C++, C#.
  • Working knowledge of relational and NoSQL databases and data lake architectures.
  • Experience developing, debugging, and maintaining code (preferably in a large corporate environment) with one or more modern programming languages and database querying languages, with good overlap of application and database work.
  • Experience in large-scale data processing using microservices, API design, Kafka, Redis, Memcached, observability tools (Dynatrace, Splunk, Grafana, or similar), and orchestration (Airflow, Temporal).
  • Proficiency in automation and continuous delivery methods.
  • Proficient in all aspects of the Software Development Life Cycle.
  • Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security.
  • Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).
  • Practical cloud-native experience.

Preferred qualifications, capabilities, and skills:
  • Experience with modern data technologies such as Databricks or Snowflake.
  • Hands-on experience with Spark/PySpark and other big data processing technologies.
  • Knowledge of the financial services industry and its IT systems.
Feb 25, 2026
Full time
ARM
AWS Cloud Engineer
AWS Cloud Engineer
6-month contract - Inside IR35 - up to £480 per day
London based - hybrid working - 3 days office based

Responsibilities:
Responsible for the technical delivery of managed services across the customer account base, working as part of a team providing a shared managed service. Expected responsibilities include:
  • Manage and support a customer's AWS and data platform.
  • Be hands-on technically.
  • Provide incident and problem management on the AWS IaaS and PaaS platform.
  • Monitor and maintain observability of system and platform performance.
  • Collaborate with development and build teams on application and platform deployments and changes.
  • Contribute to the resolution of incidents and problems in an efficient and timely manner.
  • Actively monitor the AWS platform and its components for technical issues.
  • Implement and improve the existing monitoring and observability solution.
  • Be involved in the resolution of technical incident tickets.
  • Assist in the root cause analysis of incidents.
  • Assist with improving efficiency and processes within the team.
  • Examine traces and logs.
  • Escalate incidents and problems to the appropriate teams.
  • Work with third-party suppliers and AWS to jointly resolve incidents.

Experience and skills requirements:
Essential:
  • Technical troubleshooting and problem solving.
  • AWS management of large-scale IaaS/PaaS solutions.
  • Monitoring and troubleshooting servers, networks, and applications.
  • Cloud networking and security fundamentals.
  • Collaboration and communication skills.
  • Highly adaptable to changes in a technical environment.
Desirable:
  • Experience using monitoring and observability toolsets, including Splunk and Datadog.
  • Experience using GitHub Actions.
  • Experience using AWS RDS / SQL-based solutions.
  • Experience using containerization in AWS.
  • Working data warehouse knowledge; Redshift and Snowflake preferred.
  • Working with IaC: Terraform and CloudFormation.
  • Working understanding of scripting languages, including Python and shell.
  • Experience working with streaming technologies, including Kafka and Apache Flink.
  • Experience working in ETL environments.
  • Experience working with the Confluent Cloud platform.

Disclaimer: This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT Limited or Advanced Resource Managers Engineering Limited ("ARM"). ARM is a specialist talent acquisition and management consultancy. We provide technical contingency recruitment and a portfolio of more complex resource solutions. Our specialist recruitment divisions cover the entire technical arena, including some of the most economically and strategically important industries in the UK and the world today. We will never send your CV without your permission. Where the role is marked as Outside IR35 in the advertisement, this is subject to receipt of a final Status Determination Statement from the end client and may be subject to change.
Nov 05, 2025
Contractor