• Home
  • Search Jobs
  • Register CV
  • Post a Job
  • Employer Pricing
  • Contact Us
  • Sign in
  • Sign up
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

4 jobs found

Current Search
athena data engineer
MCS Group
Contract Data Engineer
MCS Group
MCS Group is working with an AI-enabled software house seeking an experienced Contract Data Engineer to support a key automation initiative. The role focuses on building scalable automation across their data platform while supporting the migration and modernisation of existing data infrastructure. This is an excellent contract for someone who has previously worked on data migration, cloud transformation, or platform modernisation projects, and who enjoys building robust, automated data solutions in a cloud-first environment.

The Role
  • Design, build, and maintain scalable data pipelines and automation for data ingestion, transformation, and delivery.
  • Develop and optimise SQL-based data models across large relational databases.
  • Build and manage AWS-based data infrastructure using services such as Redshift, Glue, Lambda, S3, and Athena.
  • Implement and maintain ETL/ELT pipelines to integrate data from multiple systems.
  • Support data platform migration and modernisation initiatives.
  • Ensure high standards of data quality, performance optimisation, and governance.
  • Contribute to DevOps and automation practices, including infrastructure-as-code and CI/CD where appropriate.

The Person
  • Extensive proven experience as a Data Engineer (essential).
  • Strong hands-on expertise with large relational databases and advanced SQL optimisation.
  • Experience with AWS data services (essential), including Redshift, Glue, Lambda, S3, and Athena.
  • Strong experience building ETL/ELT pipelines and data integration workflows.
  • Experience with Python, Spark, or other data processing technologies.
  • Familiarity with Terraform, CI/CD pipelines, and infrastructure-as-code practices.

IR35: Outside
Duration: 6 months
Rate: £410 per day
Location: Belfast or London (please note, in-office working is required in either location)

To speak in absolute confidence about this opportunity, please send an up-to-date CV via the link provided or contact Jill Johnston, Head of IT Contracts, at MCS Group. Even if this position is not right for you, we may have others that are; please visit MCS Group to view a wide selection of our current jobs. MCS Group is committed to Equality, Diversity, and Inclusion for all, and was the first recruitment agency in NI to achieve Bronze Diversity Mark accreditation. If you have a disability which means you require a reasonable adjustment at any stage of the recruitment process, please contact us and we will endeavour to facilitate the request. Not all agencies are the same: MCS Group is passionate about providing a first-class service to all our customers and has an independent review rating of 4.9 stars on Google.
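The ingestion-transformation-delivery pipeline the role describes can be sketched in plain Python. This is a toy illustration only: the field names and CSV shape are invented for the example, and a production build would run as a Glue/Spark job over S3 rather than over in-memory lists.

```python
import csv
import io

def ingest(raw_csv: str) -> list[dict]:
    """Parse raw CSV rows into dictionaries (the ingestion stage)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalise types and drop invalid records (the transformation stage)."""
    out = []
    for row in rows:
        try:
            out.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine bad rows for review
    return out

def deliver(rows: list[dict]) -> dict:
    """Aggregate cleaned rows for downstream consumers (the delivery stage)."""
    return {"rows": len(rows), "total": sum(r["amount"] for r in rows)}

raw = "id,amount\n1,10.5\n2,oops\n3,4.5\n"
result = deliver(transform(ingest(raw)))  # the bad row "2,oops" is dropped
```

The same three-stage shape scales up directly: each function becomes a PySpark transformation, and the quarantine branch feeds the data-quality and governance checks the ad calls for.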
Apr 03, 2026
Full time
Head Resourcing
Technical Lead - Data
Head Resourcing Leeds, Yorkshire
Data Technical Lead - Leeds - up to £100K plus benefits (hybrid working)

Head Resourcing are delighted to be working with our client to recruit an experienced Technical Lead - Data. This is a fantastic opportunity to lead a high-performing Data Engineering team while owning a strategic data roadmap within a forward-thinking organisation.

The Role
  • Lead, mentor, and develop a team of Data Engineers
  • Own and drive the data engineering roadmap
  • Act as the technical authority for data architecture and design
  • Deliver scalable, secure AWS-based data platforms (Lakehouse)
  • Collaborate with stakeholders across technology and the wider business

Key Skills
  • Proven leadership experience within Data Engineering teams
  • Strong hands-on expertise with AWS (Lambda, Glue, S3, Athena, Step Functions)
  • Proficiency in Python, SQL, Terraform and Git
  • Experience delivering complex data solutions and influencing stakeholders
  • Strong understanding of data pipelines, testing, monitoring and data quality

Desirable: data migrations, modern architectures (Lakehouse/serverless), CI/CD & DevOps, and regulated environments.

This is a key opportunity to shape data strategy, architecture, and engineering best practices while driving impactful transformation initiatives.
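The "testing, monitoring and data quality" skill in the list above can be made concrete with a small rule-based validator that counts failures per check. The column names and rules here are invented for illustration; real platforms would express the same idea with a framework such as Great Expectations or dbt tests.

```python
def run_quality_checks(rows: list[dict]) -> dict[str, int]:
    """Return the number of rows failing each named data-quality rule."""
    checks = {
        "missing_id": lambda r: r.get("id") is None,
        "negative_amount": lambda r: (r.get("amount") or 0) < 0,
    }
    # One pass per rule: counts feed a monitoring dashboard or alert threshold.
    return {name: sum(1 for r in rows if rule(r)) for name, rule in checks.items()}

rows = [
    {"id": 1, "amount": 100.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2.0},
]
failures = run_quality_checks(rows)  # one failure per rule in this sample
```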
Apr 02, 2026
Full time
Rackspace
Solution Director (Analytics & AI/ML)
Rackspace
Rackspace Technology is a leading provider of expertise and managed services across all the major public and private cloud technologies. We've evolved Fanatical Support to encompass the entire customer journey, providing Fanatical Experience from first consultation to daily operations. Our passionate experts combine the power of proactive, always-on service and expertise with best-in-class tools and automation to deliver technology when and how our customers need it.

We are seeking a highly accomplished Solution Director (Analytics & AI/ML) to lead the design and sales of two critical solution portfolios: generative AI/LLM solutions and data modernization/Lakehouse architectures on AWS. This pivotal role requires mastery of both domains: leveraging generative AI capabilities (Amazon Q, Amazon Bedrock, QuickSight) to drive executive conversations and opportunity creation, while delivering enterprise data modernization through Lakehouse architectures using AWS native services (Glue, SageMaker Unified Studio) and leading platforms (Databricks on AWS, Snowflake on AWS). This is a presales role that demands cross-functional experience with a proven ability to engage C-level stakeholders, drive top-of-funnel opportunity creation, and maintain comprehensive account ownership across the entire customer lifecycle. The ideal candidate will excel at both selling the vision of generative AI transformation and delivering the reality of enterprise data modernization, combining deep technical expertise with exceptional business acumen and executive presence.

Responsibilities

Strategic Leadership & Opportunity Development
  • Drive top-of-funnel opportunity creation through two parallel tracks: engaging C-level stakeholders with generative AI demonstrations (Amazon Q, Amazon Bedrock) and identifying data modernization needs for Lakehouse transformations.
  • Lead the design and architecture of dual solution portfolios. Generative AI solutions: Amazon Bedrock implementations, Amazon Q deployments, QuickSight with Q capabilities, RAG architectures, and custom LLM solutions. Data modernization: enterprise Lakehouse architectures using AWS Glue, SageMaker Unified Studio, Databricks on AWS, and Snowflake on AWS.
  • Act as the trusted advisor, positioning generative AI as the transformational vision while grounding delivery in robust data platform modernization.
  • Develop compelling business cases that connect AI aspirations with practical data foundation requirements, demonstrating ROI across both portfolios.
  • Stay current with advancements in generative AI (foundation models, LLMs) and modern data architectures (Lakehouse patterns, data mesh, unified analytics).
  • Contribute to Rackspace's intellectual property through reference architectures covering both generative AI implementations and Lakehouse design patterns.
  • Mentor and provide leadership to Solution Architects by guiding technical development and fostering skill growth across both generative AI and data modernization solution areas.

Customer Engagement & Solution Delivery
  • Serve as the primary technical lead orchestrating both generative AI discussions and data modernization programs for strategic accounts.
  • Build strategic relationships using two engagement models. Executive level: Amazon Q demonstrations, QuickSight analytics with generative BI, art-of-the-possible sessions. Technical level: Lakehouse architecture workshops, platform assessments (Databricks vs Snowflake vs AWS native), migration planning.
  • Lead comprehensive consultative engagements that begin with the generative AI vision (Amazon Q, Bedrock) and translate into concrete data modernization roadmaps.
  • Develop proposals that balance innovative AI capabilities with foundational data platform requirements.
  • Guide customers through parallel journeys: generative AI adoption (POCs to production) and data platform modernization (legacy to Lakehouse).
  • Collaborate with sales teams to position both solution portfolios strategically based on customer maturity and needs.

Technical Excellence & Market Awareness
  • Maintain deep expertise across both solution domains: 1) Generative AI: Amazon Bedrock, Amazon Q, QuickSight Q, SageMaker JumpStart, prompt engineering, RAG architectures, vector databases. 2) Data platforms: AWS Glue, SageMaker Unified Studio, Databricks on AWS, Snowflake on AWS, Redshift, EMR, Apache Iceberg, Delta Lake.
  • Position AWS solutions effectively against other cloud platforms' offerings in both generative AI (Azure OpenAI, Vertex AI) and data platforms (Azure Synapse, BigQuery).
  • Guide architectural decisions on build vs. buy for both AI capabilities and data platform components.

Required Experience
  • Deep experience with generative AI technologies: Amazon Bedrock, Amazon Q, LLM architectures, RAG implementations.
  • Proven track record delivering data modernization: Lakehouse architectures, Databricks and/or Snowflake implementations, AWS Glue/EMR deployments.
  • A bachelor's degree in Computer Science, Data Science, Engineering, Mathematics, or a related technical field. At the manager's discretion, additional relevant experience may substitute for the degree requirement.
  • A minimum of 15 years of enterprise solution architecture experience.
  • A minimum of 8 years of public cloud experience.
  • A minimum of 5 years as a senior-level architect or solutions leader with hands-on experience in both AI/ML and data platform modernization.
  • Proven presales/sales engineering experience.
  • Demonstrated success in engaging C-level executives using generative AI demonstrations while delivering complex data platform transformations.
  • Strong understanding across the full spectrum. AI/ML: generative AI, foundation models, LLMs, traditional ML, prompt engineering, fine-tuning. Data platforms: Lakehouse architectures, data mesh, ETL/ELT, streaming, data governance, data quality.
  • Proficiency in Python, SQL, and Spark, with hands-on experience in generative AI (LangChain, vector databases, embedding models) and data engineering (PySpark, Apache Iceberg/Delta Lake, orchestration tools).
  • A proven ability to articulate both visionary AI possibilities and practical data platform requirements to diverse audiences.

Preferred Qualifications
  • An advanced degree (Master's or PhD) in a relevant field.
  • Experience with AWS professional services or the AWS partner ecosystem across both AI and data domains.
  • Multiple Lakehouse platforms: Databricks, Snowflake, AWS native (Glue + Athena + Redshift).
  • Multiple AI platforms: AWS Bedrock, Azure OpenAI, Google Vertex AI.
  • Industry certifications: AWS Solutions Architect Professional, Machine Learning Specialty, Data Analytics Specialty; platform-specific: Databricks Certified, Snowflake SnowPro.
  • Experience with regulated industries requiring governance for both AI and data platforms.
  • Track record building practices that deliver both generative AI solutions and data modernization programs.
  • Published thought leadership in generative AI applications and/or modern data architectures.
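The RAG (retrieval-augmented generation) architectures named throughout this role follow one core pattern: retrieve the most relevant document for a question, then assemble it into a grounded prompt for an LLM. The sketch below is a deliberately naive illustration using word overlap as the relevance score; real systems use embedding models and a vector database (for example via Amazon Bedrock), and the document texts here are invented.

```python
def retrieve(question: str, docs: list[str]) -> str:
    """Pick the document sharing the most words with the question (toy scoring)."""
    q = set(question.lower().split())
    return max(docs, key=lambda d: len(q & set(d.lower().split())))

def build_prompt(question: str, docs: list[str]) -> str:
    """Assemble retrieved context plus question, so the model answers from it."""
    context = retrieve(question, docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Athena runs SQL queries directly over data stored in S3.",
    "Bedrock provides managed access to foundation models.",
]
prompt = build_prompt("How does Athena query data in S3?", docs)
```

Swapping the overlap score for embedding similarity, and the list for a vector store, turns this toy into the production architecture the ad describes without changing its shape.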
Apr 01, 2026
Full time
Randstad Technologies Recruitment
Lead PySpark Engineer
Randstad Technologies Recruitment City, London
Lead PySpark Engineer

As the Technical Lead, you will drive the high-stakes migration of legacy SAS analytics to a modern, cloud-native PySpark ecosystem on AWS. This isn't just a lift and shift: you will refactor complex procedural logic into scalable, production-ready distributed pipelines for a Tier-1 financial services environment.

Core Responsibilities
  • Engineering Leadership: Design and develop complex ETL/ELT pipelines and data marts using PySpark, EMR, and Glue.
  • Legacy Modernisation: Architect the conversion of SAS Base/macros into modular, testable Python code using SAS2PY and manual refactoring.
  • Performance Tuning: Optimise Spark execution (partitioning, shuffling, caching) to ensure cost-efficient processing of massive financial datasets.
  • Quality & Governance: Implement rigorous CI/CD, unit testing, and data reconciliation frameworks to ensure "penny-perfect" accuracy.

Technical Stack
  • Engine: PySpark (expert), Python (clean code/SOLID principles).
  • AWS: EMR, Glue, S3, Athena, IAM, Lambda.
  • Data Modelling: SCD Type 2, fact/dimension tables, Data Vault/star schema.
  • Legacy: Proficiency in reading/debugging SAS (Base, macros, DI Studio).
  • DevOps: Git-based workflows, Jenkins/GitLab CI, Terraform.

Randstad Technologies is acting as an Employment Business in relation to this vacancy.
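The SCD Type 2 pattern listed in the stack above is worth spelling out: when a tracked attribute changes, the current dimension row is closed out rather than overwritten, and a new current version is appended, preserving full history. This sketch uses plain Python dicts with invented field names; in the role itself this would be a PySpark merge over a Lakehouse table.

```python
def scd2_upsert(history: list[dict], update: dict, as_of: str) -> list[dict]:
    """Apply one change to a customer dimension, keeping full version history."""
    for row in history:
        if row["customer_id"] == update["customer_id"] and row["is_current"]:
            if row["city"] == update["city"]:
                return history          # no change: nothing to version
            row["is_current"] = False   # close the old version...
            row["valid_to"] = as_of
    history.append({**update, "valid_from": as_of, "valid_to": None,
                    "is_current": True})  # ...and append the new current one
    return history

history = [{"customer_id": 7, "city": "Leeds", "valid_from": "2024-01-01",
            "valid_to": None, "is_current": True}]
history = scd2_upsert(history, {"customer_id": 7, "city": "London"}, "2026-03-15")
# history now holds two versions: Leeds (closed) and London (current)
```

The "penny-perfect" reconciliation the ad demands is exactly why this pattern matters in finance: historical reports must replay against the dimension values that were current at the time.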
Mar 15, 2026
Contractor


  • Home
  • Contact
  • About Us
  • Terms & Conditions
  • Privacy
  • Employer
  • Post a Job
  • Search Resumes
  • Sign in
  • Job Seeker
  • Find Jobs
  • Create Resume
  • Sign in
  • Facebook
  • Twitter
  • Google Plus
  • LinkedIn
Parent and Partner sites: IT Job Board | Jobs Near Me | RightTalent.co.uk | Quantity Surveyor jobs | Building Surveyor jobs | Construction Recruitment | Talent Recruiter | Construction Job Board | Property jobs | myJobsnearme.com | Jobs near me
© 2008-2026 Jobsite Jobs | Designed by Web Design Agency