Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

2 jobs found

Current Search
actuarial data analyst
Data Engineer
HDI
About us
HDI is a Corporate & Specialty Insurer and part of the Talanx Group. With over 120 years of experience, HDI operates across five continents and around 40 countries, and employs over 5,000 people worldwide.

The role
The Data Engineer is responsible for designing, building, and maintaining robust, scalable data pipelines and cloud-based data infrastructure to support analytics, reporting, data modelling, underwriting insights, and regulatory needs across HDI UK&I. The role ensures timely, trusted, well-structured data delivery into the appropriate data marts and warehouses, and into downstream data feeds used by Actuarial, Finance, Operations, and Underwriting. The position forms a core part of HDI's data transformation agenda, enabling improved decision making, automation, and analytics maturity.

Key accountabilities
  • Design, develop, and maintain end-to-end ingestion pipelines into appropriate data technologies such as Snowflake, from internal systems (policy admin, claims, finance) and external data sources.
  • Build orchestrated ELT/ETL processes using modern tooling and best-practice engineering patterns.
  • Implement incremental refresh, schema evolution management, and data validation tests.
  • Ensure data availability aligned to business SLAs (e.g. daily refresh for actuarial and finance repositories).

Data Modelling & Warehouse Development
  • Create well-structured dimensional and relational data models for analytical use cases.
  • Develop canonical, reusable datasets (curated marts) for Analytics, Actuarial, and Finance.
  • Own the technical modelling layer in Snowflake, including schema design, performance optimisation, cost control, and warehouse governance.
  • Collaborate closely with Analytics Engineers using dbt, ensuring transformations are production-grade, tested, and fully documented.

Data Quality, Testing & Governance
  • Implement automated testing suites, data contracts, lineage, and monitoring frameworks.
  • Partner with Data Governance to embed quality rules, SLAs, and metadata standards into pipelines.
  • Resolve data quality issues proactively and own improvements to source-to-target data flows.

Cross-Functional Collaboration
  • Work with business areas as needed to supply structured datasets for relevant business processes.
  • Drive the building of automated, trusted data feeds for analytics requirements.
  • Partner with Data Analysts to accelerate dashboarding and advanced analytics.
  • Collaborate with Technology teams to ensure secure, reliable platform operation.

Performance & Cost Optimisation
  • Optimise Snowflake/SQL/Python query performance, warehouse sizing, storage costs, and compute efficiency.
  • Implement workload separation, time-travel optimisation, clustering, and pruning strategies.

Documentation & Knowledge Sharing
  • Produce comprehensive documentation for pipelines, data models, data flows, and architecture components.
  • Provide technical guidance to junior team members and evangelise engineering best practices.

Skills & experience

Technical Skills
  • Expert SQL engineering capability.
  • Advanced experience with schemas, warehouses, stages, tasks, streams, and performance tuning.
  • Experience of modern transformation frameworks (Snowflake/dbt preferred, but not essential).
  • Python for scripting, automation, and orchestration.
  • Experience with CI/CD pipelines (GitHub Actions / Azure DevOps), code reviews, and versioning.
  • Strong understanding of data modelling, data warehousing patterns, and ELT best practice.
  • Familiarity with Power BI or BI model structures to support downstream analytics.
  • Cloud platform experience (Azure preferred).

Business & Domain Skills
  • Prior experience in insurance, especially commercial/specialty lines, claims, actuarial, or finance data structures.
  • Understanding of regulatory expectations around data quality, lineage, and auditability (desired but not essential).

Professional
  • Degree in Computer Science, Engineering, Mathematics, or similar (or equivalent professional experience).
  • dbt certification beneficial.
  • Snowflake certifications advantageous.

Other
As an equal opportunities employer, we are committed to creating an inclusive environment for all employees, recognising that a diverse and inclusive workplace is a creative and prosperous one. If you require support with your application, please contact UK&.
Apr 17, 2026
Full time
Actuary - 29436
The Emerald Group
They are seeking an Actuary to join their growing Data and Analytics team.

Location: London
Category: Non-life Actuarial
Type: Permanent

Key Duties (including but not limited to):
  • Develop, maintain, and enhance pricing models for various lines of business, ensuring they reflect current market conditions, claims experience, and risk appetite, working with project managers, business analysts, developers, and QA testers.
  • Train underwriters on pricing models and assist with the evaluation of large individual accounts or portfolio deals, incorporating exposure and experience rating.
  • Analyse loss data, bordereaux, and claims information to inform pricing assumptions, identify emerging risks, and recommend ongoing data requirements to improve future pricing.
  • Evaluate systems and data sources for suitability and potential integration into pricing solutions.

Requirements:
  • Fellow of the Institute & Faculty of Actuaries or equivalent.
  • Extensive insurance pricing experience in the London Market, particularly Lloyd's.
  • Strong technical pricing skills across multiple classes of business (Property and Marine knowledge preferred).
  • Advanced Excel and VBA skills, with proficiency in Word and PowerPoint.
Apr 07, 2026
Full time


© 2008-2026 Jobsite Jobs