Lumanity
Overview / About us

Lumanity is dedicated to improving patient health by accelerating and optimizing access to medical advances. We partner with life sciences companies around the world to generate evidence that demonstrates the value of their products, translate the science and data into compelling product narratives, and enable commercial decisions that position those products for success in the market. We do this through strategic and complementary areas of focus: Strategy & Insights; Value, Access & Outcomes; and Medical Strategy & Communications.

Responsibilities / Position overview

As a member of our Application Development & Innovation team, the Data Scientist will work with complex data to create small language models and will work with a broad range of stakeholders on a variety of commercial data requirements, including Real World Evidence data ingestion and normalization, market and product data to support biopharma commercial product launches, and complex quantitative and qualitative research projects. The role is also responsible for generative AI product development, applying advanced statistical methods to a range of pharmaceutical market research data and using rigorous testing methods. Key responsibilities include:

- Statistical analysis/modelling of multiple data sources: research design (e.g. experiments and surveys) and data pre-processing with a data scientist mentality ("automate & re-use")
- Small language models: prepare and use data to build small language models in conjunction with the Lumanity Application Development & Innovation team
- Segment complex (market research) data: normalize and ingest data into a range of platforms to support its use by custom AI language models
- Work on the design and implementation of advanced statistical modelling and market research techniques, such as segmentation, demand assessment, choice-based modelling, statistical inference, and predictive modelling, contributing to the development of new analytics capabilities
- Articulate advanced statistical and data science modelling features into (software) product requirements
- Work as an integrated, effective team member within a software development process

Qualifications

- Strong degree (2:1) in an analytics-related field (e.g. statistics, mathematics, physics, engineering) and/or a professional qualification
- 5+ years' experience in a research or professional data science-driven services organization
- 5 years' experience in a research-based analytics/statistical project support role
- Proficient analytics toolkit: R, SPSS, Excel, Git, Python, SQL; Postgres and cloud computing expertise desirable
- Consistently able to apply a range of research methods and demonstrate honed analytics skills
- Strong communication and interpersonal skills
- Solid project management skills
- Demonstrated experience building and maintaining client relationships
- Commercially focused mindset
- Coaching, leadership, and management experience

Technical skillset:

- Segmentation and discriminant analysis and tools (essential)
- Predictive modelling (e.g. logistic regression) (essential)
- Choice/allocation-based models (essential)
- Decision/regression trees (essential)
- Key driver analysis methods (essential)
- Multivariate analysis (essential)
- Machine learning/AI methods (essential)
- Experimental design (desirable)
- Bayesian methods (desirable)
- Time series analytics (desirable)
- Text analytics (desirable)
- Data science best practices (e.g. version control, scripts and libraries, reproducibility) and structured data wrangling (desirable)
- BI and web-app development (desirable)
- Unstructured data wrangling (desirable)

Benefits

We offer our employees a comprehensive benefits package that focuses on what matters to you - health and well-being, personal finances, professional development, and a healthy work/life balance:

- Competitive salary plus annual bonus scheme
- Private health insurance plus enhanced dental and optical cover
- Generous pension scheme
- Generous number of days' paid holiday
- Enhanced maternity and paternity pay for employees with 2+ years of service
- Access to a comprehensive Mortgage Advisor Service
- Group income protection
- Life assurance coverage at 4x base salary
- EV car scheme, and more
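To make the essential skillset above concrete, here is a minimal, purely illustrative sketch of segmentation (k-means clustering) followed by predictive modelling (logistic regression) in Python with scikit-learn. The "survey" data, the number of clusters, and the binary outcome are all hypothetical choices for the sketch, not drawn from any Lumanity project.

```python
# Illustrative only: segment synthetic survey respondents with k-means,
# then fit a logistic regression as a simple predictive model.
# All data and column meanings are hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical survey features: four attitude questions on a 1-7 scale
X = rng.integers(1, 8, size=(200, 4)).astype(float)

# Standardize before clustering so no single question dominates distances
X_std = StandardScaler().fit_transform(X)

# Segmentation: partition respondents into 3 segments (k chosen for illustration)
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_std)

# Hypothetical binary outcome (e.g. "would adopt"), driven mostly by question 1
y = (X[:, 0] + rng.normal(0, 1, 200) > 4).astype(int)

# Predictive modelling: logistic regression on the standardized features
model = LogisticRegression().fit(X_std, y)

print(len(segments))   # one segment label per respondent
```

In practice the segment labels would then be profiled against the raw survey answers, and the model validated on held-out data rather than scored in-sample.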
Lumanity
Overview / About Lumanity

Lumanity is dedicated to improving patient health by accelerating and optimizing access to medical advances. We partner with life sciences companies around the world to generate evidence that demonstrates the value of their products, translate the science and data into compelling product narratives, and enable commercial decisions that position those products for success in the market. We do this through strategic and complementary areas of focus: Strategy Consulting & Insights; Value, Access & Outcomes; and Medical Strategy and Communications.

Responsibilities / Position overview

As a Senior Data Engineer, you will work closely with another Senior Data Engineer to manage and optimize the company's data infrastructure. Your primary focus will be on integrating and supporting our enterprise systems, including NetSuite and Kantata (our PSA tool), as well as our Azure-based data platform. You will share responsibility for maintaining our Azure SQL Data Warehouse, orchestrating ETL pipelines using Azure Data Factory, and ensuring the Finance team has access to clean, timely data for Power BI reporting. The role involves working with diverse data formats (relational databases, hierarchical JSON/XML, delimited files, Excel) and using programming languages such as Python, Java, and (optionally) C/C++ to build and maintain robust data integrations across multiple SaaS platforms. Ensuring data consistency, quality, and flow will be at the heart of your work. In addition, you'll collaborate with IT, business analysts, finance, and our managed services provider (MSP) for NetSuite to deliver scalable, compliant data solutions that empower data-driven decision-making across the business. Key responsibilities include:

- Azure Data Platform Management: Share responsibility for the administration and optimization of the company's Azure SQL Data Warehouse. Develop, maintain, and monitor Azure Data Factory pipelines for data extraction, transformation, and loading (ETL) from NetSuite, Kantata, and other enterprise systems. Ensure data availability, accuracy, and performance for enterprise reporting needs.
- Data Pipeline Development & Optimization: Design, build, and maintain scalable, robust data pipelines to integrate data from ERP and other enterprise systems.
- ERP System Integration: Collaborate with ERP specialists to ensure seamless integration of data from ERP systems (Oracle NetSuite and Kantata) and other enterprise systems into centralized data models.
- Enable BI Reporting: Work alongside our reporting team to ensure business goals and needs are met through appropriate, timely, and accurate data provision.
- ETL Processes: Develop and maintain ETL processes to ensure efficient data extraction, transformation, and loading into data environments.
- Data Modelling & Architecture: Design and implement data models that meet business requirements, focusing on efficiency, reliability, and scalability.
- Automation & Continuous Improvement: Automate routine data processing tasks to improve efficiency and accuracy across systems, identifying areas for optimization and improvement.
- Data Governance & Quality: Establish and enforce data quality standards, monitoring the accuracy and integrity of data across systems. Represent data engineering on the change advisory board. Ensure compliance with regulatory requirements specific to the life sciences industry.
- Collaboration: Work closely with business analysts and key stakeholders to understand data requirements and deliver solutions that support analytics, reporting, and business operations.
- Documentation & Best Practices: Create detailed documentation of processes, data flows, and system integrations. Promote best practices in data engineering and system integration across the organization.
- Support & Troubleshooting: Provide technical support for data-related issues within dataflows and enterprise systems, ensuring minimal downtime and continuity of operations.
- Security & Compliance: Ensure data security best practices are followed and that all processes comply with applicable regulations such as GDPR, HIPAA, and other life sciences-specific requirements.

Qualifications

- Bachelor's degree in computer science, data science, engineering, or a related field, an equivalent college qualification, or 5 years of equivalent work experience
- 5+ years of relevant experience in data engineering and data warehousing
- Experience designing and implementing data models for enterprise data initiatives
- Demonstrated experience leading projects involving data warehousing, data modelling, and data analysis
- Proficiency in programming languages such as Java, Python, and C/C++, and in tools such as SQL, R, SAS, or Excel
- Proficiency in the design and implementation of modern data architectures and concepts leveraging Azure cloud services, real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks)
- Proficiency with relational database technologies and SQL programming, including writing complex views, stored procedures, and database triggers
- Understanding of entity-relationship modelling, metadata systems, and data quality tools and techniques
- Experience with business intelligence tools and technologies such as Azure Data Factory, Power BI, and Tableau
- Aptitude for learning and adopting new technology, especially in the ML/AI realm
- Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and other stakeholders

Benefits

We offer our employees a comprehensive benefits package that focuses on what matters to you - health and well-being, personal finances, professional development, and a healthy work/life balance:

- Competitive salary plus annual bonus scheme
- Private health insurance plus enhanced dental and optical cover
- Generous pension scheme
- Generous amount of paid holiday days
- Enhanced maternity and paternity pay for employees with 2+ years of service
- Access to a comprehensive Mortgage Advisor Service
- Group income protection
- Life assurance coverage at 4x base salary
- EV car scheme, and more
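To illustrate the kind of ETL transformation step this role describes, the sketch below flattens hierarchical JSON, of the sort a SaaS system's API might return, into tabular rows ready for staging in a warehouse table. The record layout and field names here are invented for the example and do not reflect NetSuite's or Kantata's actual schemas.

```python
# Minimal ETL sketch: transform hierarchical JSON (one record per order,
# with nested customer and line-item data) into flat rows for a warehouse
# load. Field names are hypothetical, not a real ERP payload.
import json

raw = json.loads("""
[
  {"id": 1, "customer": {"name": "Acme", "region": "EMEA"},
   "lines": [{"sku": "A-100", "amount": 250.0},
             {"sku": "B-200", "amount": 75.5}]},
  {"id": 2, "customer": {"name": "Beta", "region": "US"},
   "lines": [{"sku": "A-100", "amount": 120.0}]}
]
""")

def flatten(orders):
    """Emit one row per order line, repeating the parent order's fields."""
    rows = []
    for order in orders:
        for line in order["lines"]:
            rows.append({
                "order_id": order["id"],
                "customer": order["customer"]["name"],
                "region": order["customer"]["region"],
                "sku": line["sku"],
                "amount": line["amount"],
            })
    return rows

rows = flatten(raw)
print(len(rows))                       # 3 flat rows staged for loading
print(sum(r["amount"] for r in rows))  # 445.5 - a quick reconciliation check
```

In an Azure Data Factory context, logic like this would typically live in a mapping data flow or a pipeline activity, with the flattened rows landed in the SQL Data Warehouse for Power BI reporting.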