• Home
  • Search Jobs
  • Register CV
  • Post a Job
  • Employer Pricing
  • Contact Us
  • Sign in
  • Sign up
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

2 jobs found

Email me jobs like this
Refine Search
Current Search
global equities analyst
Senior Quant Analyst / Research Scientist (Contract)
CFA Institute
Senior Quant Analyst / Research Scientist (Contract)
Location: London (cross-Atlantic collaboration)
Contract: 12 months (strong potential for extension or a permanent role)

Overview
A leading global asset management firm is seeking a senior buy-side Quant Analyst / Research Scientist to support its Fund-of-Funds portfolio within a Front Office-aligned Advanced Analytics function. This is a high-impact contract mandate focused on:
  • Hybrid annuity asset allocation modelling
  • Cashflow forecasting
  • Quadratic and convex optimisation
  • Portfolio construction frameworks
  • Python-based prototyping and deployment
The role sits at the intersection of quant research, front office portfolio management, and technology deployment, contributing directly to investment strategy and production-ready solutions. You will work within a high-profile Applied R&D environment supporting Active Equities, Fixed Income, Risk Management, Corporate Finance, and broader multi-asset strategies.

Core Mandate
The primary objective is to design and prototype quantitative models that support:
  • Hybrid and annuity-style asset allocation
  • Fund-of-funds portfolio construction
  • Cashflow forecasting and liability-style modelling
  • Constrained and quadratic optimisation problems
This is fundamentally a modelling-first mandate, requiring deep applied mathematics capability and hands-on implementation in Python. AI/ML exposure is desirable but secondary; the role is not a pure AI engineering position.

Key Responsibilities
  • Develop and implement quantitative models for hybrid annuity asset allocation
  • Solve quadratic, convex, and mixed-integer optimisation problems
  • Apply portfolio construction standards, including Markowitz / Modern Portfolio Theory, Black-Litterman, and factor models
  • Forecast portfolio cashflows and support annuity-style allocation structures
  • Build robust prototype frameworks in Python
  • Create comprehensive evaluation frameworks, including out-of-sample validation, simulation, and back-testing
  • Analyse model performance and robustness
  • Collaborate directly with front office PMs on assumptions and outputs
  • Engage with quant research and AI teams to industrialise modelling solutions
  • Ensure models are production-ready and operationalised effectively within internal systems

Technical Environment
  • Python (production-level proficiency required)
  • Cloud-based research and development platforms: SageMaker, Databricks
  • Enterprise data infrastructure: Snowflake
  • Systematic research and quantitative workflows
  • Investment management datasets across multi-asset strategies

Required Experience
  • Senior quant experience within buy-side asset management or fund-of-funds environments
  • Strong background in mathematical optimisation and applied modelling
  • Portfolio construction / asset allocation expertise
  • Hands-on Python development capability (not purely supervisory)
  • Experience with equities; fixed income or hybrid portfolio exposure strongly preferred
  • Experience creating model evaluation frameworks (out-of-sample validation, simulation, back-testing)
  • Experience working with investment management data
  • Ability to read and computationally reproduce academic research
  • Experience translating research outputs into production-grade solutions
  • Comfort collaborating across international teams (London / US)

Desirable Experience
  • Exposure to ML / deep learning architectures
  • Experience integrating AI/ML prototypes into production environments
  • Multi-asset, insurance, or annuity product exposure
  • Experience working alongside Front Office technology teams or PM management tools
  • CFA participation or strong applied financial markets knowledge
  • Graduate degree in a STEM discipline, or equivalent industrial research experience
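As a rough illustration of the Markowitz-style quadratic optimisation this mandate describes, the sketch below finds the minimum-variance fully-invested portfolio for a toy three-asset covariance matrix. The covariance values are invented for demonstration, and this is only the simplest special case (a single equality constraint with a closed-form solution), not the firm's actual models.

```python
import numpy as np

# Hypothetical annualised covariance matrix for three assets (made-up data).
cov = np.array([
    [0.040, 0.006, 0.002],
    [0.006, 0.090, 0.003],
    [0.002, 0.003, 0.0625],
])

# Minimum-variance weights subject to sum(w) == 1 have the closed form
#   w* = (Sigma^-1 @ 1) / (1' @ Sigma^-1 @ 1)
ones = np.ones(cov.shape[0])
inv_cov_ones = np.linalg.solve(cov, ones)
weights = inv_cov_ones / (ones @ inv_cov_ones)

# Resulting portfolio variance w' @ Sigma @ w.
portfolio_var = weights @ cov @ weights
print(weights, portfolio_var)
```

Real mandates of this kind add inequality constraints (long-only, position limits) and expected-return terms, at which point a numerical QP solver replaces the closed form.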
Mar 16, 2026
Full time
Data Engineer, Unified Platform
P2P
Overview
DRW is a diversified trading firm with over three decades of experience bringing sophisticated technology and exceptional people together to operate in markets around the world. We value autonomy and the ability to quickly pivot to capture opportunities, so we operate using our own capital and trading at our own risk. Headquartered in Chicago with offices throughout the U.S., Canada, Europe, and Asia, we trade a variety of asset classes including Fixed Income, ETFs, Equities, FX, Commodities and Energy across all major global markets. We have also leveraged our expertise and technology to expand into three non-traditional strategies: real estate, venture capital and cryptoassets. We operate with respect, curiosity and open minds. The people who thrive here share our belief that it's not just what we do that matters; it's how we do it. DRW is a place of high expectations, integrity, innovation and a willingness to challenge consensus.

Role
As a Data Engineer on our Data Experience team, you will play an integral role in bringing vendor datasets into our data platform, governing our centralized data pipelines, supporting rapid data product development, and working alongside individual Traders, Quantitative Researchers, and Back-Office personnel to make the best use of the firm's data and platform tools.

Technical Requirements Summary
  • Experience designing and building data pipelines
  • Experience working within modern batch or streaming data ecosystems
  • Expertise in SQL, plus experience in Java or Python
  • Ability to apply data modeling techniques
  • Ability to own the delivery of data products, working with analysts and stakeholders to understand requirements and implement solutions
  • Ability to contribute to project management and project reporting

What you will do in this role
  • Help model, build, and manage data products built atop DRW's Unified Data Platform.
  • Work closely with Data Strategists to determine appropriate data sources and implement processes to onboard and manage new data sources for trading, research, and back-office purposes.
  • Contribute to data governance processes that enable discovery, cost-sharing, usage tracking, access controls, and quality control of datasets to address the needs of DRW trading teams and strategies.
  • Continually monitor data ingestion pipelines and data quality to ensure the stability, reliability, and quality of the data. Contribute to the monitoring and quality-control software and processes.
  • Own the technical aspects of vendor ingestion pipelines: coordinate with vendor relationship managers on upcoming changes, perform routine data operations without breaking internal users, and contribute to the team's on-call rotation to respond to unanticipated changes.
  • Rapidly respond to user requests, identifying platform gaps and self-service opportunities that make the user experience more efficient.

What you will need in this role
  • 3+ years of experience working with modern data technologies and/or building data-first products.
  • Excellent written and verbal communication skills.
  • Proven ability to work in a collaborative, agile, and fast-paced environment, prioritize multiple tasks and projects, and efficiently handle the demands of a trading environment.
  • Proven ability to deliver rapid results within processes that span multiple stakeholders.
  • Strong technical problem-solving skills.
  • Extensive familiarity with SQL and Java or Python, with a proven ability to develop and deliver maintainable data transformations for production data pipelines.
  • Experience leveraging data modeling techniques and the ability to articulate the trade-offs of different approaches.
  • Experience with one or more data processing technologies (e.g. Flink, Spark, Polars, Dask).
  • Experience with multiple data storage technologies (e.g. S3, RDBMS, NoSQL, Delta/Iceberg, Cassandra, ClickHouse, Kafka) and knowledge of their associated trade-offs.
  • Experience with multiple data formats and serialization systems (e.g. Arrow, Parquet, Protobuf/gRPC, Avro, Thrift, JSON).
  • Experience managing data pipeline orchestration systems (e.g. Kubernetes, Argo Workflows, Airflow, Prefect, Dagster).
  • Proven experience managing the operational aspects of large data pipelines, such as backfilling datasets, rerunning batch jobs, and handling dead-letter queues.
  • Prior experience triaging data quality control processes and correcting data gaps and inaccuracies.

For more information about DRW's processing activities and our use of job applicants' data, please view our Privacy Notice. California residents, please review the California Privacy Notice for information about certain legal rights.
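To make the quality-control and dead-letter-queue responsibilities above concrete, here is a minimal stdlib-only sketch of an ingestion quality gate: validate incoming vendor records, quarantine failures for triage, and deduplicate on a key. The field names (`id`, `ts`, `price`) and the sample batch are invented for illustration and are not DRW's actual schema or tooling.

```python
from datetime import datetime

def quality_gate(records):
    """Split a batch into valid records and a dead-letter list."""
    good, dead_letter, seen = [], [], set()
    for rec in records:
        try:
            if rec["id"] in seen:
                continue  # silently drop an exact-key duplicate
            datetime.fromisoformat(rec["ts"])  # timestamp must parse
            if rec["price"] is None or rec["price"] < 0:
                raise ValueError("bad price")
        except (KeyError, ValueError):
            dead_letter.append(rec)  # quarantine for triage / later replay
            continue
        seen.add(rec["id"])
        good.append(rec)
    return good, dead_letter

batch = [
    {"id": 1, "ts": "2026-02-26T09:30:00", "price": 101.5},
    {"id": 1, "ts": "2026-02-26T09:30:00", "price": 101.5},  # duplicate key
    {"id": 2, "ts": "not-a-time", "price": 99.0},            # bad timestamp
]
good, dlq = quality_gate(batch)
print(len(good), len(dlq))  # 1 valid record, 1 dead-lettered
```

In a production pipeline the dead-letter list would typically be a durable queue or table, with the same validation expressed in the orchestration layer (Airflow, Dagster, etc.) rather than a bare function.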
Feb 26, 2026
Full time


  • Home
  • Contact
  • About Us
  • Terms & Conditions
  • Privacy
  • Employer
  • Post a Job
  • Search Resumes
  • Sign in
  • Job Seeker
  • Find Jobs
  • Create Resume
  • Sign in
  • Facebook
  • Twitter
  • Google Plus
  • LinkedIn
Parent and Partner sites: IT Job Board | Jobs Near Me | RightTalent.co.uk | Quantity Surveyor jobs | Building Surveyor jobs | Construction Recruitment | Talent Recruiter | Construction Job Board | Property jobs | myJobsnearme.com | Jobs near me
© 2008-2026 Jobsite Jobs | Designed by Web Design Agency