Peregrine
We are Data Services. Our mission is to unlock the value of data by delivering high-quality, reliable, and secure data services that are accessible, understandable, and actionable. We continuously evolve our offerings, leveraging modern cloud-based technologies and fostering strong partnerships to help our colleagues in the Bank navigate the complexities of a data-driven world and achieve their strategic objectives.

Active SC Clearance

Job Description

The world of data in central banking is evolving rapidly. With the rise of detailed data collection in financial regulation and the swift advancement of cloud-native data technologies, the demand for visionary data engineers is growing. We're seeking a senior Data Engineer to join our Data Engineering team and play a pivotal role in shaping the Bank's strategic cloud-first data platform.

As a senior member of the team, you will play a key role in designing and delivering robust, scalable data solutions that support the Bank's core responsibilities around monetary policy, financial stability, and regulatory supervision. You'll contribute to technical design decisions, mentor engineers, and collaborate across teams to ensure our data infrastructure continues to evolve and meet future demands.

Role Responsibilities

- Lead the design, development, and deployment of scalable, secure, and cost-effective distributed data solutions using Azure services (e.g., Azure Databricks, Azure Data Lake Storage, Azure Data Factory).
- Architect and implement advanced data pipelines using Databricks, Delta Lake, Python, and Spark, ensuring performance, reliability, and maintainability across cloud and on-prem environments.
- Champion data quality, governance, and observability, ensuring data is accurate, timely, and fit for purpose for analytics, BI, and operational use cases.
- Drive the modernization of legacy systems, leading the migration of data infrastructure to Azure with minimal disruption and long-term scalability.
- Act as a technical authority on Azure-native data engineering, guiding best practices and setting standards across the team.
- Mentor and coach junior and mid-level engineers, fostering a culture of continuous learning, innovation, and technical excellence.
- Collaborate with architects, analysts, and stakeholders to align data engineering efforts with strategic business goals and the enterprise data strategy.
- Evaluate and introduce emerging technologies, tools, and methodologies to enhance the Bank's data capabilities.
- Own the end-to-end delivery of complex data solutions, from requirements gathering to production deployment and support.
- Contribute to the development of reusable frameworks, templates, and patterns to accelerate delivery and ensure consistency across projects.

Minimum Criteria

- Extensive experience with Azure services, including Azure Databricks, Azure Data Lake Storage, and Azure Data Factory.
- Advanced proficiency in SQL, Python, and Spark (PySpark), with a strong focus on performance optimization and distributed processing.
- Proven experience in CI/CD practices using industry-standard tools (e.g., GitHub Actions, Azure DevOps).
- Strong understanding of data architecture principles and cloud-native design patterns.

Essential Criteria

- Demonstrated ability to lead technical delivery, mentor engineering teams, and collaborate with stakeholders to ensure alignment between data solutions and business strategy.
- Proficiency in Linux/Unix environments and shell scripting.
- Deep understanding of source control, testing strategies, and agile development practices.
- Self-motivated, with a strategic mindset and a passion for driving innovation in data engineering.

Desirable Criteria

- Experience delivering data pipelines on Hortonworks/Cloudera on-prem and leading cloud migration initiatives.
- Familiarity with Apache Airflow, data modelling, and metadata management.
- Experience influencing enterprise data strategy and contributing to architectural governance.
Peregrine
City, London
The Role

We are transforming our Information Security program from a compliance-based checklist to a dynamic, risk-based operation. We are looking for a Senior Policy Administrator to lead the modernization of our governance framework. This is not a clerical role: you will not just be formatting Word documents. You will be a strategic partner to our Security Architecture and Engineering teams, translating complex technical controls (Cloud Security, Identity, Zero Trust) into clear, enforceable standards. You will serve as the bridge between "what the framework says" (NIST/TPN) and "what the architecture does."

Key Responsibilities

1. Governance Framework Architecture
- Build the Engine: Design and maintain the comprehensive hierarchy of Information Security documents (Policy → Standard → Procedure → Guideline). Ensure the framework is scalable, searchable, and mapped to NIST CSF 2.0 and ISO 27001 controls.
- Lifecycle Management: Move beyond "annual reviews." Implement a continuous review cycle triggered by architectural changes or emerging threats, ensuring our standards never drift from reality.

2. Security Architecture Collaboration (Critical)
- Technical Translation: Work side by side with Principal Security Architects to extract technical specifications (e.g., encryption algorithms, IAM protocols, cloud hardening baselines) and codify them into formal Security Standards.
- Reality Checks: Challenge the status quo. If a proposed policy cannot be technically enforced by the Architecture team, you are responsible for flagging the gap and negotiating a realistic control or a formal risk exception.
- Baseline Management: Assist Engineering in defining and documenting "Golden Image" and secure configuration baselines (CIS Benchmarks) that underpin the broader policy statements.

3. LogicGate & Tooling Administration
- Platform Architect: Serve as the primary architect for our LogicGate Risk Cloud Policy Module. You will design the metadata schema, automated workflows, and approval routing logic.
- Automated Assurance: Configure the tool to link Policies directly to Risks and Controls. When a Standard is updated, the tool should automatically flag related Risks for re-evaluation.

4. Compliance & TPN Alignment
- TPN "Gold Shield": Ensure all policies meet the strict physical and digital security requirements of the Trusted Partner Network (TPN). You will be the authority on whether a policy change jeopardizes our "Gold Shield" status.
- Audit Defense: Maintain a "state of readiness" where policies are tagged with evidence requirements, allowing for rapid export during client or regulatory audits.

Qualifications

Required Experience:
- Experience: 5-8+ years in Information Security, GRC, or Technical Writing in a highly regulated technical environment.
- Frameworks: Expert-level knowledge of NIST CSF 2.0, ISO 27001, and NIST 800-53. Familiarity with TPN (MPA) or SOC 2 is highly preferred.
- Technical Fluency: You do not need to be a coder, but you must understand core security concepts (e.g., SAML, container security, network segmentation) well enough to debate standards with Engineers.

Skills & Competencies:
- LogicGate / GRC Tools: Proven experience configuring and managing enterprise GRC platforms (LogicGate, ServiceNow, Archer, OneTrust).
- Strategic Autonomy: Ability to manage the entire document lifecycle without micromanagement. You can sit in an Architecture Review Board meeting and identify policy impacts in real time.
- Communication: Exceptional written communication skills, with the ability to strip away "legalese" and write policies that developers can actually read and follow.

Nice-to-Have:
- Certifications: CISA, CRISC, CISM, or CISSP.
- Experience in the Video Game, Media, or Software Development industries.

Why This Role?

You will be the "Legislator" of our security state.
Instead of chasing signatures, you will be defining the rules of the road for a global creative organization. If you are tired of "paper compliance" and want to build a governance framework that actually improves security posture, this is the role for you.