Deblock Group
City, London
Dec 13, 2025
Full-time
Overview
A strategic, high-impact, high-ownership role building the data foundation for our AI-first fintech and crypto platform. You'll build the foundational data platform for Deblock's AI-first transformation, consolidating our data ecosystem (PostgreSQL, Kafka, BigQuery, GCS) into a clean, governed, and ML-ready platform. You'll work closely with our CTO and ML team to enable AI features through robust data infrastructure.

Core Requirements
- Strong experience with GCP (BigQuery, Dataflow, Cloud Storage, Datastream, AlloyDB/Cloud SQL)
- Expert-level SQL
- Experience with dbt or similar tools for data modeling and testing
- Hands-on experience with streaming platforms (Kafka, Kafka Connect)
- Understanding of CDC tools (Debezium or similar)
- Experience building batch and real-time data pipelines
- Experience building dimensional models (fact/dimension tables) for analytics and ML features
- Experience implementing data governance: PII tagging, column-level security, access controls
- Experience with data quality monitoring, including automated checks and alerting
- Understanding of observability: data freshness monitoring, schema change detection, pipeline health dashboards
- Experience optimising BigQuery performance (partitioning, clustering, query optimisation)

Nice to Have
- Experience with Feature Store architecture and ML feature requirements
- Understanding of real-time vs batch feature serving patterns
- Prior work with financial services or regulated data environments
- Familiarity with the Vertex AI ecosystem
- Experience with Apache Beam/Dataflow transformations
- Background collaborating with ML/data science teams
- Knowledge of vector databases or semantic search concepts

Benefits
- Competitive salary + stock options
- Private dental + health insurance
- The best tech for your job
- 30 days of paid holiday (excl. bank holidays)
- Option to work 100% remotely or come to the office - your choice!
- Ability to work abroad for 4 months a year
- A leading position with huge impact, autonomy and ownership