Snowflake Data Architect - £550 per day Inside IR35 - Hybrid
We are seeking an experienced Data Architect to design, build, and maintain scalable, secure, and high-performing data platforms. The ideal candidate will have strong expertise in Azure-based data solutions, Snowflake, and modern data engineering tools, and will play a key role in shaping our enterprise data architecture to support analytics, reporting, and advanced data use cases.
Key Responsibilities
Design and implement end-to-end data architectures using Azure cloud services
Architect and optimize data solutions on Snowflake for performance, scalability, and cost efficiency
Build and maintain data pipelines using Azure Data Factory (ADF)
Develop and manage transformation workflows using DBT
Design and support ETL/ELT processes for structured and semi-structured data
Develop data engineering solutions using Python for data processing, automation, and orchestration
Implement monitoring and observability for data systems using Prometheus
Define data models, schemas, and standards to ensure data consistency and quality
Collaborate with data engineers, analysts, and business stakeholders to translate requirements into technical solutions
Ensure data security, governance, and compliance with organizational and regulatory standards
Troubleshoot and optimize data pipelines and architectures for reliability and performance
Required Qualifications
Proven experience as a Data Architect or Senior Data Engineer
Strong hands-on experience with Microsoft Azure data services
Extensive experience with Snowflake data warehousing
Proficiency in Azure Data Factory (ADF) for data orchestration
Strong Python programming skills
Hands-on experience with DBT for data transformation and modeling
Solid understanding of ETL/ELT architecture and best practices
Experience with monitoring and observability tools such as Prometheus
Strong knowledge of data modeling, data warehousing concepts, and cloud architecture
Excellent problem-solving and communication skills
Preferred Qualifications
Experience with CI/CD for data pipelines
Familiarity with infrastructure-as-code tools
Experience working in Agile or DevOps environments
Knowledge of data governance, metadata management, and data quality frameworks
To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, the Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.