Experience Level
Senior
Qualifications
Proven experience as a Data Engineer or in a similar role
Expertise in SQL, Python, and data modeling
Experience with big data technologies such as Hadoop, Spark, or similar
Strong understanding of ETL processes and data warehousing
Excellent problem-solving skills and attention to detail
Ability to work collaboratively in a team environment
About the job
hmgroup needs a Senior Data Engineer based in London to influence how data is used across the company. This position centers on building and maintaining systems that move and organize large amounts of data, helping teams make informed decisions.
Main responsibilities
Design and manage data pipelines that process significant data volumes
Build architecture to ensure reliable data processing and analytics
Convert raw data into insights that support teams throughout hmgroup
About hmgroup
hmgroup is a forward-thinking organization committed to leveraging technology for innovative solutions. We pride ourselves on our dynamic work culture and our commitment to professional development. Join us to be a part of a team that values creativity, collaboration, and continuous improvement.
Full-time|On-site|City of London, England, United Kingdom
About Us
At Ki Insurance, we are at the forefront of innovation in the insurance industry. With a diverse range of clients from space shuttles to world tours, we ensure coverage for extraordinary ventures. Our mission is to digitally disrupt a 335-year-old market by leveraging advanced technologies. Partnering with Google and UCL, we have developed a cutting-edge platform that utilizes algorithms, machine learning, and large language models to provide insurance brokers with quotes in a matter of seconds, rather than days. As the largest global algorithmic insurance carrier and the fastest-growing syndicate in the Lloyd's of London market, Ki has made history by achieving $100 million in profit within just three years. Our teams, comprised of individuals from various backgrounds, collaborate in an agile and cross-functional environment to enhance customer experience and challenge the status quo.
Your Role
In this pivotal role, you will support our middle and back-office operations, crucial for our continued success. As part of a multi-disciplinary team, you will collaborate with software and data engineers, full-stack developers, platform operations experts, algorithm researchers, and data scientists to deliver impactful solutions. Your primary focus will be on designing and developing sophisticated data processing modules and reporting using BigQuery and Tableau.
Key Responsibilities
Collaborate with business teams in finance and actuarial science, along with data scientists and engineers, to design, build, optimize, and maintain production-grade data pipelines and reporting from our internal Data Warehouse solution based on GCP/BigQuery.
Engage with various stakeholders to identify opportunities for leveraging new internal and external data sources.
Partner with EY and IBM to ensure the robustness of data model design and engineering, supporting our ambitious growth and scalability plans.
Oversee business-as-usual (BAU) management of data models, reporting, and integration pipelines.
Establish frameworks, infrastructure, and systems for effective management and governance of Ki's data assets.
Create detailed documentation to facilitate ongoing BAU support and maintenance of data structures, schemas, and reporting.
Collaborate with the broader engineering community to enhance our data and MLOps capabilities.
Ensure compliance with internal and external data quality and governance standards.
Monitor and resolve data pipeline issues to maintain reliability and accuracy.
Valtech partners with well-known brands to design and deliver customer experiences. The company emphasizes growth, international opportunities, and a culture shaped by clear values.
The role
The Conversational Analytics Engineer position centers on developing digital solutions that improve how customers interact with businesses. This role involves exploring new approaches, questioning established methods, and helping to create customer experiences that can influence entire industries.
What to expect
Work on projects that focus on conversational analytics and digital innovation
Contribute to solutions that aim to improve customer engagement
Participate in a team culture that values learning and professional growth
Location
This position is based in London.
Join Devoteam as a Lead Consultant specializing in Google Cloud Platform (GCP) and Data & AI/ML pre-sales architecture. In this pivotal role, you will leverage your expertise to drive innovative solutions that empower our clients to harness the power of data and artificial intelligence. Your responsibilities will include collaborating with sales teams to identify client needs, designing tailored solutions, and presenting compelling proposals that highlight our capabilities in GCP and AI/ML technologies.
Join Pace Consulting as a GCP Architect, where you will lead the design and implementation of innovative cloud solutions on the Google Cloud Platform. You will collaborate with cross-functional teams to drive cloud transformation initiatives, ensuring that our clients achieve their business objectives efficiently and effectively.
Lead Data Engineer - Google Cloud Platform (GCP)
Location: London (Hybrid) | Department: Technology & Engineering | Employment Type: Permanent
Transform Financial Services with Innovative Data Solutions
About the Role
We are seeking a skilled Lead Data Engineer specializing in Google Cloud Platform to enhance our dynamic Technology & Engineering team based in London. In this pivotal role, you will collaborate with top-tier financial institutions on groundbreaking transformation initiatives, leveraging your extensive knowledge of GCP to develop resilient, scalable data architectures. This opportunity is perfect for engineers who are passionate about innovation, eager to tackle challenges head-on, and thrive in collaborative, agile environments.
Key Responsibilities
Architect and implement robust, scalable data pipelines utilizing GCP offerings such as BigQuery, Dataflow, and Dataproc.
Engage closely with clients to ascertain business needs and design tailored data solutions.
Implement engineering best practices throughout the software development lifecycle (SDLC), including CI/CD, testing, and version control using Git.
Lead the development of well-tested, fault-tolerant data engineering solutions.
Mentor and support junior engineers, fostering a culture of knowledge sharing within the team.
Desired Qualifications
Extensive hands-on experience with GCP, including BigQuery, Pub/Sub, Cloud Composer, and IAM.
Proficient in Python, SQL, and PySpark, with a solid background in data lakes, warehousing, and data modeling.
Strong understanding of DevOps methodologies, containerization, and CI/CD tools such as Jenkins or GitHub Actions.
A proven track record of delivering data engineering solutions in agile settings.
Excellent communication skills, capable of articulating technical concepts to non-technical audiences.
Preferred Qualifications
Familiarity with real-time streaming technologies like Kafka or Spark Streaming.
Experience in MLOps and maintaining machine learning models in production.
Knowledge of data visualization tools such as Power BI, Qlik, or Tableau.
Experience developing applications on Vertex AI is a plus.
Understanding of SQL and NoSQL databases, with the ability to evaluate ETL vs ELT trade-offs.
Why Capco?
Drive impactful technology solutions for leading financial institutions.
Be part of a collaborative culture that values innovation and personal growth.
Enjoy a hybrid work environment that promotes work-life balance.
Join Capco as a Senior Data Engineer specializing in Google Cloud Platform (GCP) and help shape the future of financial services through innovative cloud-based data solutions. In this hybrid role based in London, you'll collaborate with Tier 1 financial institutions, leveraging your expertise to design and implement robust, scalable data architectures. This position is perfect for engineers who excel in fast-paced environments and are passionate about data-driven transformation.
Join Trainline as a Data Engineer and become an integral part of our dynamic data team. In this role, you will leverage cutting-edge technologies to build and maintain robust data infrastructure that supports our mission of providing seamless travel solutions. Your expertise will contribute to optimizing data pipelines, ensuring data quality, and delivering actionable insights that drive strategic decisions.
Full-time|On-site|London, Greater London, United Kingdom
Role Overview
We are looking for a talented Data Engineer to contribute significantly to a vital client initiative. This role will involve the design, construction, and optimization of data pipelines within a cloud-centric data and analytics ecosystem. The ideal candidate will possess a solid foundation in data engineering principles, a commitment to delivering reliable solutions, and the capability to work collaboratively with both technical and business teams.
Key Responsibilities
Data Engineering & Pipeline Development
Design, build, and sustain scalable data pipelines for both batch and streaming workloads.
Develop robust ingestion, transformation, and curation processes using contemporary data engineering tools and methodologies.
Create and maintain data models, curated datasets, and integration layers to facilitate analytics and reporting.
Enhance data processing for optimum performance, reliability, and cost-effectiveness.
Data Quality, Governance & Security
Establish data quality checks, validation processes, and automated monitoring.
Ensure comprehensive and accurate data lineage, metadata management, and documentation.
Implement necessary security measures, including role-based access controls, data masking, and protection of sensitive information.
Cloud & Platform Engineering
Utilize cloud-native services for storage, compute, and workflow orchestration.
Oversee pipeline and platform performance, identifying and resolving any issues or bottlenecks.
Assist in configuring environments, standardizing processes, and implementing best practices.
DevOps, Automation & Observability
Leverage CI/CD practices to deploy code, configurations, and data assets across different environments.
Maintain version control, engage in code reviews, and adhere to release processes.
Construct monitoring, alerting, and logging systems to support high-quality operations and quick issue resolution.
Collaboration & Delivery
Work closely with technical and business stakeholders to ensure alignment and successful project delivery.
Note: This position requires 4 days of in-office work in Central London and 1 day of remote work.
Key Responsibilities
Develop features, resolve issues, and perform daily monitoring on the cloud delta lakehouse data platform.
Collaborate within an agile technical team, actively engaging in all ceremonies, including daily stand-ups and backlog refinement sessions.
Make architectural decisions, mentor junior engineers, and contribute to the establishment of engineering standards, documentation, and reusable frameworks.
Experience Required
Minimum of 10 years of overall experience in data engineering.
At least 5 years of experience in a data warehouse solution landscape.
Proven experience leading complex data initiatives and making architectural decisions.
Familiarity with agile delivery methodologies, preferably SCRUM or Kanban.
Ability to take ownership of data issues, conduct investigations and root cause analysis, and implement fixes through code and data modifications.
Understanding of data warehouse testing strategies and quality assurance processes.
Proficiency in git branching strategies and code management.
Solid understanding of technical release management processes in Azure DevOps.
Strong communication skills, capable of driving alignment, collaboration, and efficiency within teams.
Join our team as a Senior Data Engineer specializing in AWS!
Location: London (Hybrid) | Practice Area: Technology & Engineering | Type: Permanent
About the Role
We are seeking a talented Senior Data Engineer with expertise in AWS to be a key player in our dynamic team of engineering professionals. You will spearhead the design, construction, and deployment of innovative cloud-based data pipelines that drive transformative outcomes for Tier 1 financial services clients. Your contributions will significantly enhance data ingestion, transformation, and delivery at scale across the financial services landscape.
Your Responsibilities
Design and implement end-to-end data pipelines using AWS-native tools and cutting-edge data architectures.
Collaborate closely with clients to gather requirements, craft tailored solutions, and deploy production-ready systems.
Utilize AWS Well-Architected Principles to ensure your solutions are scalable, secure, and resilient.
Lead the development of robust, thoroughly tested, and fault-tolerant data engineering solutions.
Mentor junior engineers, fostering a culture of knowledge sharing and continuous learning within the team.
What We Seek
Join Us in Revolutionizing Compensation
At Ravio, we recognize that fair compensation is crucial to both individual success and organizational growth. With a mission to transform how companies manage employee pay, we leverage advanced data solutions to enable our clients to attract and retain top talent. Our innovative real-time data platform redefines compensation management, offering insights into salary, equity, and benefits, ensuring everyone receives their rightful earnings regardless of their background.
As a rapidly growing company, Ravio has positioned itself as a leader in Europe with over 1,200 clients, and we are on a journey to become the global authority in compensation data and tools. Joining our ambitious team provides an unparalleled opportunity to contribute to a product that is shaping the future of pay.
We are currently seeking a Senior Data Engineer to enhance our data infrastructure, drive innovation, and contribute to our mission. If you’re excited about the prospect of scaling a startup into a global powerhouse, we want to hear from you!
Join the leading gift experience provider in the UK, Buyagift and Red Letter Days, where we transform special moments into unforgettable memories! With a diverse selection of over 4000 experiences ranging from spa days to thrilling skydives, we excel in delivering the gift of surprise. As part of the esteemed Moonpig Group, our vision is to establish ourselves as the most reliable platform within the UK gift experience sector. Our mission centers on spreading joy through thoughtful, enriching experiences that resonate with our customers. People are integral to our operations. Since becoming a part of the Moonpig Group in 2022, we have thrived on strong values, innovative ideas, and a unified passion for making a genuine impact. Here, you will have the opportunity to create unforgettable experiences and foster meaningful connections.
Position: Data Engineering Manager | London – Hybrid (2 days in office) | Competitive Salary + Benefits
Role Overview
We are seeking a talented Data Engineering Manager to spearhead our Data Engineering team at Buyagift & Red Letter Days, a proud member of the Experience More group. This pivotal role involves utilizing data to enhance our business understanding and improve customer service. You will lead a team of data engineers dedicated to developing and maintaining a robust, scalable, and secure data infrastructure, ensuring timely access to the right data for the right individuals. Your responsibilities will blend team leadership with hands-on technical oversight, guiding our data strategy, data models, and architecture while enhancing data quality and governance. Collaborating closely with business stakeholders, you will design and implement structured, reliable data that supports reporting, analytics, and informed, data-driven decisions.
About XYZ Reality
XYZ Reality is an innovative, award-winning Series-A startup on the cusp of our next funding round. Our goal is to enhance our platform by improving its features, performance, and scalability, all while transforming the construction industry.
As a dynamic multi-disciplinary organization, we operate across various fields such as cloud development, data governance, data processing pipelines, electronics, embedded software/hardware, mechanical design/manufacturing, AI & computer vision, and data science, all contributing to the efficacy of our Building Information Modeling (BIM) Platform.
To help propel our mission, we are in search of a Senior Data Engineer who possesses extensive experience in data modeling, database management, and data pipeline development. The ideal candidate will play a pivotal role in maintaining our existing tech stack and developing new features with an emphasis on performance and scalability. Collaboration with our API/backend development and data pipeline teams will be crucial to create robust and efficient solutions.
Join our dynamic team at Jane Street as a Data Centre Engineer in London! In this role, you will be pivotal in ensuring the smooth operation, maintenance, and optimization of our data centre facilities. You will work closely with various teams to implement innovative infrastructure solutions that enhance performance and reliability.
Role Overview
Utility Warehouse is looking for a Data Engineering Manager to shape and execute its data strategy in London. This role leads a team of data engineers focused on building and maintaining data pipelines that power analytics and business intelligence across the company.
What You Will Do
Manage and mentor a team of data engineers, supporting their growth and performance
Design and implement data pipelines to meet analytics and reporting needs
Work with cross-functional partners to understand data requirements and deliver solutions
Guide the team through technical challenges in data architecture and engineering
Help turn data into actionable insights that support business growth
What We’re Looking For
Experience leading data engineering teams
Strong background in data architecture and pipeline development
Ability to collaborate across teams and communicate technical concepts clearly
Interest in using data to support decision-making and business outcomes
This position is based in London.
blueinnrecruitment is excited to announce an opportunity for a Temporary Works Coordinator (TWC) to join a prestigious Civil Engineering firm engaged in the Construction Industry. This is a permanent position, and the successful candidate will play a pivotal role in the landmark HS2 project located in Euston. The ideal TWC candidate will be responsible for overseeing all facets of design, including coordination, compliance, submission, and management of engineering and assurance processes, alongside maintaining and updating the Temporary Works Procedure (WI300). Previous experience in similar roles within the construction sector is essential.
Key Responsibilities
The responsibilities of the TWC encompass, but are not limited to:
Acting as the primary liaison between Temporary Works designers and the site team.
Coordinating all Temporary Works activities.
Designating Temporary Works Supervisors as required.
Formulating procedures for the management of Temporary Works and ensuring their implementation on-site.
Assigning responsibilities to relevant parties, including Designated Individuals and the Construction team.
Preparing and ensuring the adequacy of the design brief in alignment with site requirements.
Overseeing satisfactory Temporary Works design execution and ensuring design checks are conducted and approved.
Delegating design checks to ensure independence where necessary.
Making Temporary Works design available to all interested parties, including the CDM Coordinator.
Maintaining comprehensive records related to the final design of Temporary Works, including drawings and calculations.
Providing constructors with detailed design information, limitations, and guidance notes, along with specific method statements.
Ensuring implementation of the Temporary Works design according to specifications and drawings.
Reviewing actual conditions against assumed conditions in the Temporary Works design.
About YouLend
YouLend is an innovative and rapidly expanding FinTech company, recognized as the leading embedded financing platform for top-tier e-commerce platforms, technology firms, and Payment Service Providers. Our cutting-edge software empowers partners to enhance their offerings by providing customized financing solutions to their merchants under their own branding, all while mitigating capital risk. Backed by EQT, a prominent Private Equity firm, YouLend has achieved remarkable growth, exceeding 100% year-over-year since 2020. Based in London, we also operate across various European countries and the United States, serving renowned partners such as eBay, Amazon, Just Eat, Shopify, and Stripe.
Position Overview
We are on the lookout for a Data Engineer to become an integral part of our expanding Data Engineering & Platform team. This role is pivotal, bridging infrastructure, DevOps, and advanced data tools, with a primary focus on facilitating rapid, secure, and scalable analytics. You will play a crucial role in constructing and scaling a premier data platform that supports a wide array of functions, including dashboards, experimentation, machine learning, and compliance.
Key Responsibilities
Develop and oversee the infrastructure for our data platform, utilizing technologies such as AWS, Snowflake, dbt, and Airflow.
Design and execute CI/CD pipelines for dbt and various data workflows.
Automate data platform operations using Python and infrastructure-as-code tools like Pulumi and Terraform.
Collaborate with analytics, machine learning, product, and engineering teams to enhance data solutions.
Maintain data quality, lineage, and governance through rigorous testing and monitoring.
Utilize cost observability tools to promote efficient platform usage.
Qualifications
The ideal candidate will possess the following qualifications:
Demonstrable experience with cloud-based data platforms such as Snowflake, Redshift, or BigQuery.
Strong proficiency in Python and SQL for automation and analytics purposes.
Familiarity with CI/CD processes, particularly in dbt or similar data platforms.
Practical experience with Infrastructure as Code (IaC) tools including Pulumi, Terraform, or CloudFormation.
Solid understanding of orchestration tools like Airflow.
Exceptional communication skills with a history of cross-functional collaboration.
Desirable Skills
Experience with AWS services (S3, Lambda, MWAA) and Azure DevOps.
Familiarity with monitoring tools such as DataDog.
Why Choose YouLend?
Recognized as one of the “Best Places to Work in 2024”, YouLend offers a dynamic workplace environment.
About Us
At Intropic, we are at the forefront of financial intelligence, merging extensive market knowledge with cutting-edge AI technology. Established in the heart of London’s Canary Wharf, our mission is to convert intricate data into clear insights that drive actionable results. Our work culture is defined by principles of truth-seeking, speed, and accountability, which guide our daily collaboration and innovation. We prioritize fast-paced learning and uphold the highest standards of integrity and impact. Curiosity is central to our ethos: if you are motivated by challenges, inspired by innovation, and eager to enhance your intelligence alongside a team of brilliant minds, Intropic is the perfect place for your ideas to thrive.
Who We Are Looking For
We are in search of a dynamic and technically proficient Market Data Engineer who excels in constructing resilient, low-latency systems. If you possess a passion for financial markets and expertise in developing real-time data pipelines, we would be excited to connect with you.