Experience Level: Mid to Senior

About the job
Join Sopra Steria as a Big Data Engineer, where you will play a critical role in transforming financial services through innovative data solutions. You will be responsible for designing, developing, and implementing big data architectures to support our clients in achieving their business objectives. Collaborate with cross-functional teams to deliver high-quality data-driven insights that enhance decision-making processes.
Join Our Team
Doctolib is seeking a Senior Data Engineer - Analytics to become a vital part of our Analytics Engineering team. In this role, you will be instrumental in creating data products that yield actionable insights and enhance decision-making across Doctolib, contributing to the evolution of healthcare access. You will collaborate with a dynamic team to develop data pipelines and solutions that bolster our organization's capacity for data-driven decision-making and support our AI initiatives.

At Doctolib, our tech team is dedicated to crafting innovative products and features that significantly improve the day-to-day experiences of both care teams and patients. We operate in feature teams within an agile framework, fostering close collaboration with product, design, and business teams.

Your Key Responsibilities:
- Develop and maintain data pipelines using Python (Dagster) to align with Doctolib's AI strategy.
- Build and sustain data marts in BigQuery using SQL/Jinja and DBT.
- Create insightful dashboards for high-level reporting using Tableau.
- Engage with stakeholders to identify their data requirements and outline specifications.
- Ensure data integrity, security (GDPR compliance), and accessibility through continuous monitoring and optimization.

Our Technical Environment:
Our solutions are built on a fully cloud-native platform that supports web and mobile interfaces and multiple languages, and is tailored to the needs of each country and healthcare specialty. We modularize our platform to operate in a distributed architecture leveraging reusable components. Our technology stack includes Rails, TypeScript, Java, Python, Kotlin, Swift, and React Native. We ethically leverage AI across our products to empower patients and health professionals. Our data stack incorporates Kafka/Debezium for data ingestion, Dagster/DBT for orchestration, GCS/BigQuery for data warehousing, and Metabase/Tableau for business intelligence and reporting.
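To make the pipeline work described above concrete, here is a minimal sketch of a daily extract-transform-load step of the kind such a role involves. All names (`fetch_bookings`, `BookingRow`, `bookings_per_specialty`) and the sample data are invented for illustration; in the stack named in the posting this logic would live in Dagster assets and write to BigQuery rather than to in-memory Python structures.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BookingRow:
    booking_id: str
    specialty: str
    booked_on: date

def fetch_bookings(raw_events: list[dict]) -> list[BookingRow]:
    """Extract: parse raw event dicts into typed rows, skipping malformed ones."""
    rows = []
    for ev in raw_events:
        try:
            rows.append(BookingRow(ev["id"], ev["specialty"],
                                   date.fromisoformat(ev["booked_on"])))
        except (KeyError, ValueError):
            continue  # a production pipeline would route these to a dead-letter table
    return rows

def bookings_per_specialty(rows: list[BookingRow]) -> dict[str, int]:
    """Transform: aggregate rows into the kind of mart a dashboard reads."""
    counts: dict[str, int] = {}
    for row in rows:
        counts[row.specialty] = counts.get(row.specialty, 0) + 1
    return counts

# Simulated raw event feed; the last record is malformed and gets skipped.
raw = [
    {"id": "b1", "specialty": "dermatology", "booked_on": "2024-05-01"},
    {"id": "b2", "specialty": "dermatology", "booked_on": "2024-05-01"},
    {"id": "b3", "specialty": "cardiology", "booked_on": "2024-05-02"},
    {"id": "b4", "specialty": "cardiology"},  # missing booked_on
]
mart = bookings_per_specialty(fetch_bookings(raw))
print(mart)  # {'dermatology': 2, 'cardiology': 1}
```

The skip-and-continue handling of malformed events mirrors the posting's emphasis on data integrity: bad records are excluded from the mart rather than silently corrupting the aggregate.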
Are you ready to kickstart your career as a Data Engineer? Join Labelium, a leading digital marketing agency, as a Consultant Data Engineer Intern in Paris. In this role, you will have the opportunity to work alongside experienced data professionals, gaining hands-on experience in data analysis, engineering, and project management.

Your primary responsibilities will include supporting data-driven projects, assisting in the development of data pipelines, and contributing to the optimization of data workflows. This internship is an excellent opportunity to enhance your technical skills while working in a dynamic and innovative environment.
Position Overview:
Working with our esteemed client, a global leader in the beauty and cosmetics industry, you will play a pivotal role in the transformation of a worldwide project. Your primary responsibility will be to develop the Data Platform along with a suite of Data services that will be used by various teams within the organization. You will also contribute to the development of significant data use cases.

Your key responsibilities will include:
- Designing the architecture and implementing the solution.
- Defining and developing Data Models.
- Ensuring code quality and best practices.
- Acting as a DevOps engineer (implementing CI/CD pipelines and providing Level 3 support for developments).

Technical Environment:
- Google Cloud Platform (BigQuery, Cloud Run, Cloud Build)
- SQL
- Python
- DevOps tools (GitHub)
- API Development
- Terraform
- Agile Methodology
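As an illustration of the "code quality" responsibility above, here is a hedged sketch of a schema check of the kind a Data Platform team might run in a CI/CD pipeline before deploying a model change. The schema and sample rows are invented; on the stack listed, this would more likely be a BigQuery table validated via SQL tests than plain Python.

```python
# Hypothetical target schema for a product-sales data model.
SCHEMA = {"sku": str, "brand": str, "units_sold": int}

def violations(rows: list[dict]) -> list[str]:
    """Return human-readable schema violations instead of failing fast,
    so a CI job can report them all at once."""
    problems = []
    for i, row in enumerate(rows):
        for col, typ in SCHEMA.items():
            if col not in row:
                problems.append(f"row {i}: missing column '{col}'")
            elif not isinstance(row[col], typ):
                problems.append(f"row {i}: column '{col}' is not {typ.__name__}")
    return problems

rows = [
    {"sku": "A-100", "brand": "Lumiere", "units_sold": 42},
    {"sku": "A-101", "brand": "Lumiere", "units_sold": "12"},  # wrong type
]
print(violations(rows))  # ["row 1: column 'units_sold' is not int"]
```

Collecting all violations before failing, rather than raising on the first, is a common design choice for CI checks: it gives the author one complete report per run.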
Join Talan, a forward-thinking company dedicated to innovation and excellence in technology solutions. We are seeking a Data Engineer with substantial experience in Databricks to play a pivotal role in our engineering and development team. In this position, you will design, implement, and manage data systems that empower our clients to harness the full potential of their data.

Your responsibilities will include developing and maintaining ETL processes, optimizing data workflows, and ensuring data quality and accessibility. Collaborating closely with data scientists and other engineers, you will contribute to the development of robust data solutions that drive business insights and innovation.
Join Labelium as a Freelance CRM Consultant and leverage your expertise to enhance client relationships and optimize customer engagement strategies. This role is ideal for professionals seeking flexibility while making a significant impact in a dynamic environment.
Overview:
Are you a digital enthusiast with a knack for online research? Do you aspire to shape the future of AI-driven search technologies? If so, we invite you to join our team! At RWS, you will play a pivotal role in assessing and refining how our clients' search engines respond to everyday queries. Your expertise will help enhance AI data, improving the online search experience for users globally.

Your Profile:
- Naturally curious with a passion for AI advancements.
- Strong online research capabilities.
- Thrives in a dynamic work environment.
- Committed to maintaining high standards of quality and accuracy.
- Self-motivated with the ability to manage time effectively while working remotely.
- Proficient in critical thinking, with the ability to evaluate information's relevance and significance.

Quick Overview of the Flexible, Part-Time Position:
Language Proficiency: English and French
Location: France (Remote)
Schedule: Flexible, approximately 15 hours per week
Compensation: Up to 13 EUR/hour
Who Should Apply: This role is ideal for freelancers, students, stay-at-home parents, or anyone seeking flexible remote work to contribute to AI model improvement. No prior experience is necessary.

Role Description:
Join RWS Group as a Search Engine Evaluator and enhance search results. Your feedback on text, audio, images, and videos will refine AI data utilized by major search engines, social media platforms, streaming services, and more.
Exciting Opportunity at CAPFI Technology
Join CAPFI Technology and collaborate with a key player in the Data domain! Based in Paris and London, CAPFI Technology brings together our expertise in IT and financial markets. Our consultants co-develop strategic projects with our clients that blend innovation, IT, and financial challenges. We are currently seeking a Data Engineer specializing in Tableau / Power BI to work on Data Analytics and Data Platform management projects.

Your Responsibilities:
As a Data Engineer (Tableau / Power BI), you will be integrated into the core production teams, ensuring mastery and coherence of functional and application Data architectures. Your main responsibilities will include:

Tableau / Power BI Support and Expertise
- Understand and analyze DataViz needs across various functional domains.
- Design sustainable and tailored solutions.
- Promote new features and best practices to users.
- Coach new users by designing tailored training paths.
- Create MVP dashboards to showcase new Group features.

Industrialization and Integration of Solutions
- Ensure the integration of Tableau / Power BI within the client's information system.
- Automate and industrialize deployments.
- Deploy solutions in on-premise and, potentially, cloud-native environments.
- Provide level 3 support for production incidents.
- Advise IT and business teams on data usage and modeling.

Collaboration and Continuous Improvement
- Work closely with Product Owners, Offering Managers, Architects, and Tech Leads to design effective Data solutions.
Overview: We are seeking a talented Data Engineer to architect and develop our data pipelines from the ground up, managing petabytes of logs, events, and model traces, while establishing a seamless, reliable environment for production, testing, and research tasks.

About Us
At White Circle, we focus on AI Safety by creating a robust layer that ensures the safety, reliability, and optimization of AI systems. Our platform is driven by policies: straightforward natural-language rules that specify the acceptable behaviors of AI models. We automatically test, enforce, and enhance these policies at scale.
- We have raised $11M from leading investors, including founders and executives from OpenAI, Anthropic, HuggingFace, Mistral, DeepMind, Datadog, and Sentry.
- Our system handles over 100 million API calls every month.
- We develop and fine-tune our own Large Language Models (LLMs) to ensure they operate faster and more economically than any open-source or proprietary alternative.
- As a small, dedicated team, we value deep engagement with challenging problems, rapid deployment of your contributions, and the opportunity to significantly impact the construction of AI safety.

Your Responsibilities
- Establish and uphold a clean and stable data environment, allowing team members to access petabytes of traces, logs, and model outputs in their required formats without delays or manual extraction.
- Create and manage internal data APIs, SDKs, and tools that empower engineering, product, and research teams to explore, query, and utilize data independently of infrastructure concerns.
- Monitor and enhance data performance, optimizing table layouts and query plans to ensure smooth analytics and research workflows as data volumes increase.
- Oversee data access and governance by defining and enforcing permissions, access rules, and security protocols.

Desired Qualifications
- Proven experience building or scaling a modern data stack, such as Snowflake, ClickHouse, or event streaming, within a startup or similarly fast-paced environment.
- Strong proficiency in SQL and Python, with the ability to work effectively with large, complex datasets.
- Excellent communication skills, enabling you to collaborate directly with engineers and researchers. Fluency in English is required.
- Bonus: Experience with data visualization tools like Metabase, Tableau, or similar.
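The "petabytes without delays or manual extraction" requirement above comes down to streaming data in bounded batches rather than materializing it all at once. Here is a minimal, self-contained sketch of that pattern; the names (`iter_batches`, `traces`) are illustrative, and at real scale this would sit on top of ClickHouse/Snowflake query cursors or object-storage files rather than an in-memory generator.

```python
from itertools import islice
from typing import Iterable, Iterator

def iter_batches(records: Iterable[dict], size: int) -> Iterator[list[dict]]:
    """Yield successive lists of at most `size` records, holding only one
    batch in memory at a time."""
    it = iter(records)
    while batch := list(islice(it, size)):
        yield batch

# Simulated model-trace stream; a real source would be a query cursor.
traces = ({"trace_id": i, "ok": i % 3 != 0} for i in range(10))

batch_sizes = [len(b) for b in iter_batches(traces, 4)]
print(batch_sizes)  # [4, 4, 2]
```

Because `iter_batches` consumes a lazy iterator, memory stays proportional to the batch size, not the dataset, which is what lets the same code path serve both a ten-row test fixture and a production-scale trace table.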
Join VeepeeTech as a Site Reliability Engineer (SRE) and become an integral part of a dynamic and cross-functional SRE community while collaborating with a product-focused Data Platform team.

In your role, you will enhance the reliability, scalability, and operability of essential data services by leveraging SRE and DevOps methodologies. You will also play a vital role in knowledge sharing across various teams.

Our Data Platform is evolving into a cutting-edge lakehouse architecture hosted on VeepeeCloud (our proprietary on-premise platform), utilizing advanced technologies such as Trino, Iceberg, and object storage, with ambitious goals related to performance, cost efficiency, and platform stewardship. You will work in a distributed environment across France and Spain, alongside a talented team of 40-50 data professionals specializing in engineering, analytics, data science, and governance.

As a key contributor, you will ensure the reliability and scalability of our next-generation data platform while supporting the transition from public cloud to hybrid and on-premise architectures.
Join Alten as a Data Engineer specializing in Google Cloud Platform (GCP). In this role, you will be at the forefront of data architecture and analytics, utilizing cutting-edge tools and technologies to streamline data processes and enhance decision-making.

As part of our dynamic team, you will collaborate with cross-functional teams to design, develop, and maintain scalable data pipelines that support our business objectives. Your expertise will play a crucial role in transforming raw data into actionable insights.
We are seeking a talented and innovative Lead Data Architect to join our dynamic team at RTE1 in Paris. In this pivotal role, you will spearhead the development and implementation of advanced data architecture solutions, ensuring the integrity, quality, and security of our data systems.

Your expertise will guide our data strategy, supporting critical business decisions and ensuring our data infrastructure is scalable and robust. If you are passionate about data architecture and eager to lead a talented team, we want to hear from you!
Join Veepee as a Senior Data Scientist and play a pivotal role in enhancing our innovative recommender systems. Work with a talented team across Paris, Lyon, Barcelona, and Brussels to develop cutting-edge algorithms that deliver personalized sales recommendations to millions of users in real-time. Utilize large-scale behavioral and product data to create models that drive engagement and conversion, collaborating closely with ML and Data Engineers to bring your ideas to life.
The Graphing Experience team at Datadog is dedicated to creating intuitive tools that empower users to dive into their data and effectively communicate insights through engaging visualizations. As the Engineering Manager for this dynamic team, you will spearhead key product initiatives that drive Datadog's growth and success. Your role will involve outlining our technical vision and guiding the product roadmap while collaborating closely with product managers and engineers. You will play a pivotal role in team development, including onboarding new talent and fostering the growth of your engineers. Your team will be integral in unlocking new product use cases, enhancing the user experience for diverse personas, and designing advanced UIs for data querying needs. At Datadog, we highly value our office culture, which fosters relationships, creativity, and collaboration. We embrace a hybrid work model, allowing our employees to achieve a harmonious work-life balance that suits their needs.
Join Sopra Steria, a leading European technology and consulting company, as a Data Engineer specializing in PySpark within our Financial Services division. This role offers you the opportunity to work on innovative data solutions that drive business insights for our financial clients.

Your responsibilities will include designing and implementing data pipelines, optimizing data processing workflows, and collaborating with cross-functional teams to leverage data for strategic decision-making. We seek creative thinkers who can navigate complex data environments and provide valuable insights through advanced analytics.
Join our dynamic team at Sopra Steria as a Data Engineer specializing in PySpark within the Financial Services sector. In this role, you will leverage your technical expertise to design, develop, and optimize data processing pipelines, ensuring the efficient handling of large datasets. Your contributions will play a vital role in advancing our data strategies and driving impactful insights for our clients.
Join Polar Analytics: The Premier Data Platform for Consumer Brands
At Polar Analytics, we transform complex data into actionable insights, empowering brands to make informed decisions swiftly and efficiently. Our platform is powerful yet accessible, providing DTC brands with the analytics they need to achieve scalable profitability. Our mission is clear: we aim to accelerate the growth of independent DTC brands, enabling them to compete effectively and profitably.

Why Choose Polar Analytics?
- 4,000+ Brands and Counting: We've expanded to over 4,000 active merchants as of January 2025, with plans to reach 10,000+ within this year.
- Revolutionizing Analytics: Polar is redefining the Shopify analytics landscape through innovative data infrastructure designed for seamless orchestration, placing us at the leading edge of AI-powered commerce automation.
- Supported by Top-Tier Investors: We have raised $28.5M from prestigious investors like Frst, Point9, and Chalfen Ventures, recognized for identifying future industry leaders early.
- A Team of eCommerce & Data Innovators: Our diverse team comprises seasoned professionals from leading eCommerce SaaS platforms and Silicon Valley data powerhouses, all united by a common goal: to forge the next industry leader.

About Us
We are developing a cutting-edge data & AI operating system tailored for eCommerce: think of it as Datadog for retail. Our work involves processing and refining complex Shopify, advertising, and retention data, making it reliable and actionable for teams. We are seeking engineers who desire real ownership and impact.

Your Role
In this role, you will work on core systems including data pipelines, semantic layers, AI evaluation, and experimentation tools that are vital for thousands of merchants daily. This is not a conventional ticket-taking position. You will:
- Take ownership of challenges from design through to production.
- Solve complex infrastructure issues (handling petabytes of data, near real-time processing, and high reliability).
Role Overview
Nexton is looking for a Data Engineer focused on Business Intelligence and Visualization to join the team in Paris. This position centers on turning complex data into clear, useful insights that support strategic choices across the energy and environmental fields.

What You Will Do
- Develop and maintain data models and visualization tools to support business intelligence needs.
- Translate raw data into reports and dashboards that guide decision-making.
- Work closely with colleagues to strengthen data-driven processes and outcomes.

About Nexton
Nexton works at the intersection of energy and environmental innovation. The team values practical solutions that help organizations use data more effectively.
Sopra Steria is hiring a Data Engineer with strong Google Cloud Platform (GCP) skills to support projects in the financial services sector. This position is based in Paris.

Role overview
This role focuses on designing, building, and maintaining data pipelines that support financial services initiatives. Collaboration with cross-functional teams is central, as data must move efficiently across multiple platforms and systems.

Key responsibilities
- Work closely with teams to create and manage data pipelines using GCP tools.
- Ensure data flows smoothly and securely between platforms.
- Implement and optimize data solutions that support business goals in financial services.

What we are looking for
- Experience with Google Cloud Platform (GCP) in a data engineering context.
- Ability to design and maintain data pipelines.
- Interest in financial services and using data to drive business insights.
- Strong collaboration skills for working with diverse project teams.
We are seeking a talented AWS Data Engineer to join our dynamic team at Nexton. As a key player in our Telco Media Transport department, you will be responsible for designing and implementing data solutions that drive our business forward. Your expertise in AWS technologies will be crucial in building scalable data pipelines and ensuring data integrity.

In this role, you will collaborate with cross-functional teams to understand data requirements, optimize data workflows, and enhance our data architecture. If you are passionate about data engineering and thrive in a vibrant environment, we want to hear from you!
Join Sopra Steria as a Big Data Engineer, where you will play a critical role in transforming financial services through innovative data solutions. You will be responsible for designing, developing, and implementing big data architectures to support our clients in achieving their business objectives. Collaborate with cross-functional teams to deliver high-qu…
Join Our TeamDoctolib is seeking a Senior Data Engineer - Analytics to become a vital part of our Analytics Engineering team.In this role, you will be instrumental in creating data products that yield actionable insights and enhance decision-making across Doctolib, contributing to the evolution of healthcare access. You will collaborate with a dynamic team to develop data pipelines and solutions that bolster our organization's capacity for data-driven decision-making and support our AI initiatives.At Doctolib, our tech team is dedicated to crafting innovative products and features that significantly improve the day-to-day experiences of both care teams and patients. We operate in feature teams within an agile framework, fostering close collaboration with product, design, and business teams.Your Key Responsibilities:Develop and maintain data pipelines using Python (Dagster) to align with Doctolib's AI strategy.Build and sustain data marts in BigQuery using SQL/Jinja, DBT.Create insightful dashboards for high-level reporting utilizing Tableau.Engage with stakeholders to identify their data requirements and outline specifications.Ensure data integrity, security (compliance with GDPR), and accessibility through continuous monitoring and optimization.Our Technical Environment:Our solutions are built on a fully cloud-native platform that supports web and mobile interfaces, multiple languages, and is tailored to various country and healthcare specialty needs. We modularize our platform to operate in a distributed architecture leveraging reusable components.Our technology stack includes Rails, TypeScript, Java, Python, Kotlin, Swift, and React Native.We ethically leverage AI across our products to empower patients and health professionals. 
Learn more about our AI vision here and our first AI hackathon here!Our data stack incorporates Kafka/Debezium for data ingestion, Dagster/DBT for orchestration, GCS/BigQuery for data warehousing, and Metabase/Tableau for business intelligence and reporting.
Are you ready to kickstart your career as a Data Engineer? Join Labelium, a leading digital marketing agency, as a Consultant Data Engineer Intern in Paris. In this role, you will have the opportunity to work alongside experienced data professionals, gaining hands-on experience in data analysis, engineering, and project management.Your primary responsibilities will include supporting data-driven projects, assisting in the development of data pipelines, and contributing to the optimization of data workflows. This internship is an excellent opportunity to enhance your technical skills while working in a dynamic and innovative environment.
Position Overview:As part of our esteemed client who is a global leader in the beauty and cosmetics industry, you will play a pivotal role in the transformation of a worldwide project. Your primary responsibility will be to develop the Data Platform along with a suite of Data services that will be utilized by various teams within the organization. You will also contribute to the development of significant data use cases.Your key responsibilities will include:Designing the architecture and implementing the solution.Defining and developing Data Models.Ensuring code quality and best practices.Acting as a DevOps engineer (implementing CI/CD pipelines and providing Level 3 support for developments).Technical Environment:Google Cloud Platform (BigQuery, Cloud Run, Cloud Build).SQL.Python.DevOps tools (GitHub).API Development.Terraform.Agile Methodology.
Join Talan, a forward-thinking company dedicated to innovation and excellence in technology solutions. We are seeking a Data Engineer with substantial experience in Databricks to play a pivotal role in our engineering and development team. In this position, you will design, implement, and manage data systems that empower our clients to harness the full potential of their data.Your responsibilities will include developing and maintaining ETL processes, optimizing data workflows, and ensuring data quality and accessibility. Collaborating closely with data scientists and other engineers, you will contribute to the development of robust data solutions that drive business insights and innovation.
Join Labelium as a Freelance CRM Consultant and leverage your expertise to enhance client relationships and optimize customer engagement strategies. This role is ideal for professionals seeking flexibility while making a significant impact in a dynamic environment.
Overview:Are you a digital enthusiast with a knack for online research? Do you aspire to shape the future of AI-driven search technologies? If so, we invite you to join our team!At RWS, you will play a pivotal role in assessing and refining how our clients' search engines respond to everyday queries. Your expertise will help enhance AI data, improving the online search experience for users globally. Your Profile:- Naturally curious with a passion for AI advancements.- Strong online research capabilities.- Thrives in a dynamic work environment.- Committed to maintaining high standards of quality and accuracy.- Self-motivated with the ability to manage time effectively while working remotely.- Proficient in critical thinking, with the ability to evaluate information's relevance and significance.Quick Overview of the Flexible, Part-Time Position:Language Proficiency: English and FrenchLocation: France (Remote)Schedule: Flexible, approximately 15 hours per weekCompensation: Up to 13 EUR/hourWho Should Apply: This role is ideal for freelancers, students, stay-at-home parents, or anyone seeking flexible remote work to contribute to AI model improvement. No prior experience is necessary.Role Description: Join RWS Group as a Search Engine Evaluator and enhance search results. Your feedback on text, audio, images, and videos will refine AI data utilized by major search engines, social media platforms, streaming services, and more.
Exciting Opportunity at CAPFI TechnologyJoin CAPFI Technology and collaborate with a key player in the Data domain!Based in Paris and London, CAPFI Technology brings together our expertise in IT and financial markets. Our consultants co-develop strategic projects with our clients that blend innovation, IT, and financial challenges.We are currently seeking a Data Engineer specializing in Tableau / Power BI to work on Data Analytics and Data Platform management projects.Your Responsibilities As a Data Engineer (Tableau / Power BI), you will be integrated into the core production teams, ensuring mastery and coherence of functional and application Data architectures.Your main responsibilities will include:Tableau / Power BI Support and ExpertiseUnderstand and analyze DataViz needs across various functional domains.Design sustainable and tailored solutions.Promote new features and best practices to users.Coach new users by designing tailored training paths.Create MVP dashboards to showcase new Group features.Industrialization and Integration of SolutionsEnsure the integration of Tableau / Power BI within the client's information system.Automate and industrialize deployments.Deploy solutions in on-premise and potentially in cloud-native environments.Provide level 3 support for production incidents.Advise IT and business teams on data usage and modeling.Collaboration and Continuous ImprovementWork closely with Product Owners, Offering Managers, Architects, and Tech Leads to design effective Data solutions.
Overview: We are seeking a talented Data Engineer to architect and develop our data pipelines from the ground up—managing petabytes of logs, events, and model traces—while establishing a seamless, reliable environment for production, testing, and research tasks.About UsAt White Circle, we focus on AI Safety by creating a robust layer that ensures the safety, reliability, and optimization of AI systems. Our platform is driven by policies, which are straightforward natural-language rules that specify the acceptable behaviors of AI models. We automatically test, enforce, and enhance these policies at scale.We have successfully raised $11M from leading investors, including founders and executives from OpenAI, Anthropic, HuggingFace, Mistral, DeepMind, Datadog, and Sentry.Our system handles over 100 million API calls every month.We develop and fine-tune our own Large Language Models (LLMs) to ensure they operate faster and more economically than any open-source or proprietary alternatives.As a small, dedicated team, we value deep engagement with challenging problems, rapid deployment of your contributions, and the opportunity to significantly impact the construction of AI safety.Your ResponsibilitiesEstablish and uphold a clean and stable data environment, allowing team members to access petabytes of traces, logs, and model outputs in their required formats without delays or manual extraction.Create and manage internal data APIs, SDKs, and tools that empower engineering, product, and research teams to explore, query, and utilize data independently of infrastructure concerns.Monitor and enhance data performance, optimizing table layouts and query plans to ensure smooth analytics and research workflows as data volumes increase.Oversee data access and governance by defining and enforcing permissions, access rules, and security protocols.Desired QualificationsProven experience building or scaling a modern data stack—such as Snowflake, ClickHouse, or event streaming—within a 
startup or similarly fast-paced environment.Strong proficiency in SQL and Python, with the ability to work effectively with large, complex datasets.Excellent communication skills, enabling you to collaborate directly with engineers and researchers. Fluency in English is required.Bonus: Experience with data visualization tools like Metabase, Tableau, or similar.
Join VeepeeTech as a Site Reliability Engineer (SRE) and become an integral part of a dynamic and cross-functional SRE community while collaborating with a product-focused Data Platform team.In your role, you will enhance the reliability, scalability, and operability of essential data services by leveraging SRE and DevOps methodologies. You will also play a vital role in knowledge sharing across various teams.Our Data Platform is evolving into a cutting-edge lakehouse architecture hosted on VeepeeCloud (our proprietary on-premise platform), utilizing advanced technologies such as Trino, Iceberg, and object storage, with ambitious goals related to performance, cost efficiency, and platform stewardship.You will work in a distributed environment across France and Spain, alongside a talented team of 40-50 data professionals specializing in engineering, analytics, data science, and governance.As a key contributor, you will ensure the reliability and scalability of our next-generation data platform while supporting the transition from public cloud to hybrid and on-premise architectures.
Join Alten as a Data Engineer specializing in Google Cloud Platform (GCP). In this role, you will be at the forefront of data architecture and analytics, utilizing cutting-edge tools and technologies to streamline data processes and enhance decision-making.As part of our dynamic team, you will collaborate with cross-functional teams to design, develop, and maintain scalable data pipelines that support our business objectives. Your expertise will play a crucial role in transforming raw data into actionable insights.
We are seeking a talented and innovative Lead Data Architect to join our dynamic team at RTE1 in Paris. In this pivotal role, you will spearhead the development and implementation of advanced data architecture solutions, ensuring the integrity, quality, and security of our data systems.Your expertise will guide our data strategy, supporting critical business decisions and ensuring our data infrastructure is scalable and robust. If you are passionate about data architecture and eager to lead a talented team, we want to hear from you!
Join Veepee as a Senior Data Scientist and play a pivotal role in enhancing our innovative recommender systems. Work with a talented team across Paris, Lyon, Barcelona, and Brussels to develop cutting-edge algorithms that deliver personalized sales recommendations to millions of users in real-time. Utilize large-scale behavioral and product data to create models that drive engagement and conversion, collaborating closely with ML and Data Engineers to bring your ideas to life.
The Graphing Experience team at Datadog is dedicated to creating intuitive tools that empower users to dive into their data and effectively communicate insights through engaging visualizations. As the Engineering Manager for this dynamic team, you will spearhead key product initiatives that drive Datadog's growth and success. Your role will involve outlining our technical vision and guiding the product roadmap while collaborating closely with product managers and engineers. You will play a pivotal role in team development, including onboarding new talent and fostering the growth of your engineers. Your team will be integral in unlocking new product use cases, enhancing the user experience for diverse personas, and designing advanced UIs for data querying needs. At Datadog, we highly value our office culture, which fosters relationships, creativity, and collaboration. We embrace a hybrid work model, allowing our employees to achieve a harmonious work-life balance that suits their needs.
Join Sopra Steria, a leading European technology and consulting company, as a Data Engineer specializing in PySpark within our Financial Services division. This role offers you the opportunity to work on innovative data solutions that drive business insights for our financial clients. Your responsibilities will include designing and implementing data pipelines, optimizing data processing workflows, and collaborating with cross-functional teams to leverage data for strategic decision-making. We seek creative thinkers who can navigate complex data environments and provide valuable insights through advanced analytics.
Join our dynamic team at Sopra Steria as a Data Engineer specializing in PySpark within the Financial Services sector. In this role, you will leverage your technical expertise to design, develop, and optimize data processing pipelines, ensuring the efficient handling of large datasets. Your contributions will play a vital role in advancing our data strategies and driving impactful insights for our clients.
Join Polar Analytics: The Premier Data Platform for Consumer Brands

At Polar Analytics, we transform complex data into actionable insights, empowering brands to make informed decisions swiftly and efficiently. Our platform is powerful yet accessible, providing DTC brands with the analytics they need to achieve scalable profitability. Our mission is clear: we aim to accelerate the growth of independent DTC brands, enabling them to compete effectively and profitably.

Why Choose Polar Analytics?
- 4,000+ Brands and Counting: We've expanded to over 4,000 active merchants as of January 2025, with plans to reach 10,000+ within this year.
- Revolutionizing Analytics: Polar is redefining the Shopify analytics landscape through innovative data infrastructure designed for seamless orchestration, placing us at the leading edge of AI-powered commerce automation.
- Supported by Top-Tier Investors: We have successfully raised $28.5M from prestigious investors like Frst, Point9, and Chalfen Ventures, recognized for identifying future industry leaders early.
- A Team of eCommerce & Data Innovators: Our diverse team comprises seasoned professionals from leading eCommerce SaaS platforms and Silicon Valley data powerhouses, all united by a common goal: to forge the next industry leader.

About Us
We are developing a cutting-edge data & AI operating system tailored for eCommerce: think of it as Datadog for retail. Our work involves processing and refining complex Shopify, advertising, and retention data, making it reliable and actionable for teams. We are seeking engineers who desire real ownership and impact.

Your Role
In this role, you will be engaged with core systems including data pipelines, semantic layers, AI evaluation, and experimentation tools that are vital for thousands of merchants daily. This is not a conventional ticket-taking position.
You will:
- Take ownership of challenges from design through to production
- Solve complex infrastructure issues (handling petabytes of data, near real-time processing, and high reliability)
Role Overview
Nexton is looking for a Data Engineer focused on Business Intelligence and Visualization to join the team in Paris. This position centers on turning complex data into clear, useful insights that support strategic choices across the energy and environmental fields.

What You Will Do
- Develop and maintain data models and visualization tools to support business intelligence needs
- Translate raw data into reports and dashboards that guide decision-making
- Work closely with colleagues to strengthen data-driven processes and outcomes

About Nexton
Nexton works at the intersection of energy and environmental innovation. The team values practical solutions that help organizations use data more effectively.
Sopra Steria is hiring a Data Engineer with strong Google Cloud Platform (GCP) skills to support projects in the financial services sector. This position is based in Paris.

Role overview
This role focuses on designing, building, and maintaining data pipelines that support financial services initiatives. Collaboration with cross-functional teams is central, as data must move efficiently across multiple platforms and systems.

Key responsibilities
- Work closely with teams to create and manage data pipelines using GCP tools.
- Ensure data flows smoothly and securely between platforms.
- Implement and optimize data solutions that support business goals in financial services.

What we are looking for
- Experience with Google Cloud Platform (GCP) in a data engineering context.
- Ability to design and maintain data pipelines.
- Interest in financial services and using data to drive business insights.
- Strong collaboration skills for working with diverse project teams.
We are seeking a talented AWS Data Engineer to join our dynamic team at Nexton. As a key player in our Telco Media Transport department, you will be responsible for designing and implementing data solutions that drive our business forward. Your expertise in AWS technologies will be crucial in building scalable data pipelines and ensuring data integrity. In this role, you will collaborate with cross-functional teams to understand data requirements, optimize data workflows, and enhance our data architecture. If you are passionate about data engineering and thrive in a vibrant environment, we want to hear from you!