Join our dynamic team as a Databricks Operations Engineer, where you will play a key role in supporting our Azure-based Data Engineering platform, powered by Databricks. This position offers an exciting blend of DevOps practices, system reliability, and security auditing to ensure our data infrastructure is robust, secure, and scalable.
Your contributions will be vital in automating workflows, monitoring system health, and implementing security best practices throughout our data ecosystem.
T-Systems Information and Communication Technology India Private Limited
Full-time|On-site|Bengaluru
This position is with a client of Weekday. We are in search of an Analytics Manager to become a key player in our innovative team based in Bengaluru. This crucial role involves harnessing data-driven insights to influence strategic initiatives and improve our service offerings, particularly in last-mile deliveries and comprehensive supply chain solutions. If you have a passion for analytics and excel in a vibrant, fast-paced environment, we invite you to apply!

Key Responsibilities:
- Analyze extensive datasets to uncover insights that foster business and operational advancements.
- Design and uphold data models, dashboards, and reports for essential stakeholders.
- Employ Python and SQL for sophisticated data analysis, automation, and reporting.
- Engage with product, operations, and business teams to refine decision-making processes.
- Identify key metrics and trends to boost last-mile delivery and supply chain efficiency.
- Facilitate A/B testing, forecasting, and predictive analytics for ongoing optimization.
- Guarantee data integrity and accuracy when managing large-scale datasets.

What We Seek:
- 5+ years of experience in analytics, data science, or a related discipline.
- Expertise in Python and SQL for data manipulation, modeling, and automation.
- Solid understanding of product analytics and data visualization tools (e.g., Tableau, Power BI, Looker).
- Background in logistics, supply chain, or e-commerce data is advantageous.
- Exceptional problem-solving abilities and capacity to convert data into actionable insights.
- Strong communication skills for presenting findings to both technical and non-technical audiences.
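The responsibilities above mention A/B testing for delivery optimization. As an illustrative sketch only (not the client's actual methodology, and with purely hypothetical numbers), a two-proportion z-test is one common way to compare an on-time delivery rate under two routing policies:

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z statistic for an A/B comparison."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: on-time deliveries under control (A) vs. new routing (B).
z = two_proportion_z(success_a=4_120, n_a=5_000, success_b=4_310, n_b=5_000)
print(round(z, 2))
```

A |z| above roughly 1.96 would indicate significance at the 5% level; in practice a library such as statsmodels would typically be used rather than a hand-rolled formula.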
Join Amol Technologies as a Senior DevOps Manager, where you will lead our efforts in automating and streamlining our operations and processes. You will be responsible for ensuring smooth and efficient deployment processes, maintaining cloud infrastructure, and fostering collaboration between development and operations teams.
We are seeking a highly skilled Senior Platform / DevOps Engineer with expertise in Real-time Media, WebRTC, Edge Computing, and Cloud Technologies. Join our dynamic team in Bengaluru as we develop a robust LiveKit-like real-time communications platform designed to scale to millions of simultaneous calls. This position offers an exciting opportunity to work with cutting-edge technologies while ensuring ultra-low latency and cloud reliability.

About the Role:
In this hands-on role, you will take ownership of production systems, focusing on performance and resilience. We are particularly interested in candidates with experience in scaling real-time and streaming infrastructures.

Key Responsibilities:
- Ensure the reliability and performance of signaling, SFU/media nodes, TURN, routing, failover, and capacity planning.
- Build and manage multi-region Kubernetes platforms with secure networking and zero-downtime deployments.
- Design edge and cloud architecture including PoPs, global routing, failover, autoscaling, and disaster recovery.
- Implement SLOs/SLIs, incident response, postmortems, and maintain operational excellence.
- Develop strong observability practices, including metrics, logs, tracing, and real-time QoE/latency metrics.

Preferred Qualifications:
- Proven experience with Kubernetes at scale (multi-cluster/multi-region).
- Strong foundation in Linux and networking fundamentals (UDP/TCP, NAT, conntrack, DNS, load balancing).

Nice to Have:
- Experience in WebRTC/RTC operations (ICE, STUN/TURN, SFU scaling, packet loss/jitter tuning).
- Knowledge of Edge/PoP and traffic management (global routing, Anycast/DNS strategies).
- Familiarity with cost optimization for bandwidth-heavy workloads.
- Previous experience operating real-time/streaming systems at high concurrency levels.

Success Criteria:
- Ability to maintain a real-time system's stability through traffic spikes, packet loss, ISP variability, and region failures.
- Understanding of latency budgets, concurrency, bandwidth, and packet throughput, beyond just pods and nodes.
- Capability to create platforms that are observable, automatable, and easy to manage.
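The success criteria above emphasize latency budgets and SLOs. As a toy sketch of what an SLI check might look like (nearest-rank percentile over hypothetical per-call latency samples, against an assumed 150 ms budget — not this platform's actual tooling):

```python
import math

def percentile(samples, q):
    """Nearest-rank percentile (q in 0..100) over a list of samples."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(q / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical per-call media latencies in milliseconds.
latencies_ms = [12, 18, 25, 31, 40, 44, 52, 60, 75, 210]
slo_budget_ms = 150  # assumed latency budget for illustration

p99 = percentile(latencies_ms, 99)
print(p99, p99 <= slo_budget_ms)
```

In production this kind of calculation would normally live in a metrics backend (e.g. Prometheus histograms) rather than application code; the sketch only illustrates why tail percentiles, not averages, are what an SLO compares against.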
Join Continental AG as a Senior DevOps/SysOps Engineer specializing in Server & Cloud Automation. In this role, you will be responsible for streamlining our cloud infrastructure and automation processes to enhance system efficiency and reliability. You will collaborate with cross-functional teams to develop and implement innovative solutions that drive operational excellence.
* Design, implement, and maintain robust data ingestion pipelines that are secure and scalable, sourcing data from diverse systems such as SAP, Salesforce, SharePoint, APIs, and legacy manufacturing platforms.
* Develop and enhance metadata-driven services to facilitate discoverability, access governance, and operational transparency of enterprise data.
* Act as a technical authority and cross-functional facilitator for the acquisition, quality, and compliance of both structured and unstructured data.
* Establish and oversee comprehensive data quality management, including monitoring and reporting.
* Contribute to a global data engineering team that supports all major business domains.
* Spearhead the ingestion and metadata service implementation for over 100 enterprise data sources.
* Collaborate with IT, cybersecurity, infrastructure, and architecture teams to ensure secure and sustainable data delivery.

Main Responsibilities:
▪ Construct and maintain extraction services using Python or Scala (e.g., Debezium Server, custom APIs, rclone).
• Implement Change Data Capture (CDC), delta, and event-based patterns.
• Facilitate push-based HTTP and Kerberos-authenticated DLT delivery.
• Establish, operate, and troubleshoot SAP extraction using tools like Theobald Extract Universal.
• Integrate with systems such as Salesforce, SharePoint, and other API or file-based endpoints.
▪ Develop a user-friendly, web-accessible data catalog application, featuring dataset profiles, metadata, and usability enhancements.
▪ Integrate dataset discoverability, preview/exploration features, and lineage information using Unity Catalog as a backend metadata system.
▪ Design and implement structured access request workflows encompassing submission, approval chains, audit trails, and enablement triggers.
• Conduct design reviews with the Cybersecurity team.
• Ensure proper documentation and compliance for all interfaces and data ingress points.
• Manage audit and traceability requirements.
• Collaborate with IT and business users to translate requirements into scalable technical solutions.
• Serve as a technical escalation point for complex source integration challenges.
▪ Define and execute a multi-layered data quality framework, incorporating unit-level, integration-level, and cross-pipeline validation rules.
▪ Establish centralized, version-controlled storage of Data Quality (DQ) rules, integrating them into orchestration and CI/CD pipelines.
▪ Implement automated DQ monitoring with varying severity levels (Critical, High, Medium, Low) and enable flagging, filtering, and quarantining mechanisms at relevant pipeline stages.
▪ Work closely with source system owners and business stakeholders to define meaningful and actionable DQ thresholds.
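The responsibilities above describe severity-tiered data quality rules with flagging and quarantining. As a minimal sketch of that pattern (rule names, severities, and the sample records are all hypothetical, not Continental's actual framework):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DQRule:
    name: str
    severity: str                    # "Critical", "High", "Medium", or "Low"
    check: Callable[[dict], bool]    # returns True when the record passes

# Hypothetical rules for an illustrative orders feed.
RULES = [
    DQRule("order_id present", "Critical", lambda r: bool(r.get("order_id"))),
    DQRule("amount non-negative", "High", lambda r: r.get("amount", 0) >= 0),
]

def apply_rules(records, rules=RULES):
    """Quarantine records failing any Critical rule; flag lesser failures."""
    passed, quarantined = [], []
    for rec in records:
        failures = [ru for ru in rules if not ru.check(rec)]
        if any(ru.severity == "Critical" for ru in failures):
            quarantined.append(rec)
        else:
            # Non-critical failures pass through, annotated for downstream filtering.
            passed.append({**rec, "dq_flags": [ru.name for ru in failures]})
    return passed, quarantined
```

Storing the rules as data (rather than hard-coding them in each pipeline) is what makes the centralized, version-controlled storage described above possible: the rule set can be loaded from a repository and applied uniformly at each pipeline stage.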
Join our innovative team at Continental as a Data Engineering IT Engineer, where you'll play a crucial role in driving our data initiatives. In this position, you will design, develop, and maintain robust data pipelines and architectures that support our advanced analytics and business intelligence goals. Collaborate with cross-functional teams to ensure data quality and integrity while leveraging cutting-edge technologies to enhance our data-driven decision-making.
Eurofins Scientific is hiring a Senior Specialist in Data Review for its MD-MV Analytical Chemistry team in Bengaluru. This position centers on the careful review of analytical data, with a strong focus on upholding strict quality standards.

Role overview
The Senior Specialist will examine analytical results to verify accuracy and ensure all data meets compliance requirements. Attention to detail and a thorough approach are essential in this work, as the integrity of client-facing results depends on precise data review.

Impact
By maintaining high standards in data review, this role directly supports Eurofins Scientific’s commitment to delivering reliable analytical outcomes for clients. The work contributes to the company’s reputation as a global leader in bio-analytical services.
Join Continental as a Data Engineering IT Engineer in our innovative team in Bengaluru. In this role, you will be responsible for designing and implementing data solutions that drive business intelligence and analytics. You will work closely with cross-functional teams to enhance data architecture and ensure data integrity, enabling data-driven decision-making across the organization.

Your expertise in data processing and engineering will be crucial as you develop scalable data pipelines and optimize data workflows. This is an exciting opportunity to contribute to cutting-edge projects and make a significant impact on our technological landscape.
Join Altisource as a Business Analytics Manager!

Are you a data-driven professional with a knack for turning insights into actionable strategies? Do you thrive in a dynamic environment where your analytical skills can lead to significant business impact? If so, we invite you to be part of our innovative team at Altisource, a leader in transforming the real estate transaction landscape in the United States.

As a Business Analytics Manager reporting directly to the Vice President of Analytics and Strategy, you will lead a talented team focused on delivering key insights to enhance our operations. Your role will involve collaborating with various departments including Product, Technology, and Operations to identify challenges and develop scalable solutions that drive business growth.

In this pivotal position, you will elevate our analytics function within Hubzu.com and Equator.com, ensuring we provide a superior customer experience for real estate buyers, sellers, and agents. You will implement best practices for data collection, reporting, and analysis, significantly improving user engagement across our platforms.

Join us in leveraging your expertise in consumer psychology, quantitative modeling, and data science to uncover insights that propel our business forward!
Teamwork makes the stream work. Roku is revolutionizing the way the world experiences television. As the leading TV streaming platform in the U.S., Canada, and Mexico, Roku aims to empower every television globally. With a pioneering spirit, we connect users to their favorite content, support content creators in growing their audiences, and provide advertisers with innovative ways to engage viewers.

From your first day at Roku, you'll be an integral part of our journey. As a fast-growing public company, everyone here plays a crucial role. Join us in delighting millions of TV streamers worldwide while gaining invaluable experience across various disciplines.

About the Team
The Roku Data Engineering team is dedicated to building a state-of-the-art big data platform that empowers both internal and external stakeholders to leverage data for business growth. Our team collaborates closely with business partners and engineering teams to gather metrics on essential initiatives for success. As a Senior Data Engineer focused on device metrics, you will design data models and develop scalable data pipelines to capture critical business metrics across our diverse range of Roku devices.

About the Role
At Roku, we connect users to the streaming content they love and enable content publishers to monetize large audiences while providing advertisers with unique capabilities to engage consumers. Our Roku streaming players and Roku TV™ models are available worldwide through direct retail sales and partnerships with TV brands and pay-TV operators. With millions of devices sold in numerous countries, thousands of streaming channels, and billions of hours of content consumed, the development of a scalable, highly available, fault-tolerant big data platform is crucial to our success. This role is based in Bengaluru, India and requires hybrid work, with three days in the office.

What You'll Be Doing
- Develop and maintain highly scalable, fault-tolerant distributed data processing systems (both batch and streaming) handling terabytes of data ingested daily and managing a petabyte-sized data warehouse.
- Design and implement efficient data models and pipelines that support business growth and decision-making.
Join smallest as a Research Data Engineer and play a pivotal role in transforming data into actionable insights. In this dynamic position, you will leverage advanced data engineering skills to collect, analyze, and interpret complex datasets. Collaborate with cross-functional teams to drive innovative solutions and contribute to impactful research projects.
Role Overview
Wabtec Corporation is seeking a Manager of Advanced Analytics and AI Delivery in Bengaluru. This role leads projects that use data and artificial intelligence to improve business outcomes across global operations. The position focuses on applying analytics expertise to strengthen decision-making and advance technology initiatives.

What You Will Do
- Lead advanced analytics and AI projects from concept through delivery
- Work with teams across departments to design and implement AI-driven strategies
- Apply analytics to optimize processes and support better business decisions
- Help identify opportunities where data and AI can boost efficiency and performance

Collaboration
This role partners closely with cross-functional teams, ensuring that AI solutions align with operational needs and deliver measurable improvements.
Full-time|₹500K/yr - ₹2M/yr|On-site|Bengaluru, Karnataka, India
Role overview
Weekday's Client seeks a Big Data Developer in Bengaluru to improve and maintain data pipelines and processing systems that drive business intelligence and analytics. The position works with large volumes of both structured and unstructured data, spanning cloud and on-premise environments. Collaboration with data engineers, analysts, and product teams is central to delivering reliable, high-performance solutions that support business decisions.

What you will do
- Design, develop, and maintain scalable data pipelines and big data processing systems.
- Build and optimize data architectures using AWS services to increase availability and performance.
- Use PostgreSQL for data storage, querying, and performance tuning.
- Process and analyze large datasets to enable analytics and reporting.
- Work with cross-functional teams to gather requirements and deliver data solutions.
- Maintain data quality, integrity, and security across all systems.
- Optimize data workflows for better performance, scalability, and cost efficiency.
- Implement ETL and ELT processes for smooth data ingestion and transformation.
- Monitor and troubleshoot data pipelines to ensure reliability and uptime.
- Integrate cloud and on-premise data systems to support hybrid environments.
- Document data architecture, workflows, and best practices for future growth.

Location
This role is based in Bengaluru, Karnataka, India.
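The listing above calls for ETL/ELT processes with idempotent, reliable ingestion. As a minimal sketch of that idea (sqlite3 stands in for PostgreSQL so the example is self-contained; the table, columns, and sample rows are illustrative):

```python
import sqlite3

def run_etl(raw_rows, conn):
    """Extract raw dicts, transform them, and load idempotently."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, amount REAL)"
    )
    # Transform: drop malformed rows and normalise the amount to a float.
    clean = [
        (r["id"], float(r["amount"]))
        for r in raw_rows
        if r.get("id") and r.get("amount") is not None
    ]
    # Load with upsert semantics so a retried run does not duplicate data.
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
loaded = run_etl([{"id": "o1", "amount": "199.0"}, {"id": None, "amount": "5"}], conn)
print(loaded)
```

In PostgreSQL the equivalent upsert would be `INSERT ... ON CONFLICT (id) DO UPDATE`; making loads idempotent like this is what lets a monitoring system safely re-run a failed pipeline stage.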
Join our dynamic team as a Data Lakehouse IT Engineer where you will play a key role in designing and implementing cutting-edge data solutions. You will be responsible for managing data pipelines, ensuring data quality, and optimizing the data architecture to support our analytics initiatives. Your expertise in cloud technologies and data management will be crucial in driving our data strategy forward.
Role Overview
weekday-1 is looking for a Data Engineer in Bengaluru, Karnataka. This role focuses on building and maintaining data pipelines to keep information moving smoothly across company systems. The work supports analytics and reporting that help guide business decisions.

What You Will Do
- Design, implement, and maintain data pipelines
- Work with teams from different functions to understand data needs
- Support analytics efforts by ensuring reliable and accessible data
- Help deliver insights that improve business performance

Location
This position is based in Bengaluru, Karnataka, India.
Join the dynamic Founders Office at Gushwork as an Analytics Specialist, where you'll play a pivotal role in leveraging data to drive strategic decision-making. This position is perfect for analytically-minded individuals who are enthusiastic about transforming complex data into actionable insights.
CommerceIQ builds an AI-powered platform designed to support some of the world's largest brands. The company focuses on deploying AI agents that handle content, media, and sales functions within the operational systems of Fortune 100 companies.

What you will do
This role centers on managing and advancing DevOps practices for CommerceIQ's AI-driven solutions. The work involves supporting the integration and deployment of AI agents into enterprise-scale environments.

Who you are
Experience leading DevOps initiatives in a technology-driven organization is essential. Familiarity with supporting large-scale, AI-powered platforms will help in this position.

Location
This position is based in Bengaluru, Karnataka, India.
Join Renesa Electronics as a Staff DevOps Engineer specializing in AWS and Serverless architectures. In this dynamic role, you will collaborate with cross-functional teams to enhance our cloud infrastructure, ensuring robust deployment pipelines and automated processes. Your expertise will be key in driving our initiatives towards efficient and scalable cloud solutions.
We are seeking a dynamic and analytical Manager of Reporting & Analytics to join our team at Smiths Group. In this pivotal role, you will lead the development and implementation of reporting and analytics strategies that drive informed business decisions. You will work closely with cross-functional teams to gather insights and translate them into actionable recommendations.

Your responsibilities will include overseeing the creation of comprehensive reports, analyzing data trends, and ensuring that our analytics framework aligns with the company's strategic objectives. This is a fantastic opportunity for someone who thrives in a data-driven environment and is passionate about leveraging analytics to foster growth.
T-Systems Information and Communication Technology India Private Limited
Full-time|On-site|Bengaluru
Join our dynamic team as a Databricks Operations Engineer, where you will play a key role in supporting our Azure-based Data Engineering platform, powered by Databricks. This position offers an exciting blend of DevOps practices, system reliability, and security auditing to ensure our data infrastructure is robust, secure, and scalable.Your contributions wil…
This position is with a client of Weekday.We are in search of an Analytics Manager to become a key player in our innovative team based in Bengaluru. This crucial role involves harnessing data-driven insights to influence strategic initiatives and improve our service offerings, particularly in last-mile deliveries and comprehensive supply chain solutions. If you have a passion for analytics and excel in a vibrant, fast-paced environment, we invite you to apply!Key Responsibilities: Analyze extensive datasets to uncover insights that foster business and operational advancements. Design and uphold data models, dashboards, and reports for essential stakeholders. Employ Python and SQL for sophisticated data analysis, automation, and reporting. Engage with product, operations, and business teams to refine decision-making processes. Identify key metrics and trends to boost last-mile delivery and supply chain efficiency. Facilitate A/B testing, forecasting, and predictive analytics for ongoing optimization. Guarantee data integrity and accuracy when managing large-scale datasets. What We Seek: 5+ years of experience in analytics, data science, or a related discipline. Expertise in Python and SQL for data manipulation, modeling, and automation. Solid understanding of product analytics and data visualization tools (e.g., Tableau, Power BI, Looker). Background in logistics, supply chain, or e-commerce data is advantageous. Exceptional problem-solving abilities and capacity to convert data into actionable insights. Strong communication skills for presenting findings to both technical and non-technical audiences.
Join Amol Technologies as a Senior DevOps Manager, where you will lead our efforts in automating and streamlining our operations and processes. You will be responsible for ensuring smooth and efficient deployment processes, maintaining cloud infrastructure, and fostering collaboration between development and operations teams.
We are seeking a highly skilled Senior Platform / DevOps Engineer with expertise in Real-time Media, WebRTC, Edge Computing, and Cloud Technologies. Join our dynamic team in Bengaluru as we develop a robust LiveKit-like real-time communications platform designed to scale to millions of simultaneous calls. This position offers an exciting opportunity to work with cutting-edge technologies while ensuring ultra-low latency and cloud reliability.About the Role:In this hands-on role, you will take ownership of production systems, focusing on performance and resilience. We are particularly interested in candidates with experience in scaling real-time and streaming infrastructures.Key Responsibilities:- Ensure the reliability and performance of signaling, SFU/media nodes, TURN, routing, failover, and capacity planning.- Build and manage multi-region Kubernetes platforms with secure networking and zero-downtime deployments.- Design edge and cloud architecture including PoPs, global routing, failover, autoscaling, and disaster recovery.- Implement SLOs/SLIs, incident response, postmortems, and maintain operational excellence.- Develop strong observability practices, including metrics, logs, tracing, and real-time QoE/latency metrics.Preferred Qualifications:- Proven experience with Kubernetes at scale (multi-cluster/multi-region).- Strong foundation in Linux and networking fundamentals (UDP/TCP, NAT, conntrack, DNS, load balancing).Nice to Have:- Experience in WebRTC/RTC operations (ICE, STUN/TURN, SFU scaling, packet loss/jitter tuning).- Knowledge of Edge/PoP and traffic management (global routing, Anycast/DNS strategies).- Familiarity with cost optimization for bandwidth-heavy workloads.- Previous experience operating real-time/streaming systems at high concurrency levels.Success Criteria:- Ability to maintain a real-time system's stability through traffic spikes, packet loss, ISP variability, and region failures.- Understanding of latency budgets, concurrency, 
bandwidth, and packet throughput, beyond just pods and nodes.- Capability to create platforms that are observable, automatable, and easy to manage.
Join Continental AG as a Senior DevOps/SysOps Engineer specializing in Server & Cloud Automation. In this role, you will be responsible for streamlining our cloud infrastructure and automation processes to enhance system efficiency and reliability. You will collaborate with cross-functional teams to develop and implement innovative solutions that drive operational excellence.
* Design, implement, and maintain robust data ingestion pipelines that are secure and scalable, sourcing data from diverse systems such as SAP, Salesforce, SharePoint, APIs, and legacy manufacturing platforms.* Develop and enhance metadata-driven services to facilitate discoverability, access governance, and operational transparency of enterprise data.* Act as a technical authority and cross-functional facilitator for the acquisition, quality, and compliance of both structured and unstructured data.* Establish and oversee comprehensive data quality management, including monitoring and reporting.* Contribute to a global data engineering team that supports all major business domains.* Spearhead the ingestion and metadata service implementation for over 100 enterprise data sources.* Collaborate with IT, cybersecurity, infrastructure, and architecture teams to ensure secure and sustainable data delivery.Main Responsibilities:▪ Construct and maintain extraction services using Python or Scala (e.g., Debezium Server, custom APIs, rclone).• Implement Change Data Capture (CDC), delta, and event-based patterns.• Facilitate push-based HTTP and Kerberos-authenticated DLT delivery.• Establish, operate, and troubleshoot SAP extraction using tools like Theobald Extract Universal.• Integrate with systems such as Salesforce, SharePoint, and other API or file-based endpoints.▪ Develop a user-friendly, web-accessible data catalog application, featuring dataset profiles, metadata, and usability enhancements.▪ Integrate dataset discoverability, preview/exploration features, and lineage information using Unity Catalog as a backend metadata system.▪ Design and implement structured access request workflows encompassing submission, approval chains, audit trails, and enablement triggers.• Conduct design reviews with the Cybersecurity team.• Ensure proper documentation and compliance for all interfaces and data ingress points.• Manage audit and traceability requirements.• Collaborate with IT 
and business users to translate requirements into scalable technical solutions.• Serve as a technical escalation point for complex source integration challenges.▪ Define and execute a multi-layered data quality framework, incorporating unit-level, integration-level, and cross-pipeline validation rules.▪ Establish centralized, version-controlled storage of Data Quality (DQ) rules, integrating them into orchestration and CI/CD pipelines.▪ Implement automated DQ monitoring with varying severity levels (Critical, High, Medium, Low) and enable flagging, filtering, and quarantining mechanisms at relevant pipeline stages.▪ Work closely with source system owners and business stakeholders to define meaningful and actionable DQ thresholds.
Join our innovative team at Continental as a Data Engineering IT Engineer, where you'll play a crucial role in driving our data initiatives. In this position, you will design, develop, and maintain robust data pipelines and architectures that support our advanced analytics and business intelligence goals. Collaborate with cross-functional teams to ensure data quality and integrity while leveraging cutting-edge technologies to enhance our data-driven decision-making.
Eurofins Scientific is hiring a Senior Specialist in Data Review for its MD-MV Analytical Chemistry team in Bengaluru. This position centers on the careful review of analytical data, with a strong focus on upholding strict quality standards. Role overview The Senior Specialist will examine analytical results to verify accuracy and ensure all data meets compliance requirements. Attention to detail and a thorough approach are essential in this work, as the integrity of client-facing results depends on precise data review. Impact By maintaining high standards in data review, this role directly supports Eurofins Scientific’s commitment to delivering reliable analytical outcomes for clients. The work contributes to the company’s reputation as a global leader in bio-analytical services.
Join Continental as a Data Engineering IT Engineer in our innovative team in Bengaluru. In this role, you will be responsible for designing and implementing data solutions that drive business intelligence and analytics. You will work closely with cross-functional teams to enhance data architecture and ensure data integrity, enabling data-driven decision-making across the organization.Your expertise in data processing and engineering will be crucial as you develop scalable data pipelines and optimize data workflows. This is an exciting opportunity to contribute to cutting-edge projects and make a significant impact on our technological landscape.
Join Altisource as a Business Analytics Manager!Are you a data-driven professional with a knack for turning insights into actionable strategies? Do you thrive in a dynamic environment where your analytical skills can lead to significant business impact? If so, we invite you to be part of our innovative team at Altisource, a leader in transforming the real estate transaction landscape in the United States.As a Business Analytics Manager reporting directly to the Vice President of Analytics and Strategy, you will lead a talented team focused on delivering key insights to enhance our operations. Your role will involve collaborating with various departments including Product, Technology, and Operations to identify challenges and develop scalable solutions that drive business growth.In this pivotal position, you will elevate our analytics function within Hubzu.com and Equator.com, ensuring we provide a superior customer experience for real estate buyers, sellers, and agents. You will implement best practices for data collection, reporting, and analysis, significantly improving user engagement across our platforms.Join us in leveraging your expertise in consumer psychology, quantitative modeling, and data science to uncover insights that propel our business forward!
Teamwork makes the stream work. Roku is revolutionizing the way the world experiences televisionAs the leading TV streaming platform in the U.S., Canada, and Mexico, Roku aims to empower every television globally. With a pioneering spirit, we connect users to their favorite content, support content creators in growing their audiences, and provide advertisers with innovative ways to engage viewers.From your first day at Roku, you'll be an integral part of our journey. As a fast-growing public company, everyone here plays a crucial role. Join us in delighting millions of TV streamers worldwide while gaining invaluable experience across various disciplines. About the TeamThe Roku Data Engineering team is dedicated to building a state-of-the-art big data platform that empowers both internal and external stakeholders to leverage data for business growth. Our team collaborates closely with business partners and engineering teams to gather metrics on essential initiatives for success. As a Senior Data Engineer focused on device metrics, you will design data models and develop scalable data pipelines to capture critical business metrics across our diverse range of Roku devices. About the RoleAt Roku, we connect users to the streaming content they love and enable content publishers to monetize large audiences while providing advertisers with unique capabilities to engage consumers. Our Roku streaming players and Roku TV™ models are available worldwide through direct retail sales and partnerships with TV brands and pay-TV operators. With millions of devices sold in numerous countries, thousands of streaming channels, and billions of hours of content consumed, the development of a scalable, highly available, fault-tolerant big data platform is crucial to our success. This role is based in Bengaluru, India and requires hybrid work, with three days in the office. 
What You'll Be Doing
Develop and maintain highly scalable, fault-tolerant distributed data processing systems (both batch and streaming) handling terabytes of data ingested daily and managing a petabyte-sized data warehouse.
Design and implement efficient data models and pipelines that support business growth and decision-making.
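As a rough illustration of the streaming side of the pipeline work described above, here is a toy sketch of a tumbling-window aggregation. All names here (`streaming_counts`, the event shape) are hypothetical; a production system would use a framework such as Spark Structured Streaming or Flink rather than a Python generator, but the windowed-aggregate idea is the same.

```python
from collections import defaultdict
from typing import Dict, Iterable, Iterator

def streaming_counts(events: Iterable[dict], window: int) -> Iterator[Dict[str, int]]:
    """Emit per-device event counts after every `window` events.

    A toy stand-in for a real streaming job: it consumes events
    incrementally instead of loading the whole dataset into memory.
    """
    counts: Dict[str, int] = defaultdict(int)
    for i, event in enumerate(events, start=1):
        counts[event["device"]] += 1
        if i % window == 0:
            yield dict(counts)  # snapshot of the aggregate for this window
            counts.clear()      # tumbling window: reset between snapshots

# Example: six device events processed as two tumbling windows of three
events = [{"device": d} for d in ["tv", "stick", "tv", "tv", "stick", "stick"]]
snapshots = list(streaming_counts(events, window=3))
# snapshots -> [{"tv": 2, "stick": 1}, {"tv": 1, "stick": 2}]
```

The generator never holds more than one window's state, which is the property that lets real streaming engines handle terabytes of daily ingest.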
Join smallest as a Research Data Engineer and play a pivotal role in transforming data into actionable insights. In this dynamic position, you will leverage advanced data engineering skills to collect, analyze, and interpret complex datasets. Collaborate with cross-functional teams to drive innovative solutions and contribute to impactful research projects.
Role Overview
Wabtec Corporation is seeking a Manager of Advanced Analytics and AI Delivery in Bengaluru. This role leads projects that use data and artificial intelligence to improve business outcomes across global operations. The position focuses on applying analytics expertise to strengthen decision-making and advance technology initiatives.

What You Will Do
Lead advanced analytics and AI projects from concept through delivery.
Work with teams across departments to design and implement AI-driven strategies.
Apply analytics to optimize processes and support better business decisions.
Help identify opportunities where data and AI can boost efficiency and performance.

Collaboration
This role partners closely with cross-functional teams, ensuring that AI solutions align with operational needs and deliver measurable improvements.
Full-time|₹500K/yr - ₹2M/yr|On-site|Bengaluru, Karnataka, India
Role overview
Weekday's Client seeks a Big Data Developer in Bengaluru to improve and maintain data pipelines and processing systems that drive business intelligence and analytics. The position works with large volumes of both structured and unstructured data, spanning cloud and on-premise environments. Collaboration with data engineers, analysts, and product teams is central to delivering reliable, high-performance solutions that support business decisions.

What you will do
Design, develop, and maintain scalable data pipelines and big data processing systems.
Build and optimize data architectures using AWS services to increase availability and performance.
Use PostgreSQL for data storage, querying, and performance tuning.
Process and analyze large datasets to enable analytics and reporting.
Work with cross-functional teams to gather requirements and deliver data solutions.
Maintain data quality, integrity, and security across all systems.
Optimize data workflows for better performance, scalability, and cost efficiency.
Implement ETL and ELT processes for smooth data ingestion and transformation.
Monitor and troubleshoot data pipelines to ensure reliability and uptime.
Integrate cloud and on-premise data systems to support hybrid environments.
Document data architecture, workflows, and best practices for future growth.

Location
This role is based in Bengaluru, Karnataka, India.
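To make the ETL responsibility above concrete, here is a minimal extract-transform-load sketch. It uses an in-memory SQLite database purely for self-containment; the role itself calls for PostgreSQL, and the table and column names (`orders`, `amount_inr`) are invented for illustration.

```python
import sqlite3

def run_etl(raw_rows):
    """Toy ETL: extract raw records, transform (validate and cast), load into SQL.

    SQLite stands in for the PostgreSQL store mentioned in the role;
    the schema here is hypothetical.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount_inr REAL)")
    # Transform step: drop malformed rows, cast strings to numeric types
    clean = [
        (int(r["id"]), float(r["amount"]))
        for r in raw_rows
        if r.get("amount") is not None
    ]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
    # A downstream report would query the loaded table
    (total,) = conn.execute("SELECT SUM(amount_inr) FROM orders").fetchone()
    conn.close()
    return total

raw = [
    {"id": "1", "amount": "100.5"},
    {"id": "2", "amount": None},   # malformed row, filtered out
    {"id": "3", "amount": "49.5"},
]
total = run_etl(raw)  # 150.0
```

An ELT variant would load the raw rows first and push the casting and filtering into SQL, which is often preferable when the warehouse engine can parallelize the transformation.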
Join our dynamic team as a Data Lakehouse IT Engineer where you will play a key role in designing and implementing cutting-edge data solutions. You will be responsible for managing data pipelines, ensuring data quality, and optimizing the data architecture to support our analytics initiatives. Your expertise in cloud technologies and data management will be crucial in driving our data strategy forward.
Role Overview
weekday-1 is looking for a Data Engineer in Bengaluru, Karnataka. This role focuses on building and maintaining data pipelines to keep information moving smoothly across company systems. The work supports analytics and reporting that help guide business decisions.

What You Will Do
Design, implement, and maintain data pipelines.
Work with teams from different functions to understand data needs.
Support analytics efforts by ensuring reliable and accessible data.
Help deliver insights that improve business performance.

Location
This position is based in Bengaluru, Karnataka, India.
Join the dynamic Founders Office at Gushwork as an Analytics Specialist, where you'll play a pivotal role in leveraging data to drive strategic decision-making. This position is perfect for analytically-minded individuals who are enthusiastic about transforming complex data into actionable insights.
CommerceIQ builds an AI-powered platform designed to support some of the world's largest brands. The company focuses on deploying AI agents that handle content, media, and sales functions within the operational systems of Fortune 100 companies.

What you will do
This role centers on managing and advancing DevOps practices for CommerceIQ's AI-driven solutions. The work involves supporting the integration and deployment of AI agents into enterprise-scale environments.

Who you are
Experience leading DevOps initiatives in a technology-driven organization is essential. Familiarity with supporting large-scale, AI-powered platforms will help in this position.

Location
This position is based in Bengaluru, Karnataka, India.
Join Renesas Electronics as a Staff DevOps Engineer specializing in AWS and serverless architectures. In this dynamic role, you will collaborate with cross-functional teams to enhance our cloud infrastructure, ensuring robust deployment pipelines and automated processes. Your expertise will be key in driving our initiatives toward efficient and scalable cloud solutions.
We are seeking a dynamic and analytical Manager of Reporting & Analytics to join our team at Smiths Group. In this pivotal role, you will lead the development and implementation of reporting and analytics strategies that drive informed business decisions. You will work closely with cross-functional teams to gather insights and translate them into actionable recommendations.

Your responsibilities will include overseeing the creation of comprehensive reports, analyzing data trends, and ensuring that our analytics framework aligns with the company's strategic objectives. This is a fantastic opportunity for someone who thrives in a data-driven environment and is passionate about leveraging analytics to foster growth.