Experience Level
Senior Level Manager
Qualifications
The ideal candidate will possess a strong analytical background with expertise in data modeling, statistical analysis, and visualization techniques. A minimum of 5 years of experience in data analytics or a related field is preferred, along with proficiency in data analysis tools such as SQL, Python, and Tableau. A Master's degree in Data Science, Statistics, or a related discipline is a significant advantage.
About the job
Delivery Hero seeks a Senior Manager, Data & Analytics based in Karachi. This leadership position shapes how the company uses data to inform business decisions and improve operations.
Role overview
The Senior Manager leads a team dedicated to transforming raw data into actionable insights. Oversight includes managing the processes for collecting, analyzing, and interpreting data across the organization.
Key responsibilities
Guide a team of data professionals in their daily work
Direct the collection and analysis of business data
Translate findings into recommendations that support strategic and operational goals
Impact
This role plays a central part in helping Delivery Hero use analytics to drive efficiency and support informed decision-making at multiple levels of the business.
About Delivery Hero
Delivery Hero is a leading global online food ordering and delivery service, connecting customers with their favorite restaurants. Our mission is to provide an easy and seamless experience for food lovers globally, and we pride ourselves on our commitment to innovation and excellence.
Who We Are: Motive is at the forefront of empowering organizations that manage physical operations. We provide cutting-edge tools designed to enhance safety, productivity, and profitability. For the first time, teams from safety, operations, and finance can manage their drivers, vehicles, equipment, and fleet-related expenses through a unified system. With ou…
Role: Data Engineer
Location: Egypt, Uzbekistan, and Pakistan (Remote)
Work Week: Sunday – Thursday
Work Timings: 9:00 AM – 6:00 PM (Saudi Arabian Time Zone)
Overview:
Join our dynamic team as a Data Engineer, where you will play a crucial role in designing, building, and maintaining the robust data infrastructure that fuels our analytics, machine learning models, and strategic decision-making processes. Your expertise will be vital in constructing scalable data pipelines, integrating various data sources, and ensuring the quality, reliability, and accessibility of data throughout the organization. Collaborating closely with data scientists, analysts, and product teams, you will empower data-driven insights while optimizing for both performance and scalability. This position offers a unique opportunity to directly influence how data is utilized in a rapidly growing company.
pavago is looking for a Data Engineer based in Pakistan to join its remote team. This position centers on building and maintaining the systems that move and organize data across the company.
Role overview
This role focuses on designing, developing, and supporting data pipelines. The work ensures that data flows smoothly and remains accessible for teams that depend on accurate, timely information.
What you will do
Create and improve data pipelines to handle large volumes of information
Maintain data infrastructure for reliability and efficiency
Work with modern technologies to support data accessibility across the organization
Location
This is a remote position open to candidates based in Pakistan.
Join the Devsinc Team! We are seeking a passionate and skilled Lead Data Engineer to become a vital part of our innovative data team. In this crucial position, you will spearhead the design and optimization of data pipelines, turning raw data into insightful analytics that drive informed decision-making. Your collaborative spirit and deep expertise in data engineering will significantly influence our data architecture, ensuring data integrity and accessibility for our organization.
Key Responsibilities:
Architect, develop, and refine scalable data pipelines and models to enhance the extraction, transformation, and loading (ETL) processes.
Engage with data scientists, analysts, and business stakeholders to identify data needs and deliver effective data solutions that align with strategic objectives.
Maintain high standards of data quality, consistency, and accuracy by implementing robust validation and testing processes.
Leverage cloud services and data warehousing technologies to manage substantial datasets efficiently.
Oversee and resolve issues in production data pipelines, ensuring data availability and reliability.
Participate in data governance efforts by promoting best practices in data security, privacy, and compliance.
Stay abreast of industry trends and advancements in data engineering technologies.
Join 9D Technologies, a leading innovator in mobile application development dedicated to crafting exceptional digital experiences that engage users worldwide. Our goal is to push the boundaries of creativity and technology, creating captivating applications that entertain and inspire.
Key Responsibilities:
Develop, oversee, and optimize ETL/ELT pipelines to efficiently process extensive datasets from diverse sources.
Assist in the design and enhancement of databases to provide reliable data storage and retrieval solutions.
Collaborate with cross-functional teams to convert raw data into structured formats suitable for analytics and reporting.
Perform data quality checks, troubleshoot issues, and ensure the integrity and consistency of data.
Engage with data analysts and engineers to align on data architecture and infrastructure requirements.
Maintain comprehensive documentation for data processes, workflows, and best practices.
Job Overview:
We are on the lookout for a talented Senior Data Engineer to become a vital part of our innovative team at creativechaos. The successful candidate will possess extensive expertise in crafting and deploying data pipelines, optimizing data processes, and managing substantial datasets. You will be tasked with establishing and sustaining robust data infrastructure while working alongside diverse teams to tackle data-oriented technical challenges and meet data infrastructure requirements.
Key Responsibilities:
Design and construct scalable, reliable data pipelines ensuring exceptional availability and performance.
Create complex datasets that conform to both functional and non-functional business specifications.
Identify, strategize, and execute enhancements to internal processes, including automating manual tasks, optimizing data delivery, and reengineering infrastructure for improved scalability.
Adopt best practices for data storage, processing, and retrieval.
Collaborate with various stakeholders, including executives, data scientists, and product managers, to comprehend data requirements and deploy effective data solutions.
Enhance and fine-tune data workflows to achieve peak performance and efficiency.
Ensure data security and compliance with pertinent data privacy regulations.
Keep abreast of emerging technologies and industry advancements in data engineering and analytics.
Mentor junior data engineers, providing guidance and support.
Qualifications:
A Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
A minimum of 7 years of experience in data engineering or a comparable role.
Proficient in programming languages such as Python, Scala, or Java.
Experience in designing and implementing data pipelines with tools like Apache Kafka, Apache Spark, or AWS Glue.
Strong SQL skills and familiarity with database technologies like PostgreSQL, MySQL, or MongoDB.
Understanding of cloud platforms, particularly Azure.
Experience with data modeling, ETL processes, and data warehousing principles.
Exceptional problem-solving and troubleshooting capabilities.
Excellent communication and teamwork skills.
Detail-oriented with a proactive approach to work.
What We Offer:
Paid Time Off
Work From Home
Health Insurance
OPD
Training and Development
Life Insurance
Who We Are: Motive empowers individuals managing physical operations by providing innovative tools that enhance safety, productivity, and profitability. For the first time, safety, operations, and finance teams can efficiently oversee drivers, vehicles, equipment, and fleet-related expenditures within a unified system. Our advanced AI technology offers unparalleled visibility and control, significantly minimizing manual workloads through automation and simplification of tasks.
With nearly 100,000 customers ranging from Fortune 500 companies to small businesses, Motive operates across diverse sectors, including transportation, logistics, construction, energy, field service, manufacturing, agriculture, food and beverage, retail, and the public sector. To learn more, visit gomotive.com.
About the Role: The Sales Data Quality Analyst II is a crucial member of the GTM Strategy and Sales Operations team, committed to maintaining the accuracy, completeness, and actionable nature of Salesforce data across all sales functions. As the primary expert in sales data integrity, this position emphasizes account and contact data management, segmentation, and global hierarchy oversight (including regions like the UK and Mexico). The Analyst II will craft and implement sophisticated data quality frameworks, conduct in-depth analyses using SQL and Salesforce, and transform intricate data challenges into clear business insights and strategies that enhance pipeline health, territory planning, and executive decision-making.
Responsibilities:
Lead impactful data quality investigations and diagnostics. Proactively detect data issues and anomalies across accounts, contacts, and opportunities, utilizing SQL and Salesforce reporting for root-cause analysis and business impact quantification.
Establish and uphold stringent data quality standards. Define and document benchmarks for high-quality sales data (e.g., completeness, consistency, and accuracy thresholds), contributing to data quality scorecards and rules.
Engineer and manage proactive data monitoring frameworks. Develop, maintain, and refine automated checks and recurring reports to monitor data health (e.g., identifying duplicates, missing key fields, and invalid hierarchies), while surfacing prioritized issues and trends to GTM stakeholders.
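Automated checks of the kind this posting describes (flagging duplicates and missing key fields) can be sketched in a few lines of plain Python. The field names and sample records below are hypothetical, not Motive's actual Salesforce schema.

```python
from collections import Counter

# Hypothetical key fields a record must populate to count as complete
REQUIRED_FIELDS = ("account_id", "name", "region")

def data_quality_report(accounts):
    """Flag duplicate account IDs and records with missing key fields."""
    ids = Counter(a.get("account_id") for a in accounts)
    duplicates = [i for i, n in ids.items() if i is not None and n > 1]
    incomplete = [a for a in accounts
                  if any(not a.get(f) for f in REQUIRED_FIELDS)]
    return {"duplicates": duplicates, "missing_fields": len(incomplete)}

accounts = [
    {"account_id": "A1", "name": "Acme", "region": "UK"},
    {"account_id": "A1", "name": "Acme Ltd", "region": "UK"},  # duplicate id
    {"account_id": "A2", "name": "", "region": "Mexico"},      # missing name
]
print(data_quality_report(accounts))
```

In practice, a check like this would run as a recurring report over exported CRM data rather than an in-memory list.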
Full-time|On-site|Islamabad, Islamabad Capital Territory, Pakistan
As a Data Reporting and Management Specialist at prime-system, you will play a pivotal role in our Development team, focused on crafting and implementing scalable, data-centric solutions for internal teams and clients alike. This position combines advanced reporting and analytics, data architecture, integration, and governance with a consultative, solution-focused methodology. You will serve as a trusted advisor, guiding stakeholders from problem identification to solution implementation, while ensuring the data ecosystem's reliability, security, and sustainability.
Key Responsibilities:
Consultation & Solution Architecture
Collaborate with stakeholders to identify business challenges and design effective data solutions.
Develop data models and reporting solutions that optimize accuracy, scalability, performance, and cost.
Recommend tools, integrations, and architectures that align with business objectives and technical requirements.
Influence data standards and strategies through careful design, documentation, and training.
Reporting & Analytics
Create and manage high-quality reports, dashboards, and metrics to support operational and strategic decision-making.
Translate complex data sets into clear and actionable insights suitable for both technical and non-technical audiences.
Ensure that reporting solutions maintain accuracy, efficiency, and alignment with business goals.
Establish and advocate for reporting best practices and standards.
Data Management & Governance
Oversee the integrity, security, and reliability of internal data systems.
Implement and uphold data quality, governance, and access standards.
Act as a subject matter expert for data platforms and reporting solutions.
Data Integration & Warehousing
Architect and support data integration pipelines across diverse systems.
Design and manage data warehouses and centralized data models that facilitate analytics and reporting.
Devsinc is on the lookout for a skilled Data Engineer with at least 2 years of professional experience to become a vital part of our expanding data team. In this exciting role, you will architect and create scalable data pipelines, engage with advanced cloud platforms, and establish the groundwork for analytics that inform key business decisions. From your first day, you’ll receive mentorship from senior engineers, work with a cutting-edge cloud stack, and witness the significant impact of your contributions.
Key Responsibilities:
Design, develop, and sustain automated ETL/ELT data pipelines for both structured and unstructured datasets.
Create and refine scalable, secure, and cost-effective cloud data solutions using AWS, Azure, or GCP.
Model, clean, and transform data to facilitate analytics, dashboards, and reporting use cases.
Implement automated testing, monitoring, and alerting to guarantee high data quality and reliability.
Develop high-performance Python-based services and utilities for data ingestion and processing.
Engage with APIs, event-driven systems, and streaming platforms to support real-time data workflows.
Collaborate with cross-functional teams (Data Science, Backend, DevOps, Product) to gather requirements and deliver custom data solutions.
Adhere to strong software engineering best practices, including clean code, modularity, version control, and CI/CD.
Document architecture, data flows, schemas, and development standards.
Keep abreast of the latest data engineering tools, frameworks, and cloud-native technologies.
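As a rough illustration of the ETL work such roles involve, here is a minimal extract-transform-load sketch using only the Python standard library; the source data, table, and column names are made up for the example, and a real pipeline would read from files or APIs rather than an inline string.

```python
import csv
import io
import sqlite3

# Stand-in for an extracted source file (note the messy whitespace and casing)
RAW_CSV = """order_id,amount,currency
1001, 250.00 ,usd
1002,99.50,USD
"""

def transform(row):
    """Normalize types and casing before loading."""
    return (int(row["order_id"]),
            float(row["amount"]),
            row["currency"].strip().upper())

# Load into an in-memory SQLite database as the "warehouse"
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
             "amount REAL, currency TEXT)")
rows = [transform(r) for r in csv.DictReader(io.StringIO(RAW_CSV))]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

total, = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
print(total)  # 250.00 + 99.50 = 349.5
```

The same extract → transform → load shape scales up when the stdlib pieces are swapped for a cloud warehouse client and an orchestrator.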
Join the innovative team at Devsinc as a Software Engineer II – AI & Data Engineering. We are seeking a talented individual with over 2.5 years of professional experience in developing and deploying robust AI/ML systems, applications powered by LLMs, and scalable data engineering solutions.
This position demands a strong foundation in AI/ML Engineering, MLOps, Backend Engineering, and Data Engineering. You will take ownership of the project lifecycle, from the design of LLM applications, RAG pipelines, embeddings, and inference systems to the construction of ETL/ELT pipelines, cloud-native infrastructures, and architectures for real-time data processing.
Key Responsibilities:
Craft, develop, enhance, and deploy AI/ML models, including LLM-powered applications, RAG pipelines, embeddings, vector search architectures, and inference systems tailored for real-world applications.
Develop and refine high-performance Python APIs, microservices, and backend services for AI workloads, collaborating with Engineering teams, Project Managers, and business stakeholders to deliver scalable, production-ready AI solutions.
Establish and manage MLOps workflows and cloud-native infrastructures across AWS, Azure, and GCP, covering experiment tracking, model versioning, deployment automation, monitoring, and model optimization techniques like hyperparameter tuning and quantization.
Design, develop, and sustain scalable ETL/ELT pipelines for both structured and unstructured datasets.
Create and enhance data transformation, cleansing, validation, and quality frameworks, utilizing distributed and streaming technologies such as Kafka, Spark, Kinesis, and Pub/Sub for real-time data processing.
Guarantee reliability, scalability, security, and cost-efficiency across AI and data infrastructures, while documenting architectural decisions, technical workflows, and engineering standards.
Role Overview:
Join Adal Fintech as a Senior Data Engineer and play a pivotal role in building and optimizing our data infrastructure. You will leverage your expertise in SQL, PL/SQL, Stored Procedures, Database Query Optimization, SSIS, Apache Spark, and Python to design and enhance data pipelines and ETL workflows that are both efficient and scalable, supporting our business intelligence and analytics goals.
About AdalFi:
AdalFi is at the forefront of revolutionizing digital lending in Pakistan. We are developing the country’s fastest-growing AI-driven digital lending platform, empowering banks to launch innovative credit solutions swiftly. By harnessing cutting-edge AI and data analytics, we enable financial institutions to make informed and rapid lending decisions, significantly enhancing access to credit for millions.
Responsibilities:
• Develop and maintain robust ETL processes and data pipelines utilizing SQL, PL/SQL, SSIS, Apache Spark, and Python.
• Create and fine-tune complex queries and stored procedures to ensure optimal performance.
• Manage large-scale structured and unstructured datasets, ensuring their quality, security, and compliance.
• Collaborate with Business Intelligence teams and stakeholders to deliver reliable, scalable data solutions.
• Document technical designs and workflows to facilitate knowledge sharing.
Ideal Candidate:
• Minimum 4 years of experience in the Data Engineering domain.
• Proficient in SQL, PL/SQL, Stored Procedures, query optimization, and performance tuning.
• Proven experience with ETL tools (SSIS), big data technologies (Apache Spark), and Python programming.
• Strong grasp of data modeling, warehousing, and relational database management.
• Familiarity with version control systems like Git, CI/CD practices, and collaborative problem-solving skills.
Qualifications:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• Experience with cloud computing platforms such as AWS, Azure, or GCP.
• Understanding of data governance principles, security protocols, and compliance regulations.
• Exposure to NoSQL databases and real-time data processing methodologies.
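The query optimization and performance tuning this role emphasizes can be demonstrated even with SQLite's built-in EXPLAIN QUERY PLAN: the same query goes from a full table scan to an index search once a suitable index exists. The schema below is hypothetical, chosen only to make the plan change visible.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, "
             "borrower_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO loans (borrower_id, amount) VALUES (?, ?)",
                 [(i % 100, i * 10.0) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the plan detail in the last column
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()[0][3]

query = "SELECT SUM(amount) FROM loans WHERE borrower_id = 42"
print(plan(query))  # without an index: a scan over the loans table
conn.execute("CREATE INDEX idx_borrower ON loans(borrower_id)")
print(plan(query))  # now: a search using idx_borrower
```

The exact plan wording varies between SQLite versions, but the scan-to-search shift is the same effect a DBA aims for when tuning PL/SQL or SQL Server queries at larger scale.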
Join Octopus Digital, a subsidiary of Avanceon Limited, as a Data Engineer and play a pivotal role in designing and implementing sophisticated data warehouse solutions across our organization. We are seeking skilled professionals who are adept at managing vast volumes of structured and unstructured data using cutting-edge technologies.
Job Responsibilities:
Apply hands-on expertise in Azure or AWS platforms.
Utilize Spark and Python for data processing tasks.
Design and create efficient code, scripts, and data pipelines to handle both structured and unstructured datasets.
Oversee the management of data ingestion pipelines and stream processing.
Develop and optimize complex SQL queries and shell scripts.
Conduct Big Data querying to derive insights.
Work with NoSQL databases to support data needs.
Experience with Hadoop distributions, particularly Azure Data Lake HDFS.
Certification or training in Data Lake HDFS is an added advantage.
Demonstrate the ability to meet deadlines, troubleshoot issues, and provide resolutions with minimal supervision.
Join our innovative team at Seekatechnology as a Backend AI & Data Pipeline Engineer. In this role, you will be responsible for designing and implementing robust data pipelines that facilitate the integration of AI technologies into our systems. Your expertise will play a crucial role in optimizing data flow and ensuring seamless data processing.
Join the dynamic team at Speechify as a Software Engineer focused on Data Infrastructure and Acquisition. In this role, you will be essential in developing robust data solutions and systems that enhance our product offerings. You will collaborate with cross-functional teams to design, implement, and optimize data pipelines, ensuring high availability and performance. Your contribution will directly impact our ability to deliver exceptional user experiences.
Role overview
Smart Working Solutions seeks a Senior Data Engineer in Pakistan with a focus on Google Cloud Platform (GCP), BigQuery, and Looker. This role centers on developing and maintaining data pipelines that adapt as business needs grow. Building dashboards to help teams make informed decisions is also a central responsibility.
What you will do
Design and implement data pipelines using GCP and BigQuery
Develop and maintain scalable solutions for integrating and processing data
Create and manage dashboards in Looker to support analytics and reporting
Assist teams throughout the company in making data-driven decisions
Join our innovative team at Creative Chaos as a Data Engineer, specializing in Azure Data Lake. We are looking for an experienced professional to design, develop, and enhance data pipelines, enabling seamless processing of substantial datasets.
Key Responsibilities:
Create and manage efficient data pipeline architectures within Azure Data Lake.
Transform and integrate data from diverse sources to support advanced analytics and reporting.
Guarantee data quality and integrity through effective governance practices.
Work collaboratively with cross-functional teams to ascertain data needs and devise scalable solutions.
Optimize data processing workflows to enhance performance and reliability.
Maintain thorough documentation of processes and architectures to ensure scalability and maintainability.
Keep updated on the latest Azure technologies and data engineering best practices.
Full-time|On-site|Islamabad, Islamabad Capital Territory, Pakistan
Devsinc is actively seeking talented Data Scientists with 6 months to 1.5 years of experience, particularly in the realm of machine learning (ML). This position is perfect for those who have started to hone their skills in ML methodologies and are passionate about utilizing this expertise to tackle real-world problems. The ideal candidate will possess a solid grounding in ML techniques, a knack for analytical thinking to decode complex data challenges, and a commitment to driving data-informed decisions and innovations.
Key Responsibilities:
Design, develop, and implement machine learning models to solve specific business challenges, including data preprocessing, feature engineering, model selection, training, and validation.
Conduct exploratory data analysis to discover hidden patterns, correlations, and insights within structured and unstructured datasets. Use these insights to optimize ML models and methodologies.
Collaborate with a diverse team of data scientists, engineers, and business stakeholders to clarify data requirements and deliver ML-driven solutions.
Create engaging visualizations to summarize the results of ML models and analyses.
Prepare detailed reports and presentations that translate intricate ML concepts and findings into actionable business insights.
Continuously seek educational opportunities in advanced machine learning techniques and algorithms, integrating innovative research and tools into projects to enhance model performance and efficiency.
Contribute to the development of prototypes for predictive models and other ML applications, evaluating their effectiveness in practical scenarios.
Explore opportunities to leverage insights, datasets, code, and models across various organizational functions, such as HR and marketing.
Exhibit curiosity and enthusiasm for using algorithms to address challenges and inspire others to appreciate the value of your work.
Maintain effective communication, both verbal and written, to understand data needs and report on results.
Join our dynamic team at pavago as a Sales Manager. In this fully remote role, you will lead our sales initiatives, develop strategies to drive revenue growth, and build strong relationships with clients. Your expertise in sales management will be essential in achieving our ambitious targets.
Job Summary:
Join Creative Chaos as a Data Architect, where you will play a pivotal role in shaping the data architecture of our organization. You will collaborate with stakeholders, data scientists, and engineers to create scalable and efficient data solutions. Your proficiency in data modeling, database design, and integration techniques will be essential for the effective storage, retrieval, and analysis of data across various systems. Additionally, you will be instrumental in establishing data governance policies and best practices to uphold data quality, security, and compliance.
Responsibilities:
Design and develop comprehensive data models and database schemas.
Define strategies for data integration and migration.
Engage with stakeholders to comprehend data requirements and translate them into technical solutions.
Enhance database performance and ensure scalability.
Implement data governance policies and maintain data quality standards.
Guarantee data security and compliance with applicable regulations.
Stay informed about industry trends and emerging technologies in data management.
Provide technical guidance and mentorship to junior team members.