About the job
Toloka AI seeks a Freelance Machine Learning Engineer based in Greece to work remotely. This position centers on developing and refining machine learning models for a variety of AI projects.
Role overview
Work involves building new models and enhancing existing solutions to support projects in several industries. Projects may span different domains, offering exposure to diverse applications of AI and machine learning.
Location
This is a remote role open to candidates located in Greece.
Are you enthusiastic about big data and eager to engage with advanced technologies? We invite you to explore a thrilling opportunity as a Data Engineer - Spark Developer with our dynamic and growing development teams. Whether you prefer the vibrant atmosphere of our Athens office or the flexibility of remote work, we are excited to welcome your expertise and passion. Key Responsibilities: Architect, develop, test, deploy, maintain, and enhance data pipelines; Implement coding solutions using Apache Spark on Azure Databricks; Create and design big data architectures leveraging Azure Data Factory, Service Bus, BI, Databricks, and other Azure Services. Essential Qualifications: Bachelor's degree in Computer Science or Software Engineering; Strong analytical mindset, team-oriented, dedicated to quality, and eager to learn; Comprehensive understanding of Apache Spark; Proven experience as a Data Engineer; Advanced proficiency in Python or Scala; Expertise in Spark query tuning and performance enhancement; Familiarity with cloud platforms such as Azure, AWS, or GCP; Fluent in both spoken and written English. Preferred Qualifications: Ability to understand and analyze Directed Acyclic Graph (DAG) operations; Experience in providing cost estimates for big data processing; Capability to write and review architecture documentation.
Benefits: We value talent and commitment and offer a range of benefits to our team members: Competitive full-time salary; Comprehensive private health coverage under the company’s group program; Flexible working hours; Access to state-of-the-art tools; Opportunities for professional development including language courses, specialized training, and continuous learning; Career advancement opportunities with leading specialists in the industry; A dynamic work environment that promotes challenging goals, autonomy, and mentorship, supporting both personal and company growth. If you are looking for an exciting challenge, keen to work with innovative technologies, and enjoy your work, we would love to hear from you! Please submit your detailed CV in English, referencing: (DESD/02/26). Explore our other open positions by visiting our career section at www.eurodyn.com and follow us on Twitter (@EURODYN_Careers) and LinkedIn. European Dynamics (www.eurodyn.com) is a prominent European company specializing in Software, Information, and Communication Technologies, with a robust international presence.
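The Spark posting above lists, as a preferred qualification, the ability to analyze Directed Acyclic Graph (DAG) operations, the execution model Spark derives from chained transformations. As a minimal, hypothetical illustration (plain Python via the standard-library graphlib, not Spark itself; the stage names are invented), a pipeline DAG can be topologically ordered before execution like so:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline stages and their upstream dependencies, mirroring
# how a Spark-style engine derives a valid execution order from a DAG.
stages = {
    "read_raw": set(),
    "clean": {"read_raw"},
    "join_reference": {"clean"},
    "aggregate": {"join_reference"},
    "write_delta": {"aggregate"},
}

# static_order() yields the stages so that every dependency runs first.
order = list(TopologicalSorter(stages).static_order())
print(order)
```

Being able to reason about such orderings (and about which stages can run in parallel because neither depends on the other) is the skill the qualification refers to.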
Jobgether is looking for a Senior Full-Stack Engineer with strong skills in Python and React. This position is based in Greece and centers on building and improving the company’s platforms. Role overview This role focuses on developing new features and maintaining existing applications using Python and React. Collaboration with other engineers and team members is a regular part of the job, with the goal of delivering reliable and user-friendly products. What you will do Design, build, and refine applications across the stack using Python and React Work closely with peers to solve technical challenges and deliver updates Play a key part in shaping the direction of Jobgether’s platforms Requirements Professional experience with both Python and React Ability to work effectively with a team Comfortable contributing to projects in a collaborative setting
Join our innovative team as a Semantic Data Engineer, where you'll play a crucial role in enhancing a sophisticated platform centered around RDF data models, SPARQL queries, and structured datasets. Your primary responsibilities will involve comprehending, maintaining, and advancing the semantic layer of our system, collaborating closely with backend engineers and architects. This position is ideal for a passionate specialist with a keen interest in data modeling, semantics, and knowledge representation within real-world production environments. Key Responsibilities: Analyze and uphold RDF/TTL data models and vocabularies; Design, optimize, and manage SPARQL queries; Facilitate data ingestion, transformation, and validation processes; Ensure the consistency and accuracy of semantic data throughout the platform; Work alongside backend engineers to integrate semantic logic into application workflows; Assist in documenting semantic models, assumptions, and constraints; Engage in troubleshooting data quality and reasoning challenges.
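To give a flavor of the triple-pattern matching that SPARQL performs over RDF data, here is a minimal, hypothetical sketch in plain Python (the prefixes and resource names are invented; a real system would use a triple store or a library such as rdflib):

```python
# RDF data is a set of (subject, predicate, object) triples.
triples = [
    ("ex:alice", "ex:worksOn", "ex:platform"),
    ("ex:platform", "rdf:type", "ex:System"),
    ("ex:bob", "ex:worksOn", "ex:platform"),
]

def match(pattern, data):
    """Return variable bindings for one triple pattern; '?x' terms are variables."""
    results = []
    for triple in data:
        binding = {}
        for p, t in zip(pattern, triple):
            if p.startswith("?"):
                binding[p] = t          # bind the variable to this term
            elif p != t:
                break                   # constant term does not match
        else:
            results.append(binding)
    return results

# Rough analogue of: SELECT ?who WHERE { ?who ex:worksOn ex:platform }
print(match(("?who", "ex:worksOn", "ex:platform"), triples))
```

A SPARQL engine generalizes this idea to joins across many patterns, plus filtering, optional matches, and inference.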
Location: Tavros, Attica, Greece About Aambience Services Aambience Services delivers workflow automation and information technology services to clients who value quality and expertise. The company focuses on building trust-based relationships and delivering strong customer experiences. Team members are valued as partners in meeting client needs, and employee development is a core value. Role Overview The Lead Data Engineer will guide the creation and management of data products for clients. This role involves forming and leading a specialized Data Engineering team responsible for designing and maintaining workflows that support data creation and movement. The position requires hands-on experience developing end-to-end data solutions and applying software industry best practices. What You Will Do Lead a team of data engineers, supporting their growth and professional development Design, build, and maintain data workflows and products for client projects Implement industry best practices in data engineering and software development Ensure reliable data movement and creation to support valuable client services What We Value Experience building end-to-end data solutions Ability to guide and mentor engineering teams Commitment to quality, collaboration, and continuous improvement
Join Kpler, where we simplify the complexities of global trade, providing insights that empower organizations in the commodities, energy, and maritime sectors. Since our inception in 2014, we've been committed to delivering premier intelligence through intuitive platforms. With a diverse team of over 700 experts from more than 35 countries, we transform intricate data into actionable strategies to help our clients thrive in a dynamic market landscape. We are on the lookout for a passionate and skilled Senior Data Analyst to join our Business Intelligence & Insights team. In this pivotal role, you will contribute to the data foundation that drives Kpler's commercial and strategic decision-making. Reporting to the Head of BI, you'll manage essential data pipelines, design scalable Looker solutions, and serve as a trusted advisor to stakeholders across the organization. This high-impact position is ideal for individuals who excel at the crossroads of data engineering and business analytics. One day, you’ll be constructing robust, production-ready infrastructure, and the next, you’ll be translating complex datasets into actionable insights for a commercial audience. If you’re excited about working with real-time commodity flow data and shaping the BI strategy of a rapidly growing B2B SaaS company, we want to hear from you.
Jobgether is looking for a Senior Data Engineer based in Greece. This position centers on building and maintaining data systems that support the company’s decision-making processes. Role overview The Senior Data Engineer will design and implement data solutions, with a focus on reliability and scalability. Collaboration with teams across the organization is a key part of the role, helping to shape the company’s overall data strategy. What you will do Develop and maintain data pipelines to support analytics and business needs Work with cross-functional teams on data projects Monitor and improve data quality throughout systems Contribute to the development and evolution of data strategy at Jobgether
cepal seeks a Senior Cloud Data Engineer based in Nea Smyrni, Attica, Greece. The position focuses on managing and enhancing a cloud-first data processing environment. Core technologies include Databricks, AWS, Spark, Unity Catalog, and Delta Lake. The team values efficiency, security, and reliability across data pipelines and analytics workloads. Key responsibilities Optimize data transformations using PySpark and Spark SQL within Databricks notebooks, handling large datasets for efficient processing. Automate, schedule, and manage data workflows with orchestration tools such as Apache Airflow to ensure consistent execution. Contribute to code reviews, testing, and documentation throughout the development lifecycle. Support and troubleshoot Databricks jobs, Spark workloads, and AWS-based data processes across development, QA, and production environments. Tune Databricks clusters and jobs to improve performance and manage costs. Maintain and enhance existing data pipelines built with AWS CodePipeline, Delta Lake, and Databricks Notebooks. Collaborate with data engineering and analytics teams to strengthen data quality and pipeline reliability. Develop and maintain CI/CD workflows for Databricks deployments using AWS tools. Manage access controls through IAM and Unity Catalog to ensure secure and compliant data usage. Monitor, troubleshoot, and perform root-cause analysis of data and compute workloads regularly.
About EveryPay
EveryPay is dedicated to revolutionizing the digital financial landscape of e-commerce in Greece. Our mission is to empower Marketplaces and Merchants, enabling them to succeed in a competitive environment. We are a vibrant team of young professionals, united by our core values of Empowering Customers, Collaborating as a Team, Managing Risks, and Delivering Results. We take pride in having created the payment infrastructure that connects numerous Greek Marketplaces and Merchants with global payment schemes such as Visa and MasterCard. Our services extend to Greece's largest and most successful marketplace, Skroutz. Our systems interface with thousands of banks, both domestically and internationally. Our technology handles tens of thousands of transactions daily, amounting to billions of euros in e-commerce. If you have made an online purchase in Greece, you have likely interacted with our payment solutions. EveryPay is a wholly-owned subsidiary of the Skroutz Group of Companies, functioning as both a Technology Firm and a Regulated Financial Services Institution. This unique position offers you exposure to both the Tech Payments Sector and the realm of Financial Services. Your Role in EveryPay's Vision: We are looking to expand our Data Platform team by hiring a skilled Data Platform Engineer. In this role, you will design, build, and maintain the core data platform that drives analytics and business intelligence at EveryPay. You will be instrumental in developing robust data ingestion pipelines, establishing scalable data infrastructure, and enabling our BI team to extract actionable insights from data.
Your contributions will ensure that high-quality, reliable data is readily accessible to all stakeholders within the organization. Key Challenges You Will Tackle: Data Ingestion at Scale: Design and implement scalable, reliable data ingestion pipelines that handle data from diverse internal and external sources. Platform Enablement: Construct, operate, and optimize our data platform to empower BI and analytics teams to easily explore, analyze, and visualize data. Data Quality & Governance: Establish and uphold best practices for data quality, lineage, and governance to ensure data trustworthiness and compliance. Your Responsibilities: Architect, build, and maintain ETL/ELT pipelines for ingesting data from various systems (e.g., payment systems, marketplaces, SaaS tools). Establish and manage data platform infrastructure (cloud data warehouses, databases, orchestration tools, etc.). Collaborate closely with the BI team to understand data requirements and deliver efficient, reliable data models and datasets. Monitor pipeline performance and data quality, proactively troubleshooting and resolving any issues.
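The ETL/ELT responsibilities described above follow the classic extract-transform-load pattern: pull records from a source, validate and normalize them, and load only the clean rows into a target store. A minimal, hypothetical sketch in plain Python (the record fields are invented; a production pipeline would use an orchestrator and a real warehouse):

```python
# Extract: raw records as they might arrive from an upstream source.
raw_events = [
    {"txn_id": "t1", "amount_cents": "1250", "currency": "EUR"},
    {"txn_id": "t2", "amount_cents": "oops", "currency": "EUR"},  # malformed
    {"txn_id": "t3", "amount_cents": "990", "currency": "EUR"},
]

def transform(record):
    """Validate and normalise one record; return None for rejects."""
    try:
        return {"txn_id": record["txn_id"],
                "amount": int(record["amount_cents"]) / 100}
    except (KeyError, ValueError):
        return None  # a real pipeline would route this to a dead-letter queue

# Load: keep only records that passed validation.
warehouse = [row for row in (transform(r) for r in raw_events) if row is not None]
rejected = len(raw_events) - len(warehouse)
print(warehouse, rejected)
```

The reject count is the kind of data-quality metric the posting's "monitor pipeline performance and data quality" responsibility would track.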
We are actively seeking a talented Data and AI Engineer specializing in Intelligent Manufacturing to join our dynamic R&D team. This position is open to candidates fluent in English, and you can work remotely from Athens, Greece, or Luxembourg. As a member of our esteemed team, you will collaborate with diverse experts focused on achieving shared objectives. We value individuals who exemplify responsiveness, integrity, and reliability, complemented by exceptional analytical and organizational skills. Your role will support projects funded by the European Commission and national research initiatives, where you'll be expected to demonstrate initiative, flexibility, and accountability. If you have significant experience in intelligent manufacturing production lines, robotics, IoT applications, or related technologies with a software focus, we encourage you to apply. Experience with European-funded R&D projects will be an asset. Key Responsibilities: Oversee and execute projects funded by the European Commission; Develop comprehensive document deliverables and reports for R&D projects and business development (requirements, conceptual models, specifications, integration plans, etc.); Design, develop, and document innovative manufacturing technologies and solutions, including AI for manufacturing and eco-friendly production lines; Implement new manufacturing processes and technologies for next-generation intelligent production lines.
About the Role Netcompany is looking for a Junior to Mid-level Data Engineer in Athens. This role focuses on building and improving data pipelines that support business decisions. The work involves designing, implementing, and optimizing data flows to keep information accurate and reliable. Main Responsibilities Design and build data pipelines for business and analytics needs Optimize existing data processes for efficiency and quality Work with data scientists, analysts, and engineers to improve data architecture Help ensure data integrity across projects Collaboration This position works closely with cross-functional teams, including data scientists and analysts, to support analytical projects and improve how data is used throughout the company. Who We're Looking For Interest in data engineering and analytics Willingness to learn new technologies and approaches Strong teamwork and communication skills
Become a vital part of our growing development teams as we are on the lookout for an experienced Big Data Developer. In this role, you will be at the forefront of innovation, contributing to large-scale international projects. You will collaborate closely with our agile software implementation and maintenance teams to not only develop new solutions but also enhance existing systems. Key Responsibilities: Design and implement robust solutions for big data processing using technologies such as Java and Python.
Become a Key Player at Bally’s Intralot as an Oracle PL/SQL Developer! Your Role: As an Oracle PL/SQL Engineer at Bally’s Intralot, you will play a pivotal role in designing and enhancing the database architecture that supports our enterprise platforms. Collaborating with backend engineers, you will ensure seamless integration of PL/SQL logic with Java/Spring Boot services, contributing to the development of scalable, high-quality solutions utilized in global markets. Responsibilities: Design, develop, and maintain PL/SQL packages, procedures, functions, triggers, and views. Work closely with Java/Spring Boot teams to integrate PL/SQL logic into backend services. Optimize SQL queries and database structures for enhanced performance and scalability. Support data migration activities, ETL processes, and facilitate integration between diverse systems. Participate in database design, schema reviews, and code quality assessments. Diagnose and resolve performance issues, production incidents, and database bottlenecks. Keep technical documentation clear and up-to-date. Contribute to the refinement of development standards, code quality, and automation practices.
Optasia is a cutting-edge B2B2X financial technology platform specializing in scoring, financial decision-making, disbursement, and collections. Our mission is to promote financial inclusion for everyone, and we pride ourselves on transforming the world in our own unique way. We are on the lookout for passionate and proactive professionals who are driven by results and possess a can-do attitude. Join a team of like-minded individuals dedicated to delivering innovative solutions in an exciting environment. We invite you to apply for the position of Data Engineer within our expanding Data Engineering team. In this role, you will design and implement highly scalable end-to-end batch and streaming data pipelines, contributing to the overall success of Optasia. Your responsibilities will include: Enhancing the scalability, stability, accuracy, speed, and efficiency of our existing data systems; Designing and developing end-to-end data processing pipelines; Navigating a diverse technology stack, including Scala, Spark, Python3, Bash/Python scripting, Hadoop, and SQL; Designing, constructing, testing, and deploying new libraries, frameworks, or complete systems while adhering to the highest standards of testing and code quality; Developing, maintaining, and optimizing core libraries for batch processing and large volume data ingestion into our big data infrastructure; Building and maintaining CI/CD orchestration. What we expect from you: Bachelor's or Master's degree in Computer Science or Informatics; A minimum of 2 years' experience in Data Engineering; Proven experience in software/data engineering and/or operations/DevOps/DataOps; Familiarity with the Apache Hadoop ecosystem (YARN, HDFS, HBase, Spark); Hands-on experience with both relational and NoSQL databases; Proficiency in systems administration with Linux; Experience in deploying, configuring, and maintaining distributed systems and data/software engineering tools. Your key attributes: Experience with fluid virtual infrastructures such as containers (e.g., Docker, Kubernetes); Familiarity with data and ML flow engines and tools, such as Apache Airflow; A strong passion for learning new technologies and collaborating with other creative professionals. Why you should join us: We offer a range of benefits including: Flexible hybrid working options; Competitive remuneration package; An extra day off on your birthday; Performance-based bonus scheme; Comprehensive private healthcare insurance; All the tech gear you need to work efficiently. Experience the Optasia perks: Join our multicultural working environment; Engage with a unique and promising business and industry; Gain insights into the future market landscape; Enjoy a solid career path within our working family.
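The Optasia posting above distinguishes batch from streaming pipelines. The conceptual difference can be sketched in a few lines of plain Python (a hypothetical toy, not the company's stack; real systems would use Spark batch jobs and a streaming engine, and the numbers are invented): a batch job sees the whole dataset at once, while a streaming job folds events in one at a time, keeping only running state.

```python
events = [3, 5, 2, 7]  # pretend event values

# Batch: compute over the complete dataset in one pass.
batch_total = sum(events)

# Streaming: consume events incrementally, emitting updated state per event.
def streaming_totals(feed):
    total = 0
    for value in feed:
        total += value
        yield total  # running total after each event

print(batch_total, list(streaming_totals(events)))
```

The final streamed value matches the batch result; the engineering challenge in real pipelines is keeping such state correct under late, duplicated, or out-of-order events.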
Join Cyrex, a dynamic Magic Media company, where creativity meets cutting-edge technology in the media and tech landscape. We collaborate with top developers and publishers in the gaming industry, offering tailored support and innovative solutions through our team of global experts. At Cyrex, we pride ourselves on delivering exceptional results by embracing modern technology and progressive practices. Our team members are not just responsible for their daily tasks; they actively seek opportunities to enhance and expand their contributions. We are on the lookout for an energetic Python Agent Developer to join our team, working on thrilling cybersecurity and development initiatives specifically for the gaming sector. The ideal candidate will collaborate with cross-functional teams to create vital products and services for our clients, while also providing outstanding technical support. If you are a proactive, enthusiastic, and innovative thinker, we would love to hear from you!
Join a pioneering leader in education technology, recognized globally for excellence in the assessment and certification of professional skills across more than 200 countries. PeopleCert is actively seeking a talented Data Engineer to enhance our dynamic Data & AI team. This position is crucial in architecting, developing, and sustaining the infrastructure and data solutions that empower our AI-driven projects. The ideal candidate will possess substantial practical experience with Microsoft Azure technologies and demonstrate a strong passion for data engineering practices that facilitate machine learning, advanced analytics, and large-scale data processing. In this role, you will collaborate closely with the AI Center of Excellence, working alongside data scientists, ML engineers, software developers, analysts, and business stakeholders to enable data accessibility and drive intelligent applications. Your responsibilities will include: Designing, implementing, and maintaining scalable data pipelines and workflows to facilitate AI/ML model training, evaluation, and inference. Building and optimizing data integration solutions utilizing Azure data tools such as Synapse Analytics, Azure Data Factory, Databricks, and Delta Lake. Partnering with data scientists and AI engineers to ensure data is available in the correct format and quality for modeling purposes. Developing and maintaining APIs and data services that power AI-driven applications and insights delivery. Supporting the development of data lakes and lakehouses tailored for advanced analytics and AI use cases. Writing efficient, reusable Python and SQL code for data processing, cleaning, and transformation. Participating in code reviews and knowledge-sharing sessions within the team to cultivate best practices and continuous learning. Keeping abreast of emerging tools, cloud services, and trends in data engineering and AI infrastructure.
Full-time | Remote — Thessaloniki, Central Macedonia, Greece
Join our innovative team at EUROPEAN DYNAMICS as an IT Engineer (Helpdesk). We are looking for a dedicated professional to enhance our growing development teams. Key Responsibilities: Promptly address helpdesk tickets; Provide operational support for Azure cloud application issues; Manage daily ad hoc requests and tasks assigned by your supervisor; Assist with DevOps-related responsibilities.
iKnowHow Group is a leading international technology firm with over 24 years of expertise and a dedicated team of more than 300 skilled professionals. We specialize in delivering cutting-edge technology solutions across various sectors, including Energy, Telecommunications, Banking & Financial Services, and the Public Sector. Our specialized subsidiaries provide in-depth domain knowledge in fields such as Health and Robotics, merging extensive industry insight with advanced, future-oriented technologies. At the heart of iKnowHow S.A. is our commitment to delivering comprehensive project development, both internally and externally, transforming strategies into scalable, practical solutions. Our extensive portfolio includes Data & AI platforms, enterprise integration, cloud-native applications, and large-scale digital transformation projects, empowering leading organizations in both public and private sectors to adapt and thrive. We seek inquisitive, motivated individuals eager to make a significant impact through technology and grow alongside us. We are currently in search of a seasoned Oracle RDBMS and PL/SQL Developer to join our dynamic team, contributing to the design, development, and maintenance of robust business applications and databases.
Work Model: Hybrid | Employment Type: Full-time
About Accepted: We are a pioneering software and digital transformation services firm, dedicated to helping clients expedite innovation across various sectors including Finance, Energy, Gaming, and Telecommunications. With over 20 years of engineering excellence, we are recognized for our ability to create solutions driven by outcomes and for fostering high-performing teams that integrate seamlessly with your organization. We are seeking a talented Business Intelligence Engineer to enhance our hybrid delivery teams. Your Responsibilities: Dashboard Development: Design and implement interactive, user-centric BI reports; Data Modeling: Build and refine tabular models utilizing DAX; ETL Pipeline Creation: Develop and manage SQL scripts to construct reliable data pipelines; Team Collaboration: Partner with team members to deliver data projects and achieve objectives; Quality Assurance: Oversee workspaces, uphold security measures, and enhance report performance.
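The SQL-based pipeline work the BI posting above describes (staging raw rows, then building a reporting table with an aggregate query) can be sketched with the standard-library sqlite3 module. This is an illustrative toy under invented table and column names, not the team's actual stack:

```python
import sqlite3

# In-memory database standing in for a warehouse staging area.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("Attica", 120.0), ("Attica", 80.0), ("Crete", 50.0)],
)

# Build the reporting table a dashboard would read from.
conn.execute(
    "CREATE TABLE sales_by_region AS "
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region"
)
report = dict(conn.execute("SELECT region, total FROM sales_by_region").fetchall())
print(report)
```

In a production pipeline the same SELECT/GROUP BY shape would run against the real warehouse, scheduled and version-controlled rather than built in memory.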
ikh seeks a Senior Data Engineer / Team Lead to join the team in Marousi, Attica, Greece. This position blends technical data engineering with hands-on team leadership. The focus is on building effective data solutions while mentoring engineers as they transform complex data into actionable insights. Key responsibilities Lead a team of data engineers through all project phases, from initial planning to final delivery Design and implement data architecture to support business objectives Work with large, diverse data sets to extract and deliver meaningful information Contribute to the development of data engineering standards and practices within the team Location This role is based in Marousi, Attica, Greece.
Toloka AI seeks a Freelance Machine Learning Engineer based in Greece to work remotely. This position centers on developing and refining machine learning models for a variety of AI projects. Role overview Work involves building new models and enhancing existing solutions to support projects in several industries. Projects may span different domains, offering…
Are you enthusiastic about big data and eager to engage with advanced technologies? We invite you to explore a thrilling opportunity as a Data Engineer - Spark Developer with our dynamic and growing development teams. Whether you prefer the vibrant atmosphere of our Athens office or the flexibility of remote work, we are excited to welcome your expertise and passion.Key Responsibilities: Architect, develop, test, deploy, maintain, and enhance data pipelines; Implement coding solutions using Apache Spark on Azure Databricks; Create and design big data architectures leveraging Azure Data Factory, Service Bus, BI, Databricks, and other Azure Services. Essential Qualifications: Bachelor's degree in Computer Science or Software Engineering; Strong analytical mindset, team-oriented, dedicated to quality, and eager to learn; Comprehensive understanding of Apache Spark; Proven experience as a Data Engineer; Advanced proficiency in Python or Scala; Expertise in Spark query tuning and performance enhancement; Familiarity with cloud platforms such as Azure, AWS, or GCP; Fluent in both spoken and written English. Preferred Qualifications: Ability to understand and analyze Directed Acyclic Graph (DAG) operations; Experience in providing cost estimates for big data processing; Capability to write and review architecture documentation. 
Benefits:We value talent and commitment and offer a range of benefits to our team members:Competitive full-time salary;Comprehensive private health coverage under the company’s group program;Flexible working hours;Access to state-of-the-art tools;Opportunities for professional development including language courses, specialized training, and continuous learning;Career advancement opportunities with leading specialists in the industry;A dynamic work environment that promotes challenging goals, autonomy, and mentorship, supporting both personal and company growth.If you are looking for an exciting challenge, keen to work with innovative technologies, and enjoy your work, we would love to hear from you! Please submit your detailed CV in English, referencing: (DESD/02/26).Explore our other open positions by visiting our career section at www.eurodyn.com and follow us on Twitter (@EURODYN_Careers) and LinkedIn.European Dynamics (www.eurodyn.com) is a prominent European company specializing in Software, Information, and Communication Technologies, with a robust international presence.
Jobgether is looking for a Senior Full-Stack Engineer with strong skills in Python and React. This position is based in Greece and centers on building and improving the company’s platforms. Role overview This role focuses on developing new features and maintaining existing applications using Python and React. Collaboration with other engineers and team members is a regular part of the job, with the goal of delivering reliable and user-friendly products. What you will do Design, build, and refine applications across the stack using Python and React Work closely with peers to solve technical challenges and deliver updates Play a key part in shaping the direction of jobgether’s platforms Requirements Professional experience with both Python and React Ability to work effectively with a team Comfortable contributing to projects in a collaborative setting
Join our innovative team as a Semantic Data Engineer, where you'll play a crucial role in enhancing a sophisticated platform centered around RDF data models, SPARQL queries, and structured datasets. Your primary responsibilities will involve comprehending, maintaining, and advancing the semantic layer of our system, collaborating closely with backend engineers and architects. This position is ideal for a passionate specialist with a keen interest in data modeling, semantics, and knowledge representation within real-world production environments.Key Responsibilities:Analyze and uphold RDF/TTL data models and vocabularies;Design, optimize, and manage SPARQL queries;Facilitate data ingestion, transformation, and validation processes;Ensure the consistency and accuracy of semantic data throughout the platform;Work alongside backend engineers to integrate semantic logic into application workflows;Assist in documenting semantic models, assumptions, and constraints;Engage in troubleshooting data quality and reasoning challenges.
Location: Tavros, Attica, Greece About Aambience Services Aambience Services delivers workflow automation and information technology services to clients who value quality and expertise. The company focuses on building trust-based relationships and delivering strong customer experiences. Team members are valued as partners in meeting client needs, and employee development is a core value. Role Overview The Lead Data Engineer will guide the creation and management of data products for clients. This role involves forming and leading a specialized Data Engineering team responsible for designing and maintaining workflows that support data creation and movement. The position requires hands-on experience developing end-to-end data solutions and applying software industry best practices. What You Will Do Lead a team of data engineers, supporting their growth and professional development Design, build, and maintain data workflows and products for client projects Implement industry best practices in data engineering and software development Ensure reliable data movement and creation to support valuable client services What We Value Experience building end-to-end data solutions Ability to guide and mentor engineering teams Commitment to quality, collaboration, and continuous improvement
Join Kpler, where we simplify the complexities of global trade, providing insights that empower organizations in the commodities, energy, and maritime sectors. Since our inception in 2014, we've been committed to delivering premier intelligence through intuitive platforms. With a diverse team of over 700 experts from more than 35 countries, we transform intricate data into actionable strategies to help our clients thrive in a dynamic market landscape.We are on the lookout for a passionate and skilled Senior Data Analyst to join our Business Intelligence & Insights team. In this pivotal role, you will contribute to the data foundation that drives Kpler's commercial and strategic decision-making. Reporting to the Head of BI, you'll manage essential data pipelines, design scalable Looker solutions, and serve as a trusted advisor to stakeholders across the organization.This high-impact position is ideal for individuals who excel at the crossroads of data engineering and business analytics. One day, you’ll be constructing robust, production-ready infrastructure, and the next, you’ll be translating complex datasets into actionable insights for a commercial audience. If you’re excited about working with real-time commodity flow data and shaping the BI strategy of a rapidly growing B2B SaaS company, we want to hear from you.
Jobgether is looking for a Senior Data Engineer based in Greece. This position centers on building and maintaining data systems that support the company's decision-making processes.

Role overview
The Senior Data Engineer will design and implement data solutions, with a focus on reliability and scalability. Collaboration with teams across the organization is a key part of the role, helping to shape the company's overall data strategy.

What you will do
- Develop and maintain data pipelines to support analytics and business needs
- Work with cross-functional teams on data projects
- Monitor and improve data quality throughout systems
- Contribute to the development and evolution of data strategy at Jobgether
Cepal seeks a Senior Cloud Data Engineer based in Nea Smyrni, Attica, Greece. The position focuses on managing and enhancing a cloud-first data processing environment. Core technologies include Databricks, AWS, Spark, Unity Catalog, and Delta Lake. The team values efficiency, security, and reliability across data pipelines and analytics workloads.

Key responsibilities
- Optimize data transformations using PySpark and Spark SQL within Databricks notebooks, handling large datasets for efficient processing.
- Automate, schedule, and manage data workflows with orchestration tools such as Apache Airflow to ensure consistent execution.
- Contribute to code reviews, testing, and documentation throughout the development lifecycle.
- Support and troubleshoot Databricks jobs, Spark workloads, and AWS-based data processes across development, QA, and production environments.
- Tune Databricks clusters and jobs to improve performance and manage costs.
- Maintain and enhance existing data pipelines built with AWS CodePipeline, Delta Lake, and Databricks Notebooks.
- Collaborate with data engineering and analytics teams to strengthen data quality and pipeline reliability.
- Develop and maintain CI/CD workflows for Databricks deployments using AWS tools.
- Manage access controls through IAM and Unity Catalog to ensure secure and compliant data usage.
- Monitor, troubleshoot, and perform root-cause analysis of data and compute workloads regularly.
About EveryPay
EveryPay is dedicated to revolutionizing the digital financial landscape of e-commerce in Greece. Our mission is to empower Marketplaces and Merchants, enabling them to succeed in a competitive environment.

We are a vibrant team of young professionals, united by our core values of Empowering Customers, Collaborating as a Team, Managing Risks, and Delivering Results.

We take pride in having created the payment infrastructure that connects numerous Greek Marketplaces and Merchants with global payment schemes such as Visa and MasterCard. Our services extend to Greece's largest and most successful marketplace, Skroutz. Our systems interface with thousands of banks, both domestically and internationally. Our technology handles tens of thousands of transactions daily, amounting to billions of euros in e-commerce. If you have made an online purchase in Greece, you have likely interacted with our payment solutions.

EveryPay is a wholly-owned subsidiary of the Skroutz Group of Companies, functioning as both a Technology Firm and a Regulated Financial Services Institution. This unique position offers you exposure to both the Tech Payments Sector and the realm of Financial Services.

Your Role in EveryPay's Vision
We are looking to expand our Data Platform team by hiring a skilled Data Platform Engineer. In this role, you will design, build, and maintain the core data platform that drives analytics and business intelligence at EveryPay. You will be instrumental in developing robust data ingestion pipelines, establishing scalable data infrastructure, and enabling our BI team to extract actionable insights from data.
Your contributions will ensure that high-quality, reliable data is readily accessible to all stakeholders within the organization.

Key Challenges You Will Tackle
- Data Ingestion at Scale: Design and implement scalable, reliable data ingestion pipelines that handle data from diverse internal and external sources.
- Platform Enablement: Construct, operate, and optimize our data platform to empower BI and analytics teams to easily explore, analyze, and visualize data.
- Data Quality & Governance: Establish and uphold best practices for data quality, lineage, and governance to ensure data trustworthiness and compliance.

Your Responsibilities
- Architect, build, and maintain ETL/ELT pipelines for ingesting data from various systems (e.g., payment systems, marketplaces, SaaS tools).
- Establish and manage data platform infrastructure (cloud data warehouses, databases, orchestration tools, etc.).
- Collaborate closely with the BI team to understand data requirements and deliver efficient, reliable data models and datasets.
- Monitor pipeline performance and data quality, proactively troubleshooting and resolving any issues.
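The posting pairs ingestion with a data-quality gate. As a minimal, illustrative sketch in plain Python, the idea is to validate each raw row before it reaches the warehouse and count rejects for monitoring. All names here (Transaction, validate_record, ingest) are hypothetical and are not part of EveryPay's actual platform.

```python
# Illustrative sketch: batch ingestion with a basic data-quality gate.
# Hypothetical names throughout; not EveryPay's real code or schema.
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class Transaction:
    tx_id: str
    amount_cents: int
    currency: str

def validate_record(raw: dict) -> Optional[Transaction]:
    """Return a typed record, or None if the row fails basic quality checks."""
    try:
        tx = Transaction(str(raw["tx_id"]), int(raw["amount_cents"]), str(raw["currency"]))
    except (KeyError, ValueError, TypeError):
        return None
    # Reject impossible values instead of letting them reach the warehouse.
    if tx.amount_cents <= 0 or len(tx.currency) != 3:
        return None
    return tx

def ingest(rows: Iterable[dict]) -> "tuple[list[Transaction], int]":
    """Split a raw batch into clean records and a rejected-row count."""
    clean, rejected = [], 0
    for raw in rows:
        tx = validate_record(raw)
        if tx is None:
            rejected += 1  # would feed a monitoring/alerting metric
        else:
            clean.append(tx)
    return clean, rejected

batch = [
    {"tx_id": "a1", "amount_cents": 1250, "currency": "EUR"},
    {"tx_id": "a2", "amount_cents": -5, "currency": "EUR"},    # bad amount
    {"tx_id": "a3", "amount_cents": 900, "currency": "EURO"},  # bad currency code
]
clean, rejected = ingest(batch)
print(len(clean), rejected)  # 1 2
```

In a production pipeline the reject count would be emitted as a metric so quality regressions surface proactively, which is the monitoring aspect the responsibilities above call out.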
We are actively seeking a talented Data and AI Engineer specializing in Intelligent Manufacturing to join our dynamic R&D team. This position is open to candidates fluent in English, and you can work remotely from Athens, Greece, or Luxembourg. As a member of our esteemed team, you will collaborate with diverse experts focused on achieving shared objectives. We value individuals who exemplify responsiveness, integrity, and reliability, complemented by exceptional analytical and organizational skills. Your role will support projects funded by the European Commission and national research initiatives, where you'll be expected to demonstrate initiative, flexibility, and accountability. If you have significant experience in intelligent manufacturing production lines, robotics, IoT applications, or related technologies with a software focus, we encourage you to apply. Experience with European-funded R&D projects will be an asset.

Key Responsibilities:
- Oversee and execute projects funded by the European Commission;
- Develop comprehensive document deliverables and reports for R&D projects and business development (requirements, conceptual models, specifications, integration plans, etc.);
- Design, develop, and document innovative manufacturing technologies and solutions, including AI for manufacturing and eco-friendly production lines;
- Implement new manufacturing processes and technologies for next-generation intelligent production lines.
About the Role
Netcompany is looking for a Junior to Mid-level Data Engineer in Athens. This role focuses on building and improving data pipelines that support business decisions. The work involves designing, implementing, and optimizing data flows to keep information accurate and reliable.

Main Responsibilities
- Design and build data pipelines for business and analytics needs
- Optimize existing data processes for efficiency and quality
- Work with data scientists, analysts, and engineers to improve data architecture
- Help ensure data integrity across projects

Collaboration
This position works closely with cross-functional teams, including data scientists and analysts, to support analytical projects and improve how data is used throughout the company.

Who We're Looking For
- Interest in data engineering and analytics
- Willingness to learn new technologies and approaches
- Strong teamwork and communication skills
Become a vital part of our growing development teams as we are on the lookout for an experienced Big Data Developer. In this role, you will be at the forefront of innovation, contributing to large-scale international projects. You will collaborate closely with our agile software implementation and maintenance teams to not only develop new solutions but also enhance existing systems.

Key Responsibilities:
- Design and implement robust solutions for big data processing using technologies such as Java and Python.
Become a Key Player at Bally’s Intralot as an Oracle PL/SQL Developer!

Your Role:
As an Oracle PL/SQL Engineer at Bally’s Intralot, you will play a pivotal role in designing and enhancing the database architecture that supports our enterprise platforms. Collaborating with backend engineers, you will ensure seamless integration of PL/SQL logic with Java/Spring Boot services, contributing to the development of scalable, high-quality solutions utilized in global markets.

Responsibilities:
- Design, develop, and maintain PL/SQL packages, procedures, functions, triggers, and views.
- Work closely with Java/Spring Boot teams to integrate PL/SQL logic into backend services.
- Optimize SQL queries and database structures for enhanced performance and scalability.
- Support data migration activities, ETL processes, and facilitate integration between diverse systems.
- Participate in database design, schema reviews, and code quality assessments.
- Diagnose and resolve performance issues, production incidents, and database bottlenecks.
- Keep technical documentation clear and up-to-date.
- Contribute to the refinement of development standards, code quality, and automation practices.
Optasia is a cutting-edge B2B2X financial technology platform specializing in scoring, financial decision-making, disbursement, and collections. Our mission is to promote financial inclusion for everyone, and we pride ourselves on transforming the world in our own unique way.

We are on the lookout for passionate and proactive professionals who are driven by results and possess a can-do attitude. Join a team of like-minded individuals dedicated to delivering innovative solutions in an exciting environment.

We invite you to apply for the position of Data Engineer within our expanding Data Engineering team. In this role, you will design and implement highly scalable end-to-end batch and streaming data pipelines, contributing to the overall success of Optasia.

Your responsibilities will include:
- Enhancing the scalability, stability, accuracy, speed, and efficiency of our existing data systems.
- Designing and developing end-to-end data processing pipelines.
- Navigating a diverse technology stack, including Scala, Spark, Python3, Bash/Python scripting, Hadoop, and SQL.
- Designing, constructing, testing, and deploying new libraries, frameworks, or complete systems while adhering to the highest standards of testing and code quality.
- Developing, maintaining, and optimizing core libraries for batch processing and large volume data ingestion into our big data infrastructure.
- Building and maintaining CI/CD orchestration.

What we expect from you:
- Bachelor's or Master's degree in Computer Science or Informatics.
- A minimum of 2 years' experience in Data Engineering.
- Proven experience in software/data engineering and/or operations/DevOps/DataOps.
- Familiarity with the Apache Hadoop ecosystem (YARN, HDFS, HBase, Spark).
- Hands-on experience with both relational and NoSQL databases.
- Proficiency in systems administration with Linux.
- Experience in deploying, configuring, and maintaining distributed systems and data/software engineering tools.

Your key attributes:
- Experience with fluid virtual infrastructures such as containers (e.g., Docker, Kubernetes).
- Familiarity with data and ML flow engines and tools, such as Apache Airflow.
- A strong passion for learning new technologies and collaborating with other creative professionals.

Why you should join us:
We offer a range of benefits including:
- Flexible hybrid working options
- Competitive remuneration package
- An extra day off on your birthday
- Performance-based bonus scheme
- Comprehensive private healthcare insurance
- All the tech gear you need to work efficiently

Experience the Optasia perks:
- Join our multicultural working environment
- Engage with a unique and promising business and industry
- Gain insights into the future market landscape
- Enjoy a solid career path within our working family.
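The role above centers on end-to-end batch pipelines. As an illustrative sketch only (Optasia's actual stack is Scala/Spark; the stage names here are invented), the extract → transform → aggregate shape of a batch job can be expressed with composable generator stages in plain Python:

```python
# Illustrative sketch of a batch pipeline's extract -> transform -> aggregate
# shape. Hypothetical stage names; a real job would run on Spark over
# HDFS/Kafka sources rather than in-memory lists.
from collections import Counter

def extract(source):
    # In a real pipeline this stage would read from durable storage.
    return iter(source)

def transform(records):
    # Normalise fields and drop malformed events before aggregation.
    for r in records:
        country = r.get("country", "").upper()
        if country:
            yield country

def aggregate(countries):
    # Roll up events per country, the kind of reduction a Spark job performs.
    return Counter(countries)

events = [{"country": "gr"}, {"country": "gr"}, {"country": "ke"}, {}]
result = aggregate(transform(extract(events)))
print(result)  # Counter({'GR': 2, 'KE': 1})
```

Keeping each stage a small, independently testable function mirrors the posting's emphasis on deploying libraries "while adhering to the highest standards of testing and code quality".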
Join Cyrex, a dynamic Magic Media company, where creativity meets cutting-edge technology in the media and tech landscape. We collaborate with top developers and publishers in the gaming industry, offering tailored support and innovative solutions through our team of global experts.

At Cyrex, we pride ourselves on delivering exceptional results by embracing modern technology and progressive practices. Our team members are not just responsible for their daily tasks; they actively seek opportunities to enhance and expand their contributions.

We are on the lookout for an energetic Python Agent Developer to join our team, working on thrilling cybersecurity and development initiatives specifically for the gaming sector. The ideal candidate will collaborate with cross-functional teams to create vital products and services for our clients, while also providing outstanding technical support. If you are a proactive, enthusiastic, and innovative thinker, we would love to hear from you!
Join a pioneering leader in education technology, recognized globally for excellence in the assessment and certification of professional skills across more than 200 countries. PeopleCert is actively seeking a talented Data Engineer to enhance our dynamic Data & AI team. This position is crucial in architecting, developing, and sustaining the infrastructure and data solutions that empower our AI-driven projects. The ideal candidate will possess substantial practical experience with Microsoft Azure technologies and demonstrate a strong passion for data engineering practices that facilitate machine learning, advanced analytics, and large-scale data processing.

In this role, you will collaborate closely with the AI Center of Excellence, working alongside data scientists, ML engineers, software developers, analysts, and business stakeholders to enable data accessibility and drive intelligent applications.

Your responsibilities will include:
- Designing, implementing, and maintaining scalable data pipelines and workflows to facilitate AI/ML model training, evaluation, and inference.
- Building and optimizing data integration solutions utilizing Azure data tools such as Synapse Analytics, Azure Data Factory, Databricks, and Delta Lake.
- Partnering with data scientists and AI engineers to ensure data is available in the correct format and quality for modeling purposes.
- Developing and maintaining APIs and data services that power AI-driven applications and insights delivery.
- Supporting the development of data lakes and lakehouses tailored for advanced analytics and AI use cases.
- Writing efficient, reusable Python and SQL code for data processing, cleaning, and transformation.
- Participating in code reviews and knowledge-sharing sessions within the team to cultivate best practices and continuous learning.
- Keeping abreast of emerging tools, cloud services, and trends in data engineering and AI infrastructure.
Full-time | Remote — Thessaloniki, Central Macedonia, Greece
Join our innovative team at EUROPEAN DYNAMICS as an IT Engineer (Helpdesk). We are looking for a dedicated professional to enhance our growing development teams.

Key Responsibilities:
- Promptly address helpdesk tickets;
- Provide operational support for Azure cloud application issues;
- Manage daily ad hoc requests and tasks assigned by your supervisor;
- Assist with DevOps-related responsibilities.
iKnowHow Group is a leading international technology firm with over 24 years of expertise and a dedicated team of more than 300 skilled professionals. We specialize in delivering cutting-edge technology solutions across various sectors, including Energy, Telecommunications, Banking & Financial Services, and the Public Sector.

Our specialized subsidiaries provide in-depth domain knowledge in fields such as Health and Robotics, merging extensive industry insight with advanced, future-oriented technologies.

At the heart of iKnowHow S.A. is our commitment to delivering comprehensive project development, both internally and externally, transforming strategies into scalable, practical solutions. Our extensive portfolio includes Data & AI platforms, enterprise integration, cloud-native applications, and large-scale digital transformation projects, empowering leading organizations in both public and private sectors to adapt and thrive.

We seek inquisitive, motivated individuals eager to make a significant impact through technology and grow alongside us.

We are currently in search of a seasoned Oracle RDBMS and PL/SQL Developer to join our dynamic team, contributing to the design, development, and maintenance of robust business applications and databases.
Work Model: Hybrid | Employment Type: Full-time

About Accepted:
We are a pioneering software and digital transformation services firm, dedicated to helping clients expedite innovation across various sectors including Finance, Energy, Gaming, and Telecommunications. With over 20 years of engineering excellence, we are recognized for our ability to create solutions driven by outcomes and for fostering high-performing teams that integrate seamlessly with your organization.

We are seeking a talented Business Intelligence Engineer to enhance our hybrid delivery teams.

Your Responsibilities
- Dashboard Development: Design and implement interactive, user-centric BI reports;
- Data Modeling: Build and refine tabular models utilizing DAX;
- ETL Pipeline Creation: Develop and manage SQL scripts to construct reliable data pipelines;
- Team Collaboration: Partner with team members to deliver data projects and achieve objectives;
- Quality Assurance: Oversee workspaces, uphold security measures, and enhance report performance.
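The ETL Pipeline Creation duty above amounts to SQL scripts that materialize report-ready tables from raw data. A minimal sketch of that pattern, using SQLite from Python's standard library purely for illustration (the table and column names are invented and do not reflect Accepted's real pipelines):

```python
# Illustrative SQL-based ETL step: load raw rows, then materialise an
# aggregate table a BI dashboard would read from. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_sales (region TEXT, amount REAL);
INSERT INTO raw_sales VALUES ('North', 100.0), ('North', 50.0), ('South', 75.0);

-- The "pipeline" step: pre-aggregate for fast, consistent reporting.
CREATE TABLE sales_by_region AS
SELECT region, SUM(amount) AS total
FROM raw_sales
GROUP BY region;
""")
rows = conn.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region"
).fetchall()
print(rows)  # [('North', 150.0), ('South', 75.0)]
```

Materializing aggregates upstream of the dashboard is one common way to meet the posting's "enhance report performance" goal, since reports then query a small summary table instead of raw data.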
ikh seeks a Senior Data Engineer / Team Lead to join the team in Marousi, Attica, Greece. This position blends technical data engineering with hands-on team leadership. The focus is on building effective data solutions while mentoring engineers as they transform complex data into actionable insights.

Key responsibilities
- Lead a team of data engineers through all project phases, from initial planning to final delivery
- Design and implement data architecture to support business objectives
- Work with large, diverse data sets to extract and deliver meaningful information
- Contribute to the development of data engineering standards and practices within the team

Location
This role is based in Marousi, Attica, Greece.