We are seeking a motivated and experienced Site Manager to join our dynamic team at GEK TERNA. This role involves overseeing construction projects, ensuring they are completed on time, within budget, and to the highest quality standards. As a Site Manager, you will lead site operations, coordinate with subcontractors, and ensure compliance with safety regulations.
Join Kpler as a Business Intelligence Data Engineer where you will play a crucial role in transforming data into actionable insights. You will work with various data sources and be part of a dynamic team focused on enhancing our data platforms. You will have the opportunity to leverage your analytical skills to drive strategic decision-making and contribute to our innovative solutions.
Are you excited about Data & AI? At Satori Analytics, we are redefining the landscape of data and artificial intelligence. Our mission is to empower global brands by providing unparalleled clarity through innovative data solutions. We develop cloud-based ecosystems for fintech and predictive models for airlines, offering cutting-edge solutions that span the entire data lifecycle, from ingestion to AI applications.

As a rapidly growing scale-up, our dynamic team of over 100 tech professionals, including Data Engineers, Data Scientists, and more, delivers transformative analytics solutions across diverse sectors such as FMCG, retail, manufacturing, and financial services. Join us in spearheading the data revolution in South-Eastern Europe and beyond!

What Your Day Might Look Like:

Technical & Delivery Leadership
- Lead the development and enhancement of data engineering standards, best practices, and architectural principles for all Satori projects.
- Serve as a senior technical authority for complex data platforms, including cloud data stacks, pipelines, streaming, and orchestration.
- Assist project teams in solution design, risk management, and technical decision-making processes.
- Evaluate and critique designs to ensure they meet scalability, performance, security, and cost-effectiveness criteria.
- Collaborate with Tech Leads to maintain consistency and quality across projects.

People Management & Leadership
- Oversee Senior Data Engineers and Tech Leads, fostering growth, performance, and career advancement.
- Mentor engineers on technical depth, ownership, communication, and leadership skills.
- Contribute to performance evaluations, development plans, and promotion decisions in line with Satori's competency framework.
- Exemplify Satori's values of collaboration, transparency, and accountability.

Cross-Functional Collaboration
- Work in tandem with Product Owners to align technical solutions with client requirements and delivery constraints.
- Partner with Data Science, AI, and Cloud teams to ensure seamless end-to-end solutions.
- Support presales and discovery phases by providing technical insights, estimations, and solution framing when necessary.

Organizational Impact
- Identify skill gaps, tooling, or process improvements and recommend practical solutions.
- Engage in internal initiatives, such as guilds, playbooks, training, and knowledge sharing.
- Help scale the data engineering capabilities as Satori expands, ensuring quality and culture are preserved.
Elevate your career with us! Join our dynamic development teams in Athens or work remotely as a Data Engineer. In this vital role within our agile team, you will help design and implement cutting-edge big data solutions on a scalable cloud platform. You will analyze millions of real-time data points to extract advanced insights and enhance analytics capabilities for our end users.

Your Responsibilities:
- Develop and implement batch processing pipelines utilizing Spark (Python or Scala) and SQL;
- Design and execute streaming ETL/ELT processes from a variety of data sources;
- Write and maintain code for comprehensive big data solutions, focusing on data integration and analytics use cases;
- Create and implement APIs using contemporary Python frameworks;
- Collaborate effectively with our Business Analysis teams to align technical solutions with business needs;
- Conduct end-to-end and functional testing using open-source tools;
- Set up monitoring solutions for our data platform, including alerts and dashboards.

Essential Qualifications:
- Bachelor's degree in Computer Science or Software Engineering;
- Extensive knowledge of Apache Spark;
- Proficiency in Python and database management;
- Previous experience as a Data Engineer;
- Familiarity with Azure Data Lake Storage and Delta Live Tables;
- Fluency in English, both written and spoken;
- Strong analytical skills and a team-oriented mindset;
- A passion for learning and professional growth in data engineering.

Preferred Qualifications:
- Experience with Databricks;
- Proficiency in API development with FastAPI;
- Familiarity with cloud platforms (AWS, Azure, GCP, etc.);
- Experience with Docker.
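The batch-pipeline responsibility above follows the classic extract-transform-load pattern. As a hedged illustration only, here is that pattern in plain Python with sqlite3 standing in for a warehouse; a real implementation for this role would use Spark DataFrames and a cloud sink, and all record and table names here are invented:

```python
import sqlite3

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: normalise fields and drop incomplete readings."""
    for rec in records:
        if rec["value"] is None:
            continue  # filter out records with no measurement
        yield {"sensor": rec["sensor"].strip().lower(), "value": float(rec["value"])}

def load(records, conn):
    """Load: persist the cleaned records into a table."""
    conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor TEXT, value REAL)")
    conn.executemany("INSERT INTO readings VALUES (:sensor, :value)", list(records))
    conn.commit()

raw = [{"sensor": " Temp ", "value": "21.5"}, {"sensor": "hum", "value": None}]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT sensor, value FROM readings").fetchall())
# → [('temp', 21.5)]
```

The generator-based staging mirrors how streaming and batch variants of the same pipeline can share transformation logic.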
Why Join Us?
We value talent and commitment, offering a range of benefits for our team members, including:
- Competitive full-time salary;
- Comprehensive private health coverage under the company's group program;
- Flexible working hours;
- Access to state-of-the-art tools;
- Opportunities for professional development, including language courses and specialized training;
- Career advancement opportunities with industry-leading specialists;
- A dynamic work environment that encourages personal and professional growth through challenging goals and mentorship.

If you're ready to embrace an exciting challenge, work with cutting-edge technologies, and enjoy your daily tasks, we invite you to apply! Please submit your detailed CV in English, referencing: (SDE/02/26). Explore all our open vacancies by visiting the career section of our website.
Are you enthusiastic about big data and eager to engage with advanced technologies? We invite you to explore a thrilling opportunity as a Data Engineer - Spark Developer with our dynamic and growing development teams. Whether you prefer the vibrant atmosphere of our Athens office or the flexibility of remote work, we are excited to welcome your expertise and passion.

Key Responsibilities:
- Architect, develop, test, deploy, maintain, and enhance data pipelines;
- Implement coding solutions using Apache Spark on Azure Databricks;
- Design big data architectures leveraging Azure Data Factory, Service Bus, BI, Databricks, and other Azure services.

Essential Qualifications:
- Bachelor's degree in Computer Science or Software Engineering;
- Strong analytical mindset, team orientation, dedication to quality, and eagerness to learn;
- Comprehensive understanding of Apache Spark;
- Proven experience as a Data Engineer;
- Advanced proficiency in Python or Scala;
- Expertise in Spark query tuning and performance enhancement;
- Familiarity with cloud platforms such as Azure, AWS, or GCP;
- Fluency in both spoken and written English.

Preferred Qualifications:
- Ability to understand and analyze Directed Acyclic Graph (DAG) operations;
- Experience in providing cost estimates for big data processing;
- Ability to write and review architecture documentation.
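The DAG qualification refers to reading Spark's execution plan, where each stage depends on upstream stages. As a generic sketch only (not Spark's actual API, and with invented stage names), the dependency ordering a DAG implies can be computed with the standard library:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical pipeline stages mapped to their upstream dependencies;
# the names are illustrative, not taken from any real Spark job.
deps = {
    "clean": {"ingest"},
    "join": {"clean", "reference_load"},
    "aggregate": {"join"},
}

# static_order yields one valid execution order respecting all edges.
order = list(TopologicalSorter(deps).static_order())
print(order)  # one valid order, e.g. ingest and reference_load before clean/join
```

Reasoning about which stages can run in parallel (those with no path between them) is essentially what Spark's scheduler does with the same structure.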
Benefits:
We value talent and commitment and offer a range of benefits to our team members:
- Competitive full-time salary;
- Comprehensive private health coverage under the company's group program;
- Flexible working hours;
- Access to state-of-the-art tools;
- Opportunities for professional development, including language courses, specialized training, and continuous learning;
- Career advancement opportunities with leading specialists in the industry;
- A dynamic work environment that promotes challenging goals, autonomy, and mentorship, supporting both personal and company growth.

If you are looking for an exciting challenge, keen to work with innovative technologies, and enjoy your work, we would love to hear from you! Please submit your detailed CV in English, referencing: (DESD/02/26). Explore our other open positions by visiting our career section at www.eurodyn.com and follow us on Twitter (@EURODYN_Careers) and LinkedIn.

European Dynamics (www.eurodyn.com) is a prominent European company specializing in Software, Information, and Communication Technologies, with a robust international presence.
Full-time | On-site | Athens or Ioannina, Sterea Ellada, Greece
Location: Athens or Ioannina, Sterea Ellada, Greece

About Snappi Bank
Snappi Bank is building a neobank from the ground up. The team focuses on financial freedom by delivering transparent, technology-driven digital banking services. The company aims to reshape how people interact with their finances.

Role Overview
The Data Engineer will design, build, and manage the data architecture and pipelines that support data acquisition, storage, processing, and analysis across the organization. This position is open in both the Athens and Ioannina offices.

Main Responsibilities
- Create and maintain data pipelines and infrastructure for efficient ingestion, processing, and storage of large datasets.
- Work with data scientists, analysts, and other stakeholders to understand data needs and translate them into technical solutions.
- Develop and optimize data models and schemas for effective storage and retrieval.
- Build and manage ETL processes to bring data from various sources into data warehouses or lakes.
- Monitor and troubleshoot pipelines to ensure data integrity, reliability, and performance.
- Evaluate and introduce new tools or technologies to improve data processing and operational efficiency.
- Document pipelines, processes, and solutions to support knowledge sharing and maintainability.
- Partner with infrastructure and DevOps teams to deploy and manage data systems in cloud environments.
- Keep up with trends and best practices in data engineering and analytics.

Qualifications
- Bachelor's degree in Computer Science, Electronics, or equivalent experience in data roles.
- Minimum 5 years of experience in a similar position (7+ years preferred; 3-5 years considered for junior roles).
- Strong skills in SQL and Python; experience with Azure Data Factory is a plus.
- Excellent interpersonal skills, including listening, negotiation, and presentation.
- Clear verbal and written communication abilities.
- Attention to detail.
- Effective decision-making, problem analysis, and resolution skills.
- Strong organizational habits.
- Proactive approach to problem-solving.
- Comfort working in a fast-changing environment.
- Interest in agile software processes, data-driven development, reliability, and experimentation; experience with Agile product teams is a plus.

Why Work at Snappi?
Snappi Bank values innovation, trust, and ongoing growth. The team focuses on solutions and results. This is a chance to make a real impact on the future of banking and improve financial services for a broad audience.
Join our innovative team as a Semantic Data Engineer, where you'll play a crucial role in enhancing a sophisticated platform centered on RDF data models, SPARQL queries, and structured datasets. Your primary responsibilities will involve understanding, maintaining, and advancing the semantic layer of our system, collaborating closely with backend engineers and architects. This position is ideal for a passionate specialist with a keen interest in data modeling, semantics, and knowledge representation in real-world production environments.

Key Responsibilities:
- Analyze and maintain RDF/TTL data models and vocabularies;
- Design, optimize, and manage SPARQL queries;
- Facilitate data ingestion, transformation, and validation processes;
- Ensure the consistency and accuracy of semantic data throughout the platform;
- Work alongside backend engineers to integrate semantic logic into application workflows;
- Assist in documenting semantic models, assumptions, and constraints;
- Troubleshoot data quality and reasoning challenges.
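RDF data is a set of subject-predicate-object triples, and SPARQL queries match variable patterns over them. As a toy illustration of that matching (not a real triple store; the `ex:` names are invented), in plain Python:

```python
# Toy RDF-style store: each fact is a (subject, predicate, object) tuple.
triples = {
    ("ex:athens", "ex:locatedIn", "ex:greece"),
    ("ex:ioannina", "ex:locatedIn", "ex:greece"),
    ("ex:greece", "ex:partOf", "ex:europe"),
}

def match(pattern, store):
    """Match a triple pattern; None plays the role of a SPARQL variable."""
    return [t for t in store
            if all(p is None or p == v for p, v in zip(pattern, t))]

# Analogue of: SELECT ?s WHERE { ?s ex:locatedIn ex:greece }
cities = sorted(s for s, _, _ in match((None, "ex:locatedIn", "ex:greece"), triples))
print(cities)  # → ['ex:athens', 'ex:ioannina']
```

Production semantic platforms of the kind described would use a real triple store with a SPARQL engine, but the pattern-matching idea is the same.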
Join Kpler as a Data Engineer specializing in Dry Bulk Commodities. In this pivotal role, you will design, implement, and optimize data pipelines to support our dynamic analytics platform. Collaborate with cross-functional teams to enhance data accessibility and ensure high data quality, driving insights that empower our clients in the commodities market.
Join Our Innovative Team at Kaizen Gaming
Kaizen Gaming, the dynamic force behind Betano, stands as one of the premier GameTech companies globally, serving 19 diverse markets. Our mission is to harness advanced technology to deliver unparalleled entertainment experiences to millions of satisfied customers. Our vibrant workforce consists of over 2,700 talented individuals from more than 40 nationalities spanning three continents. We take pride in being recognized among the Best Workplaces in Europe and are certified as a Great Place to Work in all our offices. At Kaizen Gaming, every day is a new opportunity to excel. Are you ready to Press Play on your career potential?

About the Role
As a Lead Data Scientist, you will spearhead our AI initiatives by analyzing complex datasets and developing machine learning models that drive our innovative AI products. The ideal candidate will possess in-depth knowledge of machine learning algorithms and a proven track record of deploying ML/AI applications in production environments.

Key Responsibilities:
- Convert product specifications into actionable machine learning tasks and pinpoint high-impact AI opportunities.
- Conduct comprehensive data analysis to unearth vital patterns and derive actionable insights.
- Execute exploratory data analysis (EDA) and feature engineering to facilitate the modeling process.
- Implement best practices in model selection, parameter tuning, and validation.
- Conduct comparative experiments to enhance model training.
- Analyze machine learning metrics to assess various solution options.
- Oversee the complete lifecycle of AI features, from data collection through model design to implementation and optimization in live environments.
- Mentor junior team members, sharing expertise and leading intricate projects.
METRO AEBE operates a large network of retail stores in Greece and Cyprus, including My Market, My Market Local, METRO Cash & Carry, and BEST VALUE. With more than 11,000 employees, the company has been recognized as a Top Employer for both 2025 and 2026. As the company continues to expand, it remains committed to business growth alongside sustainable practices. The Data Warehouse (DWH) team in Athens is adding a Data Engineer to support these efforts. Role overview This Data Engineer position focuses on designing, improving, and optimizing enterprise data products. The work directly supports data-driven decision-making throughout METRO AEBE.
Optasia is a dynamic B2B2X financial technology platform that specializes in scoring, financial decision-making, disbursement, and collection. Our mission is to foster financial inclusion globally as we transform the financial landscape. We are looking for passionate and energetic professionals who are results-oriented and possess a proactive mindset. Join our innovative team of like-minded individuals dedicated to delivering cutting-edge solutions in an exciting environment. At Optasia, data is central to our growth strategy, and our Data Engineering team plays a vital role in our success through data-driven insights and decision-making.

As an ML & Data Ops Engineer at Optasia, you will be pivotal in ensuring the availability and performance of our workflows. You will monitor and optimize our existing data system infrastructure, prioritize issue resolution, and promote a culture of continuous improvement. Your contributions will be essential in maintaining high service standards and reinforcing our commitment to client success.

Key Responsibilities:
- Oversee daily monitoring of ML workflows, service alerts, and all data system infrastructure, including storage, applications, databases, and resources.
- Act as an escalation point for critical ML and data incidents, diagnosing issues and resolving or escalating as needed.
- Champion continuous integration, continuous delivery, and automation practices to facilitate seamless software transitions from development to production.
- Manage customer inquiries efficiently to ensure prompt responses and resolutions within defined SLAs.
- Encourage a culture of continuous improvement by identifying enhancements in processes, tools, and technologies.
- Conduct health and quality checks of ML and data workflows to ensure operational excellence.
- Collaborate with cross-functional teams, including ML/Data Engineering, Algorithmic Trading, and System & Network Administration, to resolve internal and client-related issues effectively.
- Develop training materials and deliver sessions, maintaining comprehensive technical documentation.
Join Gek-Terna, a leading construction company in Greece, as a Construction Site Supervisor. In this vital role, you will oversee construction activities, ensuring compliance with safety standards and project specifications. Your leadership will guide the team in executing tasks efficiently and effectively, contributing to the successful delivery of our projects.
Join Optasia, a pioneering B2B2X financial technology platform that is revolutionizing financial inclusion through advanced scoring, decision-making, disbursement, and collection solutions. We are dedicated to changing the world of finance for the better. At Optasia, we thrive on innovation, and we are looking for passionate individuals who are results-oriented and eager to collaborate in a dynamic team environment. Data is the backbone of our growth strategy, and the Data Engineering team plays a crucial role in driving data-driven automated decision-making processes. We invite you to become part of our expanding Data Engineering team as a Junior Data Engineer. In this role, you will have the opportunity to design and implement robust batch and streaming data pipelines, significantly contributing to our mission.

Your Responsibilities
- Collaborate with Data Architects, Machine Learning Engineers, and Data Analysts to design, deliver, and support the company's ETL and ML pipelines.
- Develop and maintain essential libraries for batch processing and large-scale data ingestion.
- Create and implement new processes to facilitate real-time data streaming from various sources.
- Enhance the efficiency and accuracy of data ingestion into our big data infrastructure.
- Explore and apply innovative data engineering patterns and technologies.

Your Qualifications
- Bachelor's or Master's degree in Computer Science, Informatics, or a related field.
- Understanding of data modeling, mining, and warehousing methodologies.
- Proficiency in at least one programming language, preferably Java, Scala, or Python.
- Solid SQL skills.
- Strong analytical abilities with meticulous attention to detail.
- A genuine enthusiasm for learning new technologies and working collaboratively with creative professionals.

Preferred Qualifications
- Practical experience with Big Data technologies (e.g., Apache Spark, YARN, HDFS, MapReduce).
- Experience with NoSQL databases.
- Familiarity with mathematical modeling, algorithm development, and machine learning.

Why Optasia? What we offer:
- Flexible hybrid working environment
- Competitive salary package
- Additional day off on your birthday
- Performance-based bonuses
- Comprehensive private health insurance
- All necessary tech gear to work efficiently

Join our multicultural team at Optasia and be part of a unique and rewarding workplace!
Join our dynamic team at Netcompany as a Senior Business Intelligence Engineer! In this pivotal role, you will leverage your expertise to design and implement data-driven solutions that enhance business performance. Your contributions will be crucial in transforming data into actionable insights that drive strategic decision-making.
Join our team as a Senior Backend Engineer (Python) and engage in a challenging project that focuses on data-intensive processing, system integration, and semantic technologies. In this pivotal role, you will take charge of an established codebase, enhancing its stability and evolving it to meet the project's demands. This position is perfect for seasoned engineers who thrive on working with complex systems and have a solid grasp of data-driven architectures.

Key Responsibilities:
- Thoroughly analyze and take ownership of an existing Python-based backend.
- Refactor and optimize backend components with an emphasis on maintainability and performance.
- Develop and sustain data processing pipelines and integration workflows.
- Collaborate closely with data and semantic engineers on RDF/SPARQL-driven processes.
- Contribute to the architectural redesign and technical documentation.
- Assist with deployment, configuration, and troubleshooting tasks.
- Ensure high code quality through rigorous reviews, testing, and adherence to best practices.

Requirements:
- A minimum of 3 years of professional experience in backend development.
- Expertise in Python programming.
- Experience with Apache Airflow and AWS; data-intensive or integration-heavy systems; APIs, batch processing, and backend services; and configuration-driven systems (XML/JSON/YAML).
- Strong understanding of software architecture and design patterns, and of debugging and maintaining legacy codebases.
- Proven experience in complex, multi-stakeholder projects (experience in EU or public-sector projects is a plus).

Preferred Qualifications:
- Familiarity with Semantic Web technologies (RDF, SPARQL, OWL).
- Experience in data modeling or knowledge-based systems.
- Exposure to DevOps practices (CI/CD, containerization).
- Experience contributing to or maintaining technical documentation (e.g., AsciiDoc, Antora).

Benefits:
We value talent and commitment, offering the following perks:
- Attractive full-time salary.
- Private health insurance under the company's group plan.
- Flexible working hours.
- Access to top-quality tools.
- Opportunities for professional development, including language courses and specialized training.
- Career advancement potential by collaborating with leading specialists in the field.
- A dynamic work environment that encourages personal and professional growth.

If you're ready for an exciting challenge, we would love to hear from you!
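A configuration-driven system, as mentioned in the requirements above, derives behavior from declarative config rather than hard-coded logic. A minimal JSON-driven sketch (the `steps`/`op` schema and operation names are invented for illustration; the real system might load XML or YAML from files instead):

```python
import json

# Hypothetical pipeline configuration; editing this string, not the code,
# changes what the pipeline does.
config = json.loads("""
{
  "steps": [
    {"op": "strip"},
    {"op": "lower"},
    {"op": "prefix", "arg": "id-"}
  ]
}
""")

# Each config "op" maps to a small function, so new behaviour is a new
# config entry plus (at most) one new registered operation.
OPS = {
    "strip": lambda s, a=None: s.strip(),
    "lower": lambda s, a=None: s.lower(),
    "prefix": lambda s, a="": a + s,
}

def run(value, cfg):
    """Apply each configured step to the value in order."""
    for step in cfg["steps"]:
        value = OPS[step["op"]](value, step.get("arg"))
    return value

print(run("  ABC-42 ", config))  # → id-abc-42
```

This registry-of-operations shape is a common way such systems keep the config declarative while the code stays small.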
Location: Athens, Greece
Company: d-one

About d-one
d-one is a consultancy focused on data, artificial intelligence, and transformation strategies. The firm helps organizations turn their ambitions into tangible results by aligning strategy, building strong data foundations, delivering large-scale solutions, and fostering trust through effective governance and a people-first approach. With offices in Zurich, Munich, and Athens, d-one supports clients across Europe and covers the entire data value chain using a pragmatic, people-centered methodology.

Role Overview: Data & AI Engineer
Based in the Athens consulting hub, the Data & AI Engineer works closely with project teams to advise clients through the conception, planning, and implementation of data and AI solutions. The role involves hands-on collaboration and occasional travel to Switzerland or other international locations for client work and team meetings at the Zurich headquarters.

Key Responsibilities
- Analyze client requirements and design tailored data solutions.
- Contribute to projects in areas such as cloud engineering, infrastructure, data engineering (ETL), data science, and generative AI applications.
- Communicate findings and insights to stakeholders and management.
- Travel occasionally for client engagement and collaboration with the Zurich team.

What We Look For
- University degree in Computer Science or a related field.
- Proficiency in technologies and programming languages such as SQL, Python, and Spark.
- Experience working with large datasets, streaming data, and databases.
- Background in cloud development and architecture, especially with Azure, AWS, or GCP.
- Hands-on experience with Databricks, Microsoft Fabric, or Palantir Foundry.
- A pragmatic, down-to-earth, results-focused attitude.
- Excellent communication skills in English.

What d-one Offers
- A calm, focused work environment that values what matters most.
- An international team of talented, ambitious colleagues who enjoy both challenges and camaraderie.
- Opportunities for growth through the D ONE Academy, including mentoring, knowledge-sharing, expert groups, and tailored soft-skill and leadership training.
- Regular offsite social events.

What Helps You Succeed Here
- Commitment to fully understanding client needs.
- Ability to deliver clear, high-value results.
- Intellectual agility and a client-friendly approach to problem-solving.
- Appreciation for creative, collaborative teamwork.
Join Our Team at Kaizen Gaming!
Kaizen Gaming, the driving force behind Betano, is a leading GameTech company with operations in 20 global markets. We are dedicated to harnessing the latest technology to deliver exceptional entertainment experiences to millions of our valued customers. Our diverse workforce consists of over 2,700 Kaizeners from more than 40 nationalities across three continents. We take pride in being recognized as one of the Best Workplaces in Europe and as a certified Great Place to Work. Here, each day offers unique challenges and opportunities. Are you ready to unleash your potential?

About the Role
As a Senior AI Data Governance Analyst, you will play an essential role within our Data/AI Governance team, contributing to the design and execution of a comprehensive AI governance framework across the organization. Collaborating with both business and technical stakeholders, you will ensure the integrity, consistency, availability, and compliance of AI models and applications, which is vital in the highly regulated betting and gaming industry.
Key Responsibilities:
- Develop and enforce AI governance policies, standards, and procedures.
- Facilitate the implementation of AI metadata management, data lineage, and quality frameworks.
- Profile datasets to evaluate their accuracy, completeness, consistency, and integrity for effective AI utilization.
- Identify and document AI quality issues, and recommend corrective measures.
- Monitor and analyze AI quality KPIs, supporting remediation initiatives.
- Maintain AI quality dashboards and reports to communicate trends to stakeholders.
- Engage in cross-functional meetings to align AI governance priorities.
- Enhance AI business glossaries, data dictionaries, and catalogs.
- Collaborate with AI/Data owners, stewards, and custodians to promote governance best practices.
- Assist with compliance-related AI & Data initiatives (e.g., EU AI Act, GDPR, AML, Responsible Gaming reporting).
- Evaluate and support AI/Data governance tools (e.g., Atlan, Collibra, Informatica, Microsoft Purview).
- Contribute to the continuous evolution of the AI governance framework and best practices.
Do you have a passion for Artificial Intelligence? At Satori Analytics, we are on a mission to revolutionize the world through innovative algorithms that provide insights to global brands using Data & AI. Our state-of-the-art solutions encompass the full data lifecycle, from data ingestion to advanced AI applications, serving various sectors including fintech, airlines, FMCG, retail, manufacturing, and financial services. As a rapidly expanding scale-up with a team of over 100 technology specialists, including Data Engineers, Data Scientists, and more, we are at the forefront of the data revolution in South-Eastern Europe and beyond. Come and be part of our journey!

Your Daily Responsibilities:
- Collaborate and Learn: Work alongside Solution Architects, Developers, and Business Analysts to ensure the successful delivery of projects.
- Build Data Solutions: Assist in the design and development of ETL pipelines utilizing tools such as Azure Data Factory and Databricks.
- Get Hands-On: Participate in data migrations, cloud transitions, and automation of routine tasks.
- Solve Problems: Help troubleshoot and debug data processes under the mentorship of senior engineers.
- Grow Your Skills: Thrive in a dynamic environment with ample opportunities for professional development.
About Us: At tbi bank, we are a dynamic challenger bank in Southeast Europe, recognized as a regional leader in alternative payment solutions. Our innovative ecosystem merges financing and shopping to meet the diverse needs of our customers. With a successful and profitable business model, we are proud to serve clients in Romania, Germany, Bulgaria, and Lithuania. In our quest to expand our global presence, we have established a significant footprint in Greece, collaborating with thousands of merchants and consumers. Are you ready to contribute to our unique success story?

Your Role: We are seeking a Senior Data Analyst to join our dedicated team in Greece!

Key Responsibilities:
- Conduct data discovery to identify insights.
- Collaborate with partners to explore potential data sources and develop data interpretation logic.
- Acquire data from both primary and secondary sources.
- Analyze and interpret data, providing regular and ad-hoc reports and insights.
- Engage in the continuous development of our Data Warehouse (DWH) facilities.
- Work with extensive datasets from multiple sources to create cohesive data reports.
- Utilize data manipulation and transformation tools effectively.
Role overview
d-one seeks a Senior Databricks Engineer in Athens, Greece. The focus is on developing and enhancing data solutions that help meet business objectives. Candidates should bring strong hands-on experience with Databricks and a solid track record in designing, implementing, and improving analytics infrastructure.

What you will do
- Design and build data pipelines and solutions with Databricks
- Collaborate with cross-functional teams to gather requirements and turn them into technical deliverables
- Develop and maintain ETL processes
- Optimize data workflows for better performance and scalability
- Contribute to the ongoing development of the company's data strategy

Location
This role is based in Athens, Attica, Greece.
We are seeking a motivated and experienced Site Manager to join our dynamic team at GEK TERNA. This role involves overseeing construction projects, ensuring they are completed on time, within budget, and to the highest quality standards. As a Site Manager, you will lead site operations, coordinate with subcontractors, and ensure compliance with safety regula…
Join Kpler as a Business Intelligence Data Engineer where you will play a crucial role in transforming data into actionable insights. You will work with various data sources and be part of a dynamic team focused on enhancing our data platforms. You will have the opportunity to leverage your analytical skills to drive strategic decision-making and contribute to our innovative solutions.
Are you excited about Data & AI? At Satori Analytics, we are redefining the landscape of data and artificial intelligence. Our mission is to empower global brands by providing unparalleled clarity through innovative data solutions. We develop cloud-based ecosystems for fintech and predictive models for airlines, offering cutting-edge solutions that span the entire data lifecycle, from ingestion to AI applications. As a rapidly growing scale-up, our dynamic team of over 100 tech professionals, including Data Engineers, Data Scientists, and more, delivers transformative analytics solutions across diverse sectors such as FMCG, retail, manufacturing, and financial services. Join us in spearheading the data revolution in South-Eastern Europe and beyond!

What Your Day Might Look Like:

Technical & Delivery Leadership
- Lead the development and enhancement of data engineering standards, best practices, and architectural principles for all Satori projects.
- Serve as a senior technical authority for complex data platforms, including cloud data stacks, pipelines, streaming, and orchestration.
- Assist project teams in solution design, risk management, and technical decision-making processes.
- Evaluate and critique designs to ensure they meet scalability, performance, security, and cost-effectiveness criteria.
- Collaborate with Tech Leads to maintain consistency and quality across projects.

People Management & Leadership
- Oversee Senior Data Engineers and Tech Leads, fostering growth, performance, and career advancement.
- Mentor engineers on technical depth, ownership, communication, and leadership skills.
- Contribute to performance evaluations, development plans, and promotion decisions in line with Satori’s competency framework.
- Exemplify Satori’s values of collaboration, transparency, and accountability.

Cross-Functional Collaboration
- Work in tandem with Product Owners to align technical solutions with client requirements and delivery constraints.
- Partner with Data Science, AI, and Cloud teams to ensure seamless end-to-end solutions.
- Support presales and discovery phases by providing technical insights, estimations, and solution framing when necessary.

Organizational Impact
- Identify skill gaps, tooling, or process improvements and recommend practical solutions.
- Engage in internal initiatives, such as guilds, playbooks, training, and knowledge sharing.
- Help scale the data engineering capabilities as Satori expands, ensuring quality and culture are preserved.
Elevate your career with us! Join our dynamic development teams in Athens or work remotely as a Data Engineer. In this vital role within our agile team, you will help design and implement cutting-edge big data solutions on a scalable cloud platform. You will analyze millions of real-time data points to extract advanced insights and enhance analytics capabilities for our end users.

Your Responsibilities:
- Develop and implement batch processing pipelines utilizing Spark (Python or Scala) and SQL;
- Design and execute streaming ETL/ELT processes from a variety of data sources;
- Write and maintain code for developing comprehensive big data solutions, focusing on data integration and analytics use cases;
- Create and implement APIs using contemporary Python frameworks;
- Collaborate effectively with our Business Analysis teams to align technical solutions with business needs;
- Conduct end-to-end and functional testing using open-source tools;
- Set up monitoring solutions for our data platform, including alerts and dashboards.

Essential Qualifications:
- Bachelor’s degree in Computer Science or Software Engineering;
- Extensive knowledge of Apache Spark;
- Proficiency in Python and database management;
- Previous experience as a Data Engineer;
- Familiarity with Azure Data Lake Storage and Delta Live Tables;
- Fluency in English, both written and spoken;
- Strong analytical skills and a team-oriented mindset;
- A passion for learning and professional growth in data engineering.

Preferred Qualifications:
- Experience with Databricks;
- Proficiency in API development with FastAPI;
- Familiarity with cloud platforms (AWS, Azure, GCP, etc.);
- Experience with Docker.
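The batch-pipeline responsibility named above (extract raw records, transform them, load the result into SQL storage) follows a standard ETL shape. Below is a minimal standard-library sketch of that pattern; the record fields and table name are hypothetical, and a production version at this scale would express the same steps with Spark DataFrames rather than plain Python lists.

```python
import sqlite3

# Hypothetical raw records, standing in for an extract from a source system.
RAW_EVENTS = [
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": "3.0"},
    {"user": "a", "amount": "7.5"},
]

def transform(rows):
    """Cast string amounts to floats and aggregate per user -- the 'T' in ETL."""
    totals = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + float(row["amount"])
    return sorted(totals.items())

def load(conn, rows):
    """Write the aggregates into a SQL table -- the 'L' in ETL."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS user_totals (user TEXT PRIMARY KEY, total REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO user_totals VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(conn, transform(RAW_EVENTS))
    print(conn.execute("SELECT user, total FROM user_totals ORDER BY user").fetchall())
    # [('a', 18.0), ('b', 3.0)]
```

The same extract/transform/load split carries over directly: in Spark the transform step would be a `groupBy`/`sum`, and the load step a write to Delta or a warehouse table.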
Why Join Us?
We value talent and commitment, offering a range of benefits for our team members, including:
- Competitive full-time salary;
- Comprehensive private health coverage under the company’s group program;
- Flexible working hours;
- Access to state-of-the-art tools;
- Opportunities for professional development, including language courses and specialized training;
- Career advancement opportunities with industry-leading specialists;
- A dynamic work environment that encourages personal and professional growth through challenging goals and mentorship.

If you're ready to embrace an exciting challenge, work with cutting-edge technologies, and enjoy your daily tasks, we invite you to apply! Please submit your detailed CV in English, referencing: (SDE/02/26). Explore all our open vacancies by visiting the career section of our website.
Are you enthusiastic about big data and eager to engage with advanced technologies? We invite you to explore a thrilling opportunity as a Data Engineer - Spark Developer with our dynamic and growing development teams. Whether you prefer the vibrant atmosphere of our Athens office or the flexibility of remote work, we are excited to welcome your expertise and passion.

Key Responsibilities:
- Architect, develop, test, deploy, maintain, and enhance data pipelines;
- Implement coding solutions using Apache Spark on Azure Databricks;
- Create and design big data architectures leveraging Azure Data Factory, Service Bus, BI, Databricks, and other Azure services.

Essential Qualifications:
- Bachelor's degree in Computer Science or Software Engineering;
- Strong analytical mindset, team-oriented, dedicated to quality, and eager to learn;
- Comprehensive understanding of Apache Spark;
- Proven experience as a Data Engineer;
- Advanced proficiency in Python or Scala;
- Expertise in Spark query tuning and performance enhancement;
- Familiarity with cloud platforms such as Azure, AWS, or GCP;
- Fluent in both spoken and written English.

Preferred Qualifications:
- Ability to understand and analyze Directed Acyclic Graph (DAG) operations;
- Experience in providing cost estimates for big data processing;
- Capability to write and review architecture documentation.
Benefits:
We value talent and commitment and offer a range of benefits to our team members:
- Competitive full-time salary;
- Comprehensive private health coverage under the company’s group program;
- Flexible working hours;
- Access to state-of-the-art tools;
- Opportunities for professional development, including language courses, specialized training, and continuous learning;
- Career advancement opportunities with leading specialists in the industry;
- A dynamic work environment that promotes challenging goals, autonomy, and mentorship, supporting both personal and company growth.

If you are looking for an exciting challenge, keen to work with innovative technologies, and enjoy your work, we would love to hear from you! Please submit your detailed CV in English, referencing: (DESD/02/26). Explore our other open positions by visiting our career section at www.eurodyn.com and follow us on Twitter (@EURODYN_Careers) and LinkedIn.

European Dynamics (www.eurodyn.com) is a prominent European company specializing in Software, Information, and Communication Technologies, with a robust international presence.
Full-time | On-site | Athens or Ioannina, Sterea Ellada, Greece
Location: Athens or Ioannina, Sterea Ellada, Greece

About Snappi Bank
Snappi Bank is building a neobank from the ground up. The team focuses on financial freedom by delivering transparent, technology-driven digital banking services. The company aims to reshape how people interact with their finances.

Role Overview
The Data Engineer will design, build, and manage data architecture and pipelines that support data acquisition, storage, processing, and analysis across the organization. This position is open in both the Athens and Ioannina offices.

Main Responsibilities
- Create and maintain data pipelines and infrastructure for efficient ingestion, processing, and storage of large datasets.
- Work with data scientists, analysts, and other stakeholders to understand data needs and translate them into technical solutions.
- Develop and optimize data models and schemas for effective storage and retrieval.
- Build and manage ETL processes to bring data from various sources into data warehouses or lakes.
- Monitor and troubleshoot pipelines to ensure data integrity, reliability, and performance.
- Evaluate and introduce new tools or technologies to improve data processing and operational efficiency.
- Document pipelines, processes, and solutions to support knowledge sharing and maintainability.
- Partner with infrastructure and DevOps teams to deploy and manage data systems in cloud environments.
- Keep up with trends and best practices in data engineering and analytics.

Qualifications
- Bachelor’s degree in Computer Science, Electronics, or equivalent experience in data roles.
- Minimum 5 years of experience in a similar position (7+ years preferred; 3-5 years considered for junior roles).
- Strong skills in SQL and Python; experience with Azure Data Factory is a plus.
- Excellent interpersonal skills, including listening, negotiation, and presentation.
- Clear verbal and written communication abilities.
- Attention to detail.
- Effective decision-making, problem analysis, and resolution skills.
- Strong organizational habits.
- Proactive approach to problem-solving.
- Comfort working in a fast-changing environment.
- Interest in agile software processes, data-driven development, reliability, and experimentation; experience with Agile product teams is a plus.

Why Work at Snappi?
Snappi Bank values innovation, trust, and ongoing growth. The team focuses on solutions and results. This is a chance to make a real impact on the future of banking and improve financial services for a broad audience.
Join our innovative team as a Semantic Data Engineer, where you'll play a crucial role in enhancing a sophisticated platform centered around RDF data models, SPARQL queries, and structured datasets. Your primary responsibilities will involve comprehending, maintaining, and advancing the semantic layer of our system, collaborating closely with backend engineers and architects. This position is ideal for a passionate specialist with a keen interest in data modeling, semantics, and knowledge representation within real-world production environments.

Key Responsibilities:
- Analyze and uphold RDF/TTL data models and vocabularies;
- Design, optimize, and manage SPARQL queries;
- Facilitate data ingestion, transformation, and validation processes;
- Ensure the consistency and accuracy of semantic data throughout the platform;
- Work alongside backend engineers to integrate semantic logic into application workflows;
- Assist in documenting semantic models, assumptions, and constraints;
- Engage in troubleshooting data quality and reasoning challenges.
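The RDF/SPARQL work described above rests on one core idea: data modeled as subject-predicate-object triples and queried by pattern matching. The toy matcher below illustrates that idea with only the standard library; the prefixed names are hypothetical, and a real platform would use an RDF library (e.g., rdflib) with an actual SPARQL engine rather than this simplified stand-in.

```python
# A toy in-memory triple store. Each entry is a (subject, predicate, object)
# triple; all identifiers here are hypothetical examples.
TRIPLES = {
    ("ex:alice", "rdf:type", "ex:Engineer"),
    ("ex:bob", "rdf:type", "ex:Engineer"),
    ("ex:alice", "ex:worksOn", "ex:PlatformX"),
}

def match(triples, s=None, p=None, o=None):
    """Return triples matching a pattern; None behaves like a SPARQL variable."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# Rough analogue of: SELECT ?s WHERE { ?s rdf:type ex:Engineer }
engineers = [s for s, _, _ in match(TRIPLES, p="rdf:type", o="ex:Engineer")]
print(engineers)  # ['ex:alice', 'ex:bob']
```

A SPARQL engine generalizes this to joins across several patterns, plus filtering and inference, but the triple-pattern matching shown here is the primitive everything else builds on.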
Join Kpler as a Data Engineer specializing in Dry Bulk Commodities. In this pivotal role, you will design, implement, and optimize data pipelines to support our dynamic analytics platform. Collaborate with cross-functional teams to enhance data accessibility and ensure high data quality, driving insights that empower our clients in the commodities market.
Join Our Innovative Team at Kaizen Gaming
Kaizen Gaming, the dynamic force behind Betano, stands as one of the premier GameTech companies globally, serving 19 diverse markets. Our mission is to harness advanced technology to deliver unparalleled entertainment experiences to millions of satisfied customers. Our vibrant workforce consists of over 2,700 talented individuals from more than 40 nationalities spanning three continents. We take pride in being recognized among the Best Workplaces in Europe and are certified as a Great Place to Work in all our offices. At Kaizen Gaming, every day is a new opportunity to excel. Are you ready to Press Play on your career potential?

About the Role
As a Lead Data Scientist, you will spearhead our AI initiatives by analyzing complex datasets and developing machine learning models that drive our innovative AI products. The ideal candidate will possess an in-depth knowledge of machine learning algorithms and a proven track record of deploying ML/AI applications in production environments.

Key Responsibilities:
- Convert product specifications into actionable machine learning tasks and pinpoint high-impact AI opportunities.
- Conduct comprehensive data analysis to unearth vital patterns and derive actionable insights.
- Execute exploratory data analysis (EDA) and feature engineering to facilitate the modeling process.
- Implement best practices in model selection, parameter tuning, and validation.
- Conduct comparative experiments to enhance model training.
- Analyze machine learning metrics to assess various solution options.
- Oversee the complete lifecycle of AI features, from data collection through model design to implementation and optimization in live environments.
- Mentor junior team members, sharing expertise and leading intricate projects.
METRO AEBE operates a large network of retail stores in Greece and Cyprus, including My Market, My Market Local, METRO Cash & Carry, and BEST VALUE. With more than 11,000 employees, the company has been recognized as a Top Employer for both 2025 and 2026. As the company continues to expand, it remains committed to business growth alongside sustainable practices. The Data Warehouse (DWH) team in Athens is adding a Data Engineer to support these efforts. Role overview This Data Engineer position focuses on designing, improving, and optimizing enterprise data products. The work directly supports data-driven decision-making throughout METRO AEBE.
Optasia is a dynamic B2B2X financial technology platform that specializes in scoring, financial decision-making, disbursement, and collection. Our mission is to foster financial inclusion globally as we transform the financial landscape. We are looking for passionate and energetic professionals who are results-oriented and possess a proactive mindset. Join our innovative team of like-minded individuals dedicated to delivering cutting-edge solutions in an exciting environment. At Optasia, data is central to our growth strategy, and our Data Engineering team plays a vital role in our success through data-driven insights and decision-making.

As an ML & Data Ops Engineer at Optasia, you will be pivotal in ensuring the availability and performance of our workflows. You will monitor and optimize our existing data system infrastructure, prioritize issue resolution, and promote a culture of continuous improvement. Your contributions will be essential in maintaining high service standards and reinforcing our commitment to client success.

Key Responsibilities:
- Oversee daily monitoring of ML workflows, service alerts, and all data system infrastructure, including storage, applications, databases, and resources.
- Act as an escalation point for critical ML and data incidents, diagnosing issues and resolving or escalating as needed.
- Champion continuous integration, continuous delivery, and automation practices to facilitate seamless software transitions from development to production.
- Manage customer inquiries efficiently to ensure prompt responses and resolutions within defined SLAs.
- Encourage a culture of continuous improvement by identifying enhancements in processes, tools, and technologies.
- Conduct health and quality checks of ML and data workflows to ensure operational excellence.
- Collaborate with cross-functional teams, including ML/Data Engineering, Algorithmic Trading, and System & Network Administration, to resolve internal and client-related issues effectively.
- Develop training materials and deliver sessions, maintaining comprehensive technical documentation.
Join GEK TERNA, a leading construction company in Greece, as a Construction Site Supervisor. In this vital role, you will oversee construction activities, ensuring compliance with safety standards and project specifications. Your leadership will guide the team in executing tasks efficiently and effectively, contributing to the successful delivery of our projects.
Join Optasia, a pioneering B2B2X financial technology platform that is revolutionizing financial inclusion through advanced scoring, decision-making, disbursement, and collection solutions. We are dedicated to changing the world of finance for the better. At Optasia, we thrive on innovation, and we are looking for passionate individuals who are results-oriented and eager to collaborate in a dynamic team environment. Data is the backbone of our growth strategy, and the Data Engineering team plays a crucial role in driving data-driven automated decision-making processes.

We invite you to become a part of our expanding Data Engineering team as a Junior Data Engineer. In this role, you will have the opportunity to design and implement robust batch and streaming data pipelines, significantly contributing to our mission.

Your Responsibilities
- Collaborate with Data Architects, Machine Learning Engineers, and Data Analysts to design, deliver, and support the company's ETL and ML pipelines.
- Develop and maintain essential libraries for batch processing and large-scale data ingestion.
- Create and implement new processes to facilitate real-time data streaming from various sources.
- Enhance the efficiency and accuracy of data ingestion into our big data infrastructure.
- Explore and apply innovative data engineering patterns and technologies.

Your Qualifications
- Bachelor's or Master's degree in Computer Science, Informatics, or a related field.
- Understanding of data modeling, mining, and warehousing methodologies.
- Proficiency in at least one programming language, preferably Java, Scala, or Python.
- Solid SQL skills.
- Strong analytical abilities with meticulous attention to detail.
- A genuine enthusiasm for learning new technologies and working collaboratively with creative professionals.

Preferred Qualifications
- Practical experience with Big Data technologies (e.g., Apache Spark, YARN, HDFS, MapReduce).
- Experience with NoSQL databases.
- Familiarity with mathematical modeling, algorithm development, and machine learning.

Why Optasia? What we offer:
- Flexible hybrid working environment
- Competitive salary package
- Additional day off on your birthday
- Performance-based bonuses
- Comprehensive private health insurance
- All necessary tech gear to work efficiently

Join our multicultural team at Optasia and be part of a unique and rewarding workplace!
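The real-time streaming responsibility mentioned in the Junior Data Engineer posting usually reduces to consuming an unbounded source in small batches, transforming each batch, and dropping malformed events along the way. Here is a minimal generator-based sketch of that micro-batch pattern using only the standard library; the event data is hypothetical, and a production pipeline would read from a broker such as Kafka or a cloud equivalent rather than an in-memory iterator.

```python
from itertools import islice

def micro_batches(stream, size):
    """Group an unbounded iterator into fixed-size batches."""
    it = iter(stream)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def process(batch):
    """Stand-in transformation: parse events, dropping malformed ones."""
    return [int(x) for x in batch if x.isdigit()]

# Hypothetical event stream: mostly numeric payloads, one malformed record.
events = iter(["1", "2", "oops", "3", "4", "5"])
results = [process(b) for b in micro_batches(events, 2)]
print(results)  # [[1, 2], [3], [4, 5]]
```

The generator never materializes the whole stream, which is the property that lets the same shape scale to unbounded sources; frameworks like Spark Structured Streaming apply the batch transformation logic over managed micro-batches in much the same way.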
Join our dynamic team at Netcompany as a Senior Business Intelligence Engineer! In this pivotal role, you will leverage your expertise to design and implement data-driven solutions that enhance business performance. Your contributions will be crucial in transforming data into actionable insights that drive strategic decision-making.
Join our team as a Senior Backend Engineer (Python) and engage in a challenging project that focuses on data-intensive processing, system integration, and semantic technologies. In this pivotal role, you will take charge of an established codebase, enhancing its stability and evolving it to meet the project’s demands. This position is perfect for seasoned engineers who thrive on working with complex systems and have a solid grasp of data-driven architectures.

Key Responsibilities:
- Thoroughly analyze and take ownership of an existing Python-based backend.
- Refactor and optimize backend components with an emphasis on maintainability and performance.
- Develop and sustain data processing pipelines and integration workflows.
- Collaborate closely with data and semantic engineers on RDF/SPARQL-driven processes.
- Contribute to the architectural redesign and technical documentation.
- Assist with deployment, configuration, and troubleshooting tasks.
- Ensure high code quality through rigorous reviews, testing, and adherence to best practices.

Requirements:
- A minimum of 3 years of professional experience in backend development.
- Expertise in Python programming.
- Experience with Apache Airflow and AWS; data-intensive or integration-heavy systems; APIs, batch processing, and backend services; and configuration-driven systems (XML/JSON/YAML).
- Strong understanding of software architecture and design patterns, and of debugging and maintaining legacy codebases.
- Proven experience in complex, multi-stakeholder projects (experience in EU or public-sector projects is a plus).

Preferred Qualifications:
- Familiarity with Semantic Web technologies (RDF, SPARQL, OWL).
- Experience in data modeling or knowledge-based systems.
- Exposure to DevOps practices (CI/CD, containerization).
- Experience contributing to or maintaining technical documentation (e.g., AsciiDoc, Antora).

Benefits:
We value talent and commitment, offering the following perks:
- Attractive full-time salary.
- Private health insurance under the company’s group plan.
- Flexible working hours.
- Access to top-quality tools.
- Opportunities for professional development, including language courses and specialized training.
- Career advancement potential by collaborating with leading specialists in the field.
- A dynamic work environment that encourages personal and professional growth.

If you're ready for an exciting challenge, we would love to hear from you!
Location: Athens, Greece
Company: d-one

About d-one
d-one is a consultancy focused on data, artificial intelligence, and transformation strategies. The firm helps organizations turn their ambitions into tangible results by aligning strategy, building strong data foundations, delivering large-scale solutions, and fostering trust through effective governance and a people-first approach. With offices in Zurich, Munich, and Athens, d-one supports clients across Europe and covers the entire data value chain using a pragmatic, people-centered methodology.

Role Overview: Data & AI Engineer
Based in the Athens consulting hub, the Data & AI Engineer works closely with project teams to advise clients through the conception, planning, and implementation of data and AI solutions. The role involves hands-on collaboration and occasional travel to Switzerland or other international locations for client work and team meetings at the Zurich headquarters.

Key Responsibilities
- Analyze client requirements and design tailored data solutions.
- Contribute to projects in areas such as cloud engineering, infrastructure, data engineering (ETL), data science, and generative AI applications.
- Communicate findings and insights to stakeholders and management.
- Travel occasionally for client engagement and collaboration with the Zurich team.

What We Look For
- University degree in Computer Science or a related field.
- Proficiency in technologies and programming languages such as SQL, Python, and Spark.
- Experience working with large datasets, streaming data, and databases.
- Background in cloud development and architecture, especially with Azure, AWS, or GCP.
- Hands-on experience with Databricks, Microsoft Fabric, or Palantir Foundry.
- Pragmatic, down-to-earth, and focused on results.
- Excellent communication skills in English.

What d-one Offers
- A calm, focused work environment that values what matters most.
- An international team of talented, ambitious colleagues who enjoy both challenges and camaraderie.
- Opportunities for growth through the D ONE Academy, including mentoring, knowledge-sharing, expert groups, and tailored soft-skill and leadership training.
- Regular offsite social events.

What Helps You Succeed Here
- Commitment to fully understanding client needs.
- Ability to deliver clear, high-value results.
- Intellectual agility and a client-friendly approach to problem-solving.
- Appreciation for creative, collaborative teamwork.
Join Our Team at Kaizen Gaming!
Kaizen Gaming, the driving force behind Betano, is a leading GameTech company with operations in 20 global markets. We are dedicated to harnessing the latest technology to deliver exceptional entertainment experiences to millions of our valued customers. Our diverse workforce consists of over 2,700 Kaizeners from more than 40 nationalities across three continents. We take pride in being recognized as one of the Best Workplaces in Europe and as a certified Great Place to Work. Here, each day offers unique challenges and opportunities. Are you ready to unleash your potential?

About the Role
As a Senior AI Data Governance Analyst, you will play an essential role within our Data/AI Governance team, contributing to the design and execution of a comprehensive AI governance framework across the organization. Collaborating with both business and technical stakeholders, you will ensure the integrity, consistency, availability, and compliance of AI models and applications, which is vital in the highly regulated betting and gaming industry.
Key Responsibilities:
- Develop and enforce AI governance policies, standards, and procedures.
- Facilitate the implementation of AI metadata management, data lineage, and quality frameworks.
- Profile datasets to evaluate their accuracy, completeness, consistency, and integrity for effective AI utilization.
- Identify and document AI quality issues and recommend corrective measures.
- Monitor and analyze AI quality KPIs, supporting remediation initiatives.
- Maintain AI quality dashboards and reports to communicate trends to stakeholders.
- Engage in cross-functional meetings to align AI governance priorities.
- Enhance AI business glossaries, data dictionaries, and catalogs.
- Collaborate with AI/Data owners, stewards, and custodians to promote governance best practices.
- Assist with compliance-related AI & Data initiatives (e.g., EU AI Act, GDPR, AML, Responsible Gaming reporting).
- Evaluate and support AI/Data governance tools (e.g., Atlan, Collibra, Informatica, Microsoft Purview).
- Contribute to the continuous evolution of the AI governance framework and best practices.
Do you have a passion for Artificial Intelligence? At Satori Analytics, we are on a mission to revolutionize the world through innovative algorithms that provide insights to global brands using Data & AI. Our state-of-the-art solutions encompass the full data lifecycle, from data ingestion to advanced AI applications, serving various sectors including fintech, airlines, FMCG, retail, manufacturing, and financial services. As a rapidly expanding scale-up with a team of over 100 technology specialists, including Data Engineers, Data Scientists, and more, we are at the forefront of the data revolution in South-Eastern Europe and beyond. Come and be part of our journey!

Your Daily Responsibilities:
- Collaborate and Learn: Work alongside Solution Architects, Developers, and Business Analysts to ensure the successful delivery of projects.
- Build Data Solutions: Assist in the design and development of ETL pipelines utilizing tools such as Azure Data Factory and Databricks.
- Get Hands-On: Participate in data migrations, cloud transitions, and automation of routine tasks.
- Solve Problems: Help troubleshoot and debug data processes under the mentorship of senior engineers.
- Grow Your Skills: Thrive in a dynamic environment with ample opportunities for professional development.
About Us: At tbi bank, we are a dynamic challenger bank situated in Southeast Europe, recognized as a regional leader in alternative payment solutions. Our innovative ecosystem merges financing and shopping to meet the diverse needs of our customers. With a successful and profitable business model, we are proud to serve clients in Romania, Germany, Bulgaria, and Lithuania. In our quest to expand our global presence, we have established a significant footprint in Greece, collaborating with thousands of merchants and consumers. Are you ready to contribute to our unique success story?

Your Role: We are seeking a Senior Data Analyst to join our dedicated team in Greece!

Key Responsibilities:
- Conduct data discovery to identify insights.
- Collaborate with partners to explore potential data sources and develop data interpretation logic.
- Acquire data from both primary and secondary sources.
- Analyze and interpret data, providing regular and ad-hoc reports and insights.
- Engage in the continuous development of our Data Warehouse (DWH) facilities.
- Work with extensive datasets from multiple sources to create cohesive data reports.
- Utilize data manipulation and transformation tools effectively.
Role overview
d-one seeks a Senior Databricks Engineer in Athens, Greece. The focus is on developing and enhancing data solutions that help meet business objectives. Candidates should bring strong hands-on experience with Databricks and a solid track record in designing, implementing, and improving analytics infrastructure.

What you will do
- Design and build data pipelines and solutions with Databricks
- Collaborate with cross-functional teams to gather requirements and turn them into technical deliverables
- Develop and maintain ETL processes
- Optimize data workflows for better performance and scalability
- Contribute to the ongoing development of the company’s data strategy

Location
This role is based in Athens, Attica, Greece.