Qualifications
Proven experience in machine learning and statistical modeling. Strong programming skills in Python, R, or similar languages. Familiarity with machine learning frameworks such as TensorFlow, PyTorch, or similar. Ability to work independently and collaboratively in a remote environment. Excellent problem-solving skills and attention to detail.
About the job
Toloka AI seeks a Freelance Machine Learning Engineer based in Greece to work remotely. This position centers on developing and refining machine learning models for a variety of AI projects.
Role overview
Work involves building new models and enhancing existing solutions to support projects in several industries. Projects may span different domains, offering exposure to diverse applications of AI and machine learning.
Location
This is a remote role open to candidates located in Greece.
About Toloka AI
Toloka AI is a forward-thinking company specializing in artificial intelligence and machine learning solutions. We are committed to leveraging technology to drive efficiency and innovation across various sectors. Join us and be part of a dynamic team that values creativity and expertise.
Please submit your CV in English and highlight your language proficiency. Mindrift offers project-based contracts for professionals interested in shaping and testing AI systems for leading technology companies. This freelance Data Scientist (Python & SQL) - AI Training Specialist position is not a permanent role but focuses on specific projects. Work is fully remote and open to those based in Greece.
Role overview
This contract centers on designing and validating computational data science problems that reflect real-world analytics challenges. Projects span industries such as telecom, finance, government, e-commerce, and healthcare. The work involves creating complex, computationally demanding scenarios that require advanced data science skills and cannot be solved manually in a short time.
What you will do
Design data science problems based on realistic business cases, including customer analytics, risk assessment, fraud detection, forecasting, and optimization.
Develop challenges solvable using Python and libraries like Pandas, Numpy, Scipy, Scikit-learn, Statsmodels, Matplotlib, and Seaborn.
Ensure each problem is deterministic and reproducible, using either no randomness or fixed seeds.
Include tasks that require advanced reasoning: data manipulation, statistical analysis, feature engineering, predictive modeling, and drawing insights.
Address the full data science workflow: data ingestion, cleaning, exploratory analysis, modeling, validation, and deployment considerations.
Incorporate big data scenarios that demand scalable solutions.
Validate solutions in Python using established libraries and statistical approaches.
Clearly document each problem, providing realistic context and verified solutions.
Requirements
Minimum 5 years of hands-on data science experience with measurable business outcomes.
Portfolio of projects or publications showing real-world problem solving.
Strong Python skills for data science tasks (pandas, numpy, scipy, scikit-learn, statsmodels).
Advanced knowledge of statistical analysis and machine learning, both algorithms and practical applications.
Expertise in SQL and database operations for analytics.
Experience with Generative AI technologies, including LLMs, RAG, prompt engineering, and vector databases.
Understanding of MLOps and model deployment workflows.
Familiarity with frameworks such as TensorFlow, PyTorch, or LangChain.
Excellent English skills (C1 level or above).
Application process
Apply, pass qualifications, join a project, complete tasks, and receive compensation.
This freelance role is remote and open to candidates based in Greece.
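The determinism requirement above (no randomness, or fixed seeds) is the key property that makes a challenge verifiable. The sketch below is a minimal, hypothetical illustration of that idea; the transaction scenario and all numbers are invented, and only the standard library is used rather than the Pandas/Scikit-learn stack the posting lists.

```python
import random
import statistics

def generate_transactions(seed: int = 42, n: int = 1000) -> list[float]:
    """Synthesize a reproducible batch of transaction amounts.

    A seeded generator yields identical data on every run, which is
    the 'no randomness or fixed seeds' property the brief asks for.
    """
    rng = random.Random(seed)
    return [round(rng.expovariate(1 / 50.0), 2) for _ in range(n)]

def flag_outliers(amounts: list[float]) -> list[float]:
    """Flag amounts more than two standard deviations above the mean."""
    mu = statistics.mean(amounts)
    sigma = statistics.stdev(amounts)
    return [a for a in amounts if a > mu + 2 * sigma]

# Same seed, same data, same answer: the task is fully reproducible,
# so a documented reference solution can be verified exactly.
assert generate_transactions(42) == generate_transactions(42)
```

Because the generator is seeded, the flagged set is a fixed, checkable answer rather than a moving target, which is what allows a problem author to ship a verified solution alongside the challenge.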
Join our innovative team at Toloka AI as a Senior Python Data Scraping Engineer and contribute to cutting-edge data solutions. This freelance position allows you to work remotely from Greece, giving you the flexibility to manage your own schedule while collaborating with our talented engineers.
Role overview
tgndata is hiring a Data Engineer in Athens, Attica. This hybrid role focuses on building and maintaining the data infrastructure that supports the company's analytics and decision-making.
What you will do
Design, develop, and maintain data pipelines.
Work closely with teams to ensure reliable data flows for analytics and reporting.
Use Python to create scalable and efficient solutions for extracting insights from company data.
Location
This position is based in Athens, Greece, with a hybrid work arrangement.
Join our team as a Senior Backend Engineer (Python) and engage in a challenging project that focuses on data-intensive processing, system integration, and semantic technologies. In this pivotal role, you will take charge of an established codebase, enhancing its stability and evolving it to meet the project's demands. This position is perfect for seasoned engineers who thrive in working with complex systems and have a solid grasp of data-driven architectures.
Key Responsibilities:
Thoroughly analyze and take ownership of an existing Python-based backend.
Refactor and optimize backend components with an emphasis on maintainability and performance.
Develop and sustain data processing pipelines and integration workflows.
Collaborate closely with data and semantic engineers on RDF/SPARQL-driven processes.
Contribute to the architectural redesign and technical documentation.
Assist with deployment, configuration, and troubleshooting tasks.
Ensure high code quality through rigorous reviews, testing, and adherence to best practices.
Requirements:
A minimum of 3 years of professional experience in backend development.
Expertise in Python programming.
Experience with Apache Airflow and AWS; data-intensive or integration-heavy systems; APIs, batch processing, and backend services; configuration-driven systems (XML / JSON / YAML).
Strong understanding of software architecture and design patterns, plus debugging and maintaining legacy codebases.
Proven experience in complex, multi-stakeholder projects (experience in EU or public-sector projects is a plus).
Preferred Qualifications:
Familiarity with Semantic Web technologies (RDF, SPARQL, OWL).
Experience in data modeling or knowledge-based systems.
Exposure to DevOps practices (CI/CD, containerization).
Experience in contributing to or maintaining technical documentation (e.g., AsciiDoc, Antora).
Benefits:
We value talent and commitment, offering the following perks:
Attractive full-time salary.
Private health insurance under the company's group plan.
Flexible working hours.
Access to top-quality tools.
Opportunities for professional development, including language courses and specialized training.
Career advancement potential by collaborating with leading specialists in the field.
A dynamic work environment that encourages personal and professional growth.
If you're ready for an exciting challenge, we would love to hear from you!
Role Overview
finartix is looking for an ETL/SSIS Data Engineer to join the team in Athens, Attica, Greece. This role focuses on building and maintaining data solutions for clients in the Greek market. The position works closely with IT professionals to improve data ecosystems and streamline data delivery for a range of sectors.
What You Will Do
Develop, test, and maintain data solutions throughout the full software development lifecycle.
Apply effective methods for collecting and analyzing data to support strategic recommendations that fit client business goals.
Act as a technical advisor, offering insights and solutions to clients.
Work as part of an Agile team, contributing to collaboration and new ideas.
Qualifications
BS or MS in Computer Science, Engineering, or a related field.
Minimum 3 years of experience in software development using MS SQL Server and ETL tools, especially SSIS.
At least 2 years working on data migration projects.
2 years of experience in the Banking Industry.
Solid understanding of software application fundamentals and how they affect user experience.
Strong skills in testing and quality assurance.
Proficient programming abilities and a creative, problem-solving approach.
Good communication and time management skills.
Comfortable working both independently and as part of a team.
Demonstrated analytical thinking and a solution-oriented mindset.
Proficiency with Microsoft Office Suite.
Fluent in both English and Greek, written and spoken.
iKnowHow S.A. is part of the iKnowHow Group, a technology company with more than 24 years in the field and a team of over 300 professionals. The group delivers technology solutions to sectors such as Energy, Telecommunications, Banking & Financial Services, and the Public Sector. Specialized subsidiaries within the group focus on areas like Health and Robotics, integrating advanced technologies for clients and internal projects. The company's portfolio covers Data & AI platforms, enterprise integration, cloud-native applications, and digital transformation initiatives for organizations in both public and private sectors.
Role overview
iKnowHow S.A. is hiring an SSIS Data Engineer in Gerakas, Attica, Greece. This role will support ongoing and upcoming data migration projects, working as part of a collaborative technology team.
METRO AEBE is recognized as one of Greece's leading employers, proudly supporting over 11,000 employees. Operating under the renowned My Market brand, we manage one of the largest retail networks in the country, featuring 290 stores nationwide. Additionally, we dominate the wholesale market with 50 METRO Cash & Carry stores catering to professionals across Greece.
To fuel our ongoing expansion, we are looking for a skilled Data Engineer to join our Data Warehouse (DWH) team. In this role, you will play a pivotal part in designing, enhancing, and optimizing enterprise data products that empower data-driven decision-making.
Responsibilities:
Design and develop robust, automated data pipelines (ETL/ELT) to efficiently ingest data from diverse sources into our Data Warehouse or Data Lake.
Conduct data wrangling tasks, including data cleaning and transformation, to convert raw data into actionable formats for analysis, visualization, or machine learning applications.
Ensure data quality and monitor pipeline performance to uphold data integrity and reliability.
Implement data access controls in alignment with corporate regulations and policies.
Contribute to machine learning and AI initiatives by preparing, validating, and serving high-quality datasets for model training and evaluation.
Work collaboratively with Data and BI Analysts, providing technical support as required.
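The data-wrangling responsibility described above, converting raw data into actionable formats, can be sketched minimally. The CSV fields and values below are invented for illustration, and only the Python standard library is used; a production pipeline would typically rely on dedicated ETL tooling instead.

```python
import csv
import io

# Hypothetical raw extract: messy whitespace, a missing price, a bad quantity.
RAW = """product,price,qty
Milk 1L, 1.20 ,3
Bread,,2
Eggs 12pk,4.50,abc
Coffee,7.90,1
"""

def clean_rows(raw_csv: str) -> list[dict]:
    """Parse raw CSV text, trim whitespace, coerce types, and drop bad rows."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            cleaned.append({
                "product": row["product"].strip(),
                "price": float(row["price"]),  # fails on a missing/blank price
                "qty": int(row["qty"]),        # fails on a non-numeric quantity
            })
        except (ValueError, TypeError):
            continue  # skip rows that cannot be coerced to the target schema
    return cleaned

rows = clean_rows(RAW)
# Only 'Milk 1L' and 'Coffee' survive: 'Bread' lacks a price, 'Eggs 12pk' has a bad qty.
```

The point of the sketch is the shape of the step: parse, normalize, coerce, and reject, so that downstream analysis or model training only ever sees rows that match the expected schema.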
Full-time | Remote — Thessaloniki, Central Macedonia, Greece
Are you an enthusiastic Software Engineer eager to advance your career while working on significant projects? We invite you to join our innovative team at European Dynamics, where remote work is not just allowed, but embraced. Collaborate with a supportive project team to develop challenging applications for prominent public organizations across Germany, Austria, and Switzerland. We pride ourselves on high software development standards and offer ongoing coaching and training opportunities to enhance your skills. Proficiency in German is a plus, and a desire to learn is essential.
Your Responsibilities:
Engaging in the design and development of sophisticated web applications;
Creating, testing, and maintaining large-scale software applications;
Fostering effective collaboration within the team;
Ensuring compliance with software quality standards;
Preparing detailed technical documentation.
Elevate your career with us! Join our dynamic development teams in Athens or work remotely as a Data Engineer. In this vital role within our agile team, you will help design and implement cutting-edge big data solutions on a scalable cloud platform. You will analyze millions of real-time data points to extract advanced insights and enhance analytics capabilities for our end users.
Your Responsibilities:
Develop and implement batch processing pipelines utilizing Spark (Python or Scala) and SQL;
Design and execute streaming ETL/ELT processes from a variety of data sources;
Write and maintain code for developing comprehensive big data solutions, focusing on data integration and analytics use cases;
Create and implement APIs using contemporary Python frameworks;
Collaborate effectively with our Business Analysis teams to align technical solutions with business needs;
Conduct end-to-end and functional testing using open-source tools;
Set up monitoring solutions for our data platform, including alerts and dashboards.
Essential Qualifications:
Bachelor's degree in Computer Science or Software Engineering;
Extensive knowledge of Apache Spark;
Proficient in Python and database management;
Previous experience as a Data Engineer;
Familiarity with Azure Data Lake Storage and Delta Live Tables;
Fluency in English, both written and spoken;
Strong analytical skills and a team-oriented mindset;
A passion for learning and professional growth in data engineering.
Preferred Qualifications:
Experience with Databricks;
Proficiency in API development with FastAPI;
Familiarity with cloud platforms (AWS, Azure, GCP, etc.);
Experience with Docker.
Why Join Us?
We value talent and commitment, offering a range of benefits for our team members, including:
Competitive full-time salary;
Comprehensive private health coverage under the company's group program;
Flexible working hours;
Access to state-of-the-art tools;
Opportunities for professional development including language courses and specialized training;
Career advancement opportunities with industry-leading specialists;
A dynamic work environment that encourages personal and professional growth through challenging goals and mentorship.
If you're ready to embrace an exciting challenge, work with cutting-edge technologies, and enjoy your daily tasks, we invite you to apply! Please submit your detailed CV in English, referencing: (SDE/02/26). Explore all our open vacancies by visiting the career section of our website.
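The batch-processing responsibilities in the Data Engineer posting above follow a common filter/group/aggregate shape. As a rough illustration only: the posting's actual stack is Spark on Azure Databricks, while plain Python and an invented event schema stand in here.

```python
from collections import defaultdict

# Hypothetical input batch: (user_id, event_type, value) tuples.
EVENTS = [
    ("u1", "purchase", 30.0),
    ("u2", "purchase", 12.5),
    ("u1", "refund", -5.0),
    ("u3", "purchase", 99.0),
    ("u2", "purchase", 7.5),
]

def run_batch(events):
    """Filter -> group -> aggregate: the shape of a typical batch ETL job."""
    purchases = (e for e in events if e[1] == "purchase")  # transform/filter
    totals = defaultdict(float)
    for user, _, value in purchases:                        # group by key
        totals[user] += value                               # aggregate
    return dict(totals)

print(run_batch(EVENTS))  # {'u1': 30.0, 'u2': 20.0, 'u3': 99.0}
```

In Spark the same logic would be expressed as a `filter` followed by a keyed aggregation over a distributed dataset, but the pipeline structure (and the testability of each stage) is identical.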
Ηπειρωτική Βιομηχανία Εμφιαλώσεων Α.Ε. (Epirus Bottling Industry S.A.), a leading company in the bottling sector, is seeking a Data Analyst as part of its ongoing growth. The position is based at the company's headquarters in Perivleptos, Ioannina.
The ideal candidate will be responsible for:
Collecting, cleaning, quality-checking, and consolidating data from multiple sources for statistical analysis.
Identifying, analyzing, and interpreting patterns and trends in complex data sets.
Preparing reports and presenting findings to support management decisions.
Training users in the use and understanding of reports and data.
Evaluating changes and updates to production systems.
Monitoring and analyzing performance (KPI) and efficiency indicators.
Identifying areas for improvement and formulating optimization proposals.
The role holder may also take on other related tasks deemed necessary by the company's management.
Satori Analytics delivers data and AI solutions for clients in sectors such as fintech, airlines, FMCG, retail, manufacturing, and financial services. The company's team of over 100 specialists, including Data Engineers and Data Scientists, supports the entire data lifecycle, from ingestion to AI-powered applications. Projects range from building cloud infrastructure to developing predictive analytics tools.
Role overview
This Python Software Developer position is based in Athens, Attica, Greece. The role centers on building backend services, APIs, and data-driven software that support Satori's analytics products. As part of a growing team, the developer will help create scalable, reliable solutions for clients across South-Eastern Europe and beyond.
What you will do
Design and implement Python applications: Build APIs and microservices that are clean, well-tested, and scalable.
Develop and maintain data pipelines: Create reliable data ingestion and processing systems that are observable and easy to support.
Collaborate with specialists: Work closely with Data Scientists, Data Engineers, AI Engineers, and domain experts to turn analytical needs into practical software.
Focus on quality delivery: Use Git, CI/CD, code reviews, testing, and documentation to support fast, maintainable releases.
Translate business needs: Break down real-world problems into actionable technical solutions, balancing speed, quality, and long-term scalability.
Improve engineering practices: Help raise the standard of software development across the team.
Location
Athens, Attica, Greece
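The "well-tested APIs and microservices" idea above can be sketched without any web framework. The posting does not name one, so this minimal WSGI example with an invented `/health` route is an assumption; it shows how an endpoint can be exercised in-process, exactly as a unit test would, using only the standard library.

```python
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """Minimal WSGI JSON endpoint: GET /health returns {"status": "ok"}."""
    if environ.get("PATH_INFO") == "/health":
        status, payload = "200 OK", {"status": "ok"}
    else:
        status, payload = "404 Not Found", {"error": "not found"}
    body = json.dumps(payload).encode()
    start_response(status, [("Content-Type", "application/json")])
    return [body]

def call(path):
    """Exercise the app in-process, without starting a server."""
    environ = {}
    setup_testing_defaults(environ)  # fill in the required WSGI keys
    environ["PATH_INFO"] = path
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app(environ, start_response))
    return captured["status"], json.loads(body)

assert call("/health") == ("200 OK", {"status": "ok"})
```

A framework like FastAPI or Flask replaces the hand-rolled routing, but the testing pattern (call the application callable directly and assert on status and body) carries over.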
Join our dynamic team at Netcompany as a Mid-Senior SQL Developer. In this role, you will be responsible for developing robust database solutions, ensuring data integrity, and optimizing performance. You will collaborate with cross-functional teams to implement innovative data strategies that support business objectives.
Are you excited about the future of AI? At Satori Analytics, we are on a mission to transform industries through innovative algorithms and insightful data solutions. Our advanced offerings span cloud-based ecosystems tailored for fintech and predictive analytics for the aviation sector, addressing the complete data lifecycle, from initial ingestion to sophisticated AI applications. As a rapidly expanding scale-up, our diverse team of over 100 tech experts, including Data Engineers, Data Scientists, and more, provides cutting-edge analytical solutions across various sectors, including FMCG, retail, manufacturing, and financial services. Join us in spearheading the data revolution in South-Eastern Europe and beyond!
Are you ready to collaborate with one of the largest gaming companies worldwide?
Your Daily Responsibilities:
System Maintenance: Oversee, sustain, and support MS SQL Server environments, including high-availability configurations (e.g., Availability Groups).
Issue Resolution: Diagnose performance and availability challenges, uncover root causes, and implement preventive measures.
Performance Optimization: Execute tuning, indexing, and automation to ensure databases are both efficient and scalable.
Data Resilience: Manage backup and recovery protocols to ensure data integrity and readiness for disaster scenarios.
Security Management: Apply patches, oversee upgrades, and uphold best practices for database security.
Operational Support: Address incidents and support requests to minimize disruption to business activities.
Cross-Technology Collaboration: Engage in projects involving additional database systems (e.g., MongoDB, CockroachDB).
Cloud Operations: Optimize and support cloud-based database environments, primarily on Azure.
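The tuning and indexing work described above rests on one principle: an index turns a full table scan into a targeted search. The posting's platform is MS SQL Server; SQLite stands in here (via Python's built-in `sqlite3` module) because it demonstrates the same principle in a self-contained way, and the `orders` schema is invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

# Without an index, filtering on customer_id scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchone()

# Adding an index turns that scan into an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchone()

print(plan_before[-1])  # e.g. a SCAN over orders
print(plan_after[-1])   # e.g. a SEARCH using idx_orders_customer
```

On SQL Server the equivalent diagnostics come from execution plans and DMVs rather than `EXPLAIN QUERY PLAN`, but the before/after comparison is the same workflow a DBA follows when tuning a slow query.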
Are you ready to take on the challenge of joining Hack The Box? By joining our team, you will be integral to redefining cybersecurity expertise. Prepare yourself for a thrilling journey into the cybersecurity landscape!
Your Role as a Senior Python Engineer:
As a Senior Python Engineer, you will join our software and data engineering team, contributing to cutting-edge HTB products. Your primary responsibilities will include writing backend code and executing the platform's roadmap. This role offers the chance to create projects from inception while maintaining existing products, all while adhering to best practices in software development and becoming familiar with HTB methodologies.
Work Environment:
Location Options: Greece or Europe
Work Mode: Fully Remote / Hybrid (2 days in the office, 3 days remote, along with one month of working from anywhere). If you're in Greece, we welcome candidates from all areas. Those within 55 km of our Athens office will follow a hybrid model, while those further away can opt for a fully remote setup.
Team Dynamics:
Join our software and data engineering team, the driving force behind Hack The Box. You will work collaboratively with AI Engineers, Data Engineers, and various cross-functional partners across SRE and Product teams. Our AI organization encompasses Analytics, Data Engineering, and AI teams, having pioneered innovations such as the first MCP server in the cybersecurity sector and our AI Range platform designed for training autonomous agents. Your role will significantly impact the backend code of our latest products.
Tools and Technologies You Will Use:
Python, Django, FastAPI, RESTful APIs, Kafka, PostgreSQL, Elasticsearch, Kubernetes, Terraform
Recommended Resources:
HTB launches the world's first AI Range
Introducing the MCP Server for CTF competitions at Hack The Box
Are you enthusiastic about big data and eager to engage with advanced technologies? We invite you to explore a thrilling opportunity as a Data Engineer - Spark Developer with our dynamic and growing development teams. Whether you prefer the vibrant atmosphere of our Athens office or the flexibility of remote work, we are excited to welcome your expertise and passion.
Key Responsibilities:
Architect, develop, test, deploy, maintain, and enhance data pipelines;
Implement coding solutions using Apache Spark on Azure Databricks;
Create and design big data architectures leveraging Azure Data Factory, Service Bus, BI, Databricks, and other Azure Services.
Essential Qualifications:
Bachelor's degree in Computer Science or Software Engineering;
Strong analytical mindset, team-oriented, dedicated to quality, and eager to learn;
Comprehensive understanding of Apache Spark;
Proven experience as a Data Engineer;
Advanced proficiency in Python or Scala;
Expertise in Spark query tuning and performance enhancement;
Familiarity with cloud platforms such as Azure, AWS, or GCP;
Fluent in both spoken and written English.
Preferred Qualifications:
Ability to understand and analyze Directed Acyclic Graph (DAG) operations;
Experience in providing cost estimates for big data processing;
Capability to write and review architecture documentation.
Benefits:
We value talent and commitment and offer a range of benefits to our team members:
Competitive full-time salary;
Comprehensive private health coverage under the company's group program;
Flexible working hours;
Access to state-of-the-art tools;
Opportunities for professional development including language courses, specialized training, and continuous learning;
Career advancement opportunities with leading specialists in the industry;
A dynamic work environment that promotes challenging goals, autonomy, and mentorship, supporting both personal and company growth.
If you are looking for an exciting challenge, keen to work with innovative technologies, and enjoy your work, we would love to hear from you! Please submit your detailed CV in English, referencing: (DESD/02/26). Explore our other open positions by visiting our career section at www.eurodyn.com and follow us on Twitter (@EURODYN_Careers) and LinkedIn. European Dynamics (www.eurodyn.com) is a prominent European company specializing in Software, Information, and Communication Technologies, with a robust international presence.
Join our innovative team at Ambience Services as a Lead Data Engineer. In this pivotal role, you will spearhead our data engineering initiatives, driving the design and implementation of robust data pipelines and architectures that empower our analytics and machine learning capabilities. You will work closely with cross-functional teams to ensure data accuracy, accessibility, and security.
Jobgether is looking for a Senior Full-Stack Engineer with strong skills in Python and React. This position is based in Greece and centers on building and improving the company's platforms.
Role overview
This role focuses on developing new features and maintaining existing applications using Python and React. Collaboration with other engineers and team members is a regular part of the job, with the goal of delivering reliable and user-friendly products.
What you will do
Design, build, and refine applications across the stack using Python and React.
Work closely with peers to solve technical challenges and deliver updates.
Play a key part in shaping the direction of Jobgether's platforms.
Requirements
Professional experience with both Python and React.
Ability to work effectively with a team.
Comfortable contributing to projects in a collaborative setting.
Join our innovative team as a Semantic Data Engineer, where you'll play a crucial role in enhancing a sophisticated platform centered around RDF data models, SPARQL queries, and structured datasets. Your primary responsibilities will involve comprehending, maintaining, and advancing the semantic layer of our system, collaborating closely with backend engineers and architects. This position is ideal for a passionate specialist with a keen interest in data modeling, semantics, and knowledge representation within real-world production environments.
Key Responsibilities:
Analyze and uphold RDF/TTL data models and vocabularies;
Design, optimize, and manage SPARQL queries;
Facilitate data ingestion, transformation, and validation processes;
Ensure the consistency and accuracy of semantic data throughout the platform;
Work alongside backend engineers to integrate semantic logic into application workflows;
Assist in documenting semantic models, assumptions, and constraints;
Engage in troubleshooting data quality and reasoning challenges.