About the job
Join Our Dynamic Team!
- The Data Engineer for Workflow Platform is an integral member of Toss Bank's Data Division, specifically within the Data Platform team.
- This team comprises three key areas: Data Infrastructure & Hadoop, Streaming Platform, and Workflow Platform.
- We operate various Data Platforms, including Hadoop, Kafka, CDC, and Airflow.
- Our mission is to keep the enterprise data infrastructure reliable and scalable so that all data is securely collected and processed.
Your Responsibilities:
- Design and operate a large-scale data workflow execution platform in an on-premise Kubernetes environment.
- Optimize resources so that large workflows from various data organizations run stably, enhancing platform performance and reliability.
- Collaborate with enterprise data engineers to improve the execution quality of the overall data pipeline and enhance developer experience.
- Monitor workflow execution status, and design and improve systems for automated fault detection, alerting, and recovery.
- Safely manage workflow executions in accordance with the financial sector's internal control standards, and advance a systematic execution-history management system.
- Continuously review and implement new technologies and open-source solutions to enhance the performance and scalability of the workflow platform.
We Are Looking For:
- Experience operating an Airflow-based workflow orchestration system with proven improvements in stability, scalability, and execution efficiency.
- Background in developing Python-based data workflows and platform services.
- Understanding of container technologies (Docker, Kubernetes, etc.) and experience in automating service deployment and configuration using tools like Helm.
- Ability to understand the company's environment and communicate effectively with various teams during service development.
- A keen interest in improving operational efficiency and optimization in large-scale workflow environments.
- Desire to enhance platform user experience to facilitate easier and safer pipeline development and operations for in-house data engineers.
- A proactive approach to analyzing, modifying, and improving open-source solutions at the code level to solve issues.
Resume Submission Tips:
- Clearly outline impactful projects you have worked on in your career.
- Focus on experiences related to data platforms, particularly with Airflow, Kubernetes, and Python.

