About the job
About Trafilea
Trafilea is a forward-thinking Tech E-commerce Group that operates several direct-to-consumer brands within the intimate apparel and beauty industries. We leverage data-driven strategies to propel our businesses forward. Beyond our diverse range of products, we cultivate an online community that champions body positivity. As a rapidly expanding global company, Trafilea is dedicated to delivering high-quality products and services that enhance customer satisfaction and enable sustainable growth.
Join Our Business Intelligence Team @ Trafilea
At Trafilea, we nurture a culture rooted in collaboration, innovation, and continuous learning. We are committed to investing in our talent and providing the necessary support and development opportunities for both personal and professional growth. Embrace the freedom of a remote-first work environment, collaborating with a diverse and talented team from around the globe.
We are seeking a Senior Data Engineer who will play a pivotal role in constructing and maintaining data pipelines and data models for our company's data platform. This role is crucial for driving Trafilea’s growth. You should have a strong passion for data architecture and data warehousing, with a focus on creating scalable and dependable frameworks for efficient data extraction and transformation.
Key Responsibilities:
Analyze, design, implement, and maintain pipelines that reliably and efficiently produce business-critical data utilizing cloud technologies.
Develop new ETL (Extract, Transform, Load) processes using Apache Airflow, and propose initiatives to improve performance, scalability, reliability, and overall robustness.
Collect, process, and clean data from various sources using Python & SQL.
Collaborate closely with lead Architects and Developers to ensure adherence to best practices and guidelines across all projects.
Accurately estimate and communicate the effort required for planned development work.
Identify new data sources to enhance existing pipelines and take responsibility for building and maintaining data models for new and ongoing projects.
Maintain comprehensive documentation of your work and changes to uphold data quality and governance.
Provide insightful feedback and expert perspectives to support data initiatives across the organization.
Enhance the quality of existing and new data processes (ETL) by incorporating statistical process control and setting up alerts for anomalies at every stage of the pipeline.
Create benchmarks for execution times to measure performance effectively.
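To give candidates a concrete sense of the Python & SQL pipeline work described above, here is a minimal extract-transform-load sketch. It is an illustration only, not Trafilea's actual codebase: the table names (`orders`, `orders_clean`) and the cleaning rule are hypothetical, and `sqlite3` stands in for whatever source and warehouse the real Airflow-orchestrated pipelines use.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from a source table (sqlite3 as a stand-in source)."""
    return conn.execute("SELECT id, amount FROM orders").fetchall()

def transform(rows):
    """Clean the data: drop rows with missing or negative amounts."""
    return [(i, a) for i, a in rows if a is not None and a >= 0]

def load(conn, rows):
    """Write cleaned rows into a target table."""
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)
    conn.commit()

# Minimal end-to-end run against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10.0), (2, None), (3, -5.0), (4, 7.5)])
conn.commit()
load(conn, transform(extract(conn)))
print(conn.execute("SELECT COUNT(*) FROM orders_clean").fetchone()[0])  # → 2
```

In production, each of these three functions would typically become an Airflow task so failures and retries are handled per stage.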
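The last two responsibilities — statistical process control with anomaly alerts, and execution-time benchmarks — can be sketched as follows. This is a generic illustration, assuming a simple mean ± 3σ control band over historical metrics such as daily row counts; the function names and sample data are hypothetical.

```python
import statistics
import time

def control_limits(samples, k=3.0):
    """Classic SPC band: mean ± k sample standard deviations of history."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return mean - k * sd, mean + k * sd

def is_anomalous(value, samples, k=3.0):
    """Flag a new observation (e.g. a stage's row count) outside the band."""
    lo, hi = control_limits(samples, k)
    return not (lo <= value <= hi)

# Historical daily row counts for one pipeline stage (hypothetical data).
history = [1000, 1020, 980, 1010, 990, 1005, 995]
print(is_anomalous(1015, history))  # within the control band → False
print(is_anomalous(400, history))   # far below the band → True, raise an alert

# Simple execution-time benchmark for a pipeline step.
start = time.perf_counter()
sum(range(1_000_000))  # stand-in for a transformation step
print(f"elapsed: {time.perf_counter() - start:.4f}s")
```

Recording such elapsed times per run gives the baseline against which future performance regressions can be measured.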
