About the job
About Us:
At Xenon7, we create a unique intersection where exceptional tech talent meets premium opportunities! We collaborate with top-tier enterprises and pioneering startups on thrilling, state-of-the-art projects that utilize the latest advancements across various IT domains, including Data, Web, Infrastructure, and AI. Our proficiency in developing IT solutions and providing on-demand resources enables us to partner with clients on transformative initiatives that fuel innovation and foster business growth. Whether empowering global organizations or teaming up with groundbreaking startups, we are dedicated to delivering advanced, impactful solutions that tackle today’s most intricate challenges.
About the Client:
Join one of Egypt’s leading financial institutions, celebrated for its comprehensive suite of banking services, including Institutional Banking, Personal Banking, and Islamic Banking. With a robust global footprint of over 50 branches and correspondents, the bank serves a diverse and dynamic client base. As it embarks on an ambitious digital transformation journey, it is focused on harnessing the latest technologies to build a cutting-edge data architecture that redefines its performance and service delivery.
Position Overview:
We are seeking a highly driven and experienced Senior Data Engineer to join our growing data team. In this pivotal role, you will lead the design and development of scalable, high-performance data pipelines and lakehouse architectures. Collaborating closely with data modelers, analysts, and business stakeholders, you will deliver reliable real-time and batch data solutions that underpin our analytics and AI-driven strategies.
Key Responsibilities:
- Design, implement, and optimize data pipelines utilizing both batch and streaming processing frameworks.
- Architect and maintain data lakehouse solutions using Apache Iceberg and object storage such as Amazon S3.
- Implement scalable Data Vault and Star Schema models.
- Build and manage real-time ingestion pipelines with technologies like Kafka, Spark, or Flink.
- Integrate and orchestrate workflows using tools such as Airflow, dbt, NiFi, or Airbyte.
- Enforce data governance, quality, and access control policies.
- Troubleshoot pipeline performance and reliability issues.
