About the job
As a Senior Data Engineer at Remotebase, you will play a pivotal role in designing and implementing resilient, scalable, and automated data pipelines. This role is critical to optimizing our data infrastructure and ensuring efficient data delivery across departments. You will contribute significantly to enhancing our cloud data platform, built primarily on Snowflake, and will be responsible for implementing and managing CI/CD processes, Infrastructure as Code (Terraform), and data transformation workflows (dbt).
Key Responsibilities:
- Architect, develop, and maintain scalable CI/CD pipelines for data applications, focusing on Snowflake, dbt, and associated tools.
- Manage dbt projects for Snowflake data transformation, including creating dbt models, tests, and documentation, and integrating dbt into CI/CD workflows.
- Utilize Terraform for Infrastructure as Code to provision and configure cloud resources for data storage, processing, and analytics on GCP.
- Automate deployment, monitoring, and management of Snowflake data warehouse environments to ensure optimal performance and security.
- Partner with data engineers and scientists to understand requirements and deliver robust automated solutions for data ingestion, processing, and delivery.
- Set up and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to maintain high availability and proactive issue resolution.
- Develop and maintain automation scripts and tools, primarily in Python, to enhance operational efficiency, with Bash scripting for system-level tasks.
- Implement security best practices across the data infrastructure and pipelines.
- Diagnose and resolve data infrastructure and pipeline issues promptly.
- Engage in code reviews for infrastructure code, dbt models, and automation scripts.
- Document system architectures, configurations, and operational procedures.
- Stay current with emerging DevOps technologies, data engineering tools, and cloud best practices, particularly those related to Snowflake, dbt, and Terraform.
- Optimize data pipelines for performance, scalability, and cost-efficiency.
- Support data governance and quality initiatives from an operational perspective.
- Contribute to the implementation of AI features.