Senior Data Engineer

Remotebase
Remote — Brazil
Full-time

Experience Level

Mid to Senior

Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related technical field.
  • 5+ years of experience in DevOps, SRE, or infrastructure engineering roles.
  • 3+ years of hands-on experience automating and managing data infrastructure and pipelines.
  • 1+ years of experience with cloud platforms, specifically Snowflake and GCP.
  • Proficiency in Python and Bash scripting.
  • Experience with Terraform for Infrastructure as Code.
  • Strong problem-solving skills and the ability to troubleshoot data infrastructure issues.
  • Excellent communication skills and the ability to collaborate effectively with cross-functional teams.

About the job

As a Senior Data Engineer at Remotebase, you will play a pivotal role in designing and implementing resilient, scalable, and automated data pipelines. This position is critical in optimizing our data infrastructure and ensuring efficient data delivery across various departments. You will contribute significantly to the enhancement of our cloud data platform, primarily using Snowflake, and will be responsible for implementing and managing CI/CD processes, Infrastructure as Code (Terraform), and data transformation workflows (dbt).

Key Responsibilities:

  • Architect, develop, and maintain scalable CI/CD pipelines for data applications, focusing on Snowflake, dbt, and associated tools.
  • Manage dbt projects for Snowflake data transformation, including creating dbt models, tests, and documentation, and integrating dbt into CI/CD workflows.
  • Utilize Terraform for Infrastructure as Code to provision and configure cloud resources for data storage, processing, and analytics on GCP.
  • Automate deployment, monitoring, and management of Snowflake data warehouse environments to ensure optimal performance and security.
  • Partner with data engineers and scientists to understand requirements and deliver robust automated solutions for data ingestion, processing, and delivery.
  • Set up and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to maintain high availability and proactive issue resolution.
  • Develop and maintain automation scripts and tools, primarily in Python, to enhance operational efficiency, with Bash scripting for system-level tasks.
  • Implement security best practices across the data infrastructure and pipelines.
  • Diagnose and resolve data infrastructure and pipeline issues promptly.
  • Engage in code reviews for infrastructure code, dbt models, and automation scripts.
  • Document system architectures, configurations, and operational procedures.
  • Stay updated on emerging DevOps technologies, data engineering tools, and cloud best practices, particularly related to Snowflake, dbt, and Terraform.
  • Optimize data pipelines for performance, scalability, and cost-efficiency.
  • Support data governance and quality initiatives from an operational perspective.
  • Contribute to the implementation of AI features.
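To give a flavor of the CI/CD work around dbt described above, here is a minimal Python sketch of a pipeline gate that inspects dbt's `run_results.json` artifact and surfaces failed nodes. It assumes the standard artifact shape (a top-level `results` list whose entries carry `unique_id` and `status`); the sample payload, the `failed_nodes` helper, and the model/test names are illustrative, not part of any real project.

```python
import json

# In dbt's run_results.json, model errors are reported as "error"
# and failed tests as "fail"; either should fail a CI run.
FAILING_STATUSES = {"error", "fail"}

def failed_nodes(run_results: dict) -> list[str]:
    """Return the unique_ids of nodes whose status indicates a failure."""
    return [
        r["unique_id"]
        for r in run_results.get("results", [])
        if r.get("status") in FAILING_STATUSES
    ]

# Trimmed-down example payload, as a CI step might load it:
#   run_results = json.load(open("target/run_results.json"))
sample = {
    "results": [
        {"unique_id": "model.analytics.orders", "status": "success"},
        {"unique_id": "test.analytics.not_null_orders_id", "status": "fail"},
    ]
}

print(failed_nodes(sample))  # → ['test.analytics.not_null_orders_id']
```

A CI job would exit nonzero when this list is non-empty, blocking the deploy until the offending models or tests are fixed.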

About Remotebase

Remotebase is a forward-thinking technology company specializing in building innovative software solutions. Our commitment to excellence and efficiency drives us to create a dynamic work environment that fosters growth and collaboration among talented professionals.
