Salary: R$ 11,000 to R$ 20,000 per month
Area: Information Technology
Level: Senior
We are seeking a proactive and skilled Senior Data DevOps Engineer specializing in Google Cloud Platform (GCP) to join our expanding team.
This role is ideal for professionals with hands-on experience in cloud-based data systems, infrastructure automation, and optimizing data workflows. Familiarity with AWS or Azure is a plus, but not mandatory.
Responsibilities
- Design cloud data infrastructure using GCP services such as Dataflow, GCS, BigQuery, Dataproc, and Cloud Composer
- Deploy Infrastructure as Code (IaC) solutions using tools like Terraform for automated provisioning and monitoring
- Collaborate with data engineering teams to create efficient and automated workflows with Python
- Set up CI/CD pipelines with tools like Jenkins, GitLab CI, or GitHub Actions for seamless deployment processes
- Optimize performance and enhance the reliability of data platforms in coordination with cross-functional teams
- Configure cloud-based data tools such as Apache Spark, Apache Kafka, and Apache Airflow
- Resolve reliability and scalability concerns in cloud-based data systems
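To illustrate the kind of Python workflow automation this role involves (e.g., the dependency-ordered task scheduling that Cloud Composer / Airflow performs), here is a minimal, self-contained sketch. The task names and the `pipeline` mapping are illustrative only, not taken from any real system:

```python
from collections import deque

# Hypothetical sketch of dependency-ordered task execution, the core idea
# behind orchestrators like Airflow (which backs Cloud Composer).
# Uses Kahn's algorithm for topological ordering; stdlib only.

def run_order(deps):
    """Return tasks in an order where every upstream dependency runs first.

    deps maps task name -> list of upstream task names.
    Raises ValueError if the dependency graph contains a cycle.
    """
    indegree = {t: 0 for t in deps}
    downstream = {t: [] for t in deps}
    for task, upstream in deps.items():
        for up in upstream:
            indegree[task] += 1
            downstream[up].append(task)

    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in downstream[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)

    if len(order) != len(deps):
        raise ValueError("cycle detected in task dependencies")
    return order

# Illustrative ETL-style pipeline: extract, then transform and a quality
# check in parallel, then load once both complete.
pipeline = {
    "extract": [],
    "transform": ["extract"],
    "quality_check": ["extract"],
    "load": ["transform", "quality_check"],
}
print(run_order(pipeline))
```

In a real Cloud Composer deployment this ordering would be expressed as an Airflow DAG rather than computed by hand; the sketch only shows the underlying scheduling concept.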
Requirements
- 3+ years of experience in cloud environments focusing on GCP services including BigQuery, Cloud Composer, and Dataproc
- Proficiency in Python paired with competency in SQL for managing data pipelines
- Knowledge of IaC tools like Terraform or CloudFormation for infrastructure automation
- Skills in integrating CI/CD pipelines with tools like Jenkins, GitHub Actions, or GitLab CI
- Background in Linux-based operating systems and shell scripting
- Understanding of network protocols such as TCP/IP, DNS, and NAT
- Competency in tools like Apache Spark, Apache Airflow, or ELK Stack
Nice to have
- Familiarity with AWS or Azure services like ECS, S3, Data Lake, or Synapse
- Flexibility to work with additional IaC and configuration management tools such as Ansible
- Experience with alternative data workflow automation tools
