Data Engineer GCP (f/m)
Full-time | Mid-Senior Level
Job Overview
Project Description:
We are looking for highly skilled Data Engineers to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on ODE, development of the relevant data products on ODE, and operation of those data products.
Activity description and concrete tasks:
- Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP) including network architectures (Shared VPC, Hub-and-Spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and Load Balancing.
- Data Processing & Transformation: Utilize a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow.
- Core GCP Services Management: Work extensively with services like Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
- Application Implementation: Develop and implement Python applications for various GCP services.
- CI/CD Pipelines: Integrate and manage GitLab Magenta CI/CD pipelines for automating cloud deployment, testing, and configuration of diverse data pipelines.
- Security & Compliance: Implement comprehensive security measures: manage IAM policies, handle secrets with Secret Manager, and enforce identity-aware access policies.
- Data Integration: Handle integration of data sources from CDI, Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS).
- Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
- AI Solutions: Implement AI solutions using Google’s Vertex AI for building and deploying machine learning models.
- Certification: Must be a certified GCP Cloud Architect or Data Engineer.