- Implement new ETL pipelines to collect data from internal systems into GCP
- Help evangelize high-quality software engineering practices for building data infrastructure and pipelines at scale
- Work within and across agile teams to design, develop, test, implement, and support technical solutions across a full stack of cloud development tools and technologies
- Create monitoring and alerting solutions for data pipeline statuses
- Implement specific Google Cloud data security and governance controls
- Maintain access controls for the data lake and associated Google Cloud products
- Ensure solutions are robust, scalable, and efficient enough to meet the needs of the business
- Strong programming skills, preferably in Python
- Experience supporting production cloud environments
- Strong understanding of IAM and cloud-based access and security controls
- Familiarity with ETL pipeline orchestration frameworks, such as Luigi or Airflow
- Experience with data processing and storage frameworks such as Apache Beam, BigQuery, Bigtable, Redshift, and Kinesis
- Experience with log management and monitoring tools, including those within Amazon Web Services and Google Cloud Platform as well as open-source and third-party options
- Experience in managing projects and infrastructure for cloud-based platforms (security, authentication, monitoring, data governance)
- Experience working with containers and container services is a significant plus
How to Apply
Please send your resume to firstname.lastname@example.org
Please keep the following points in mind when applying:
- Attach your resume to the email in MS Word or PDF format.
- Include the job you are applying for in the subject line of the email.