Thursday, 23 April 2026

Python With GCP


Bengaluru
8 - 12 years
18.0 lacs - 27.5 lacs
In office
Python, GCP, Data Engineering, Cloud Data Migration

Job description

Role & responsibilities
• 8+ years of practical experience designing and building data solutions.
• More than 4 years of cloud experience, with at least 3 years on GCP building data lakes, data warehouses, data pipelines, and AI/ML solutions using GCP services.
• Enterprise experience with GCP storage, database, and data-processing services, including BigQuery, Cloud SQL, Pub/Sub, Cloud Composer, Dataflow, Dataproc, Dataprep, Data Studio, Bigtable, Cloud Storage, Filestore, Compute Engine VMs, App Engine, and GKE, or similar cloud experience.
• Understanding of different storage types (filesystem, relational, NoSQL) and experience working with various kinds of data (structured, unstructured, metrics, log files, etc.).
• Experience building scalable and reusable data pipelines (ETL, ELT) using Airflow, and data wrangling procedures using Python and SQL.
• Tune application and query performance using profiling tools and SQL.
• Experience with batch and stream processing (e.g., GCP Dataflow, Kafka Streams, Spark).
• Working knowledge of data visualization tools such as Looker and Tableau is a plus.
• Experience with agile software development practices and a drive to ship quickly.
• Research, analyze, and recommend technical approaches for solving difficult and challenging development and integration problems.
• Responsible for maintaining and enhancing the data platform, which involves adding custom operators for carrying out tasks in Apache Airflow.
• Accountable for ensuring the team adheres to agreed estimates and technical designs, and that code reviews uphold performance standards.
• Identify, design and implement internal process improvements by automating manual processes and optimizing data delivery.
• Experience with continuous integration and automated testing tools such as Jenkins, Artifactory, and Git.
• Experience with microservice patterns, API development, RESTful web services.
• Experience with containerization technologies (Docker, Kubernetes)
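To illustrate the ETL/data-wrangling work described above, here is a minimal sketch in plain Python (field names are hypothetical; a production pipeline on GCP would run steps like these as Airflow tasks, loading into BigQuery rather than SQLite):

```python
import csv
import io
import sqlite3

# Extract: read raw records from an in-memory CSV, standing in for a
# Cloud Storage object or Pub/Sub batch in a real GCP pipeline.
RAW_CSV = """event_id,user,amount
1,alice,10.50
2,bob,
3,carol,7.25
"""

def extract(raw: str) -> list:
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    # Clean and type the data: drop rows with a missing amount.
    out = []
    for row in rows:
        if row["amount"]:
            out.append((int(row["event_id"]), row["user"], float(row["amount"])))
    return out

def load(records: list) -> sqlite3.Connection:
    # Load into a relational store (SQLite here; BigQuery in production).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (event_id INTEGER, user TEXT, amount REAL)")
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", records)
    return conn

conn = load(transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
print(total)  # 17.75
```

The same extract/transform/load shape maps directly onto Airflow: each function becomes a task (e.g. via a PythonOperator or a custom operator), and the DAG wires them in sequence.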

Desired Qualifications
• University graduate in Engineering, or postgraduate in Computer Applications.
• Technical certifications such as Google Cloud Data Engineer, or advanced certifications in data science, are a plus.

Industry type
IT Services & Consulting
Education
Any Graduate

Apply

