Senior Data Architect – GCP & Snowflake

Job Description

Job Title: Senior Data Architect – GCP & Snowflake

Experience: 13+ Years

Location: Flexible / Remote - India

Employment Type: Full-time

Job Summary:

We are looking for an experienced Senior Data Architect with deep expertise in Google Cloud Platform (GCP) and Snowflake to lead the design, implementation and optimization of our cloud-based data architecture. The ideal candidate will bring 13+ years of experience in data engineering, architecture and solution design, with a strong track record of designing scalable, secure and high-performance data platforms for analytics and AI-driven workloads.

Key Responsibilities:

· Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake.

· Lead the end-to-end data architecture, covering ingestion, transformation, storage, governance and consumption layers.

· Collaborate with business stakeholders, data scientists and engineering teams to define and deliver enterprise data strategy.

· Design robust data pipelines (batch and real-time) ensuring high data quality, security and availability.

· Define and enforce data governance, data cataloging and metadata management best practices.

· Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency.

· Mentor junior architects and data engineers, guiding them on design best practices and technology standards.

· Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data platforms.

Required Skills & Qualifications:

· 13+ years of experience in data architecture, data engineering, or enterprise data platform roles.

· 3+ years of hands-on experience in Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, Data Catalog).

· 3+ years of experience designing and implementing Snowflake-based data solutions.

· Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.).

· Proficient in Python, SQL and orchestration tools like Airflow / Cloud Composer.

· Experience in data modeling (3NF, Star, Snowflake schemas) and designing data marts and warehouses.

· Strong understanding of data privacy, compliance (GDPR, HIPAA) and security principles in cloud environments.

· Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus.

· Excellent communication and stakeholder management skills.

· GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro Core/Advanced).

Preferred Qualifications:

· Experience working with hybrid or multi-cloud data strategies.

· Exposure to ML/AI pipelines and support for data science workflows.

· Prior experience in leading architecture reviews, PoCs and technology roadmaps.


Positions

1

Work Experience

13+ Years

Job Type

Full-Time

Location

Flexible / Remote - India