Snowflake Architect

Job Title: Snowflake Architect

Primary Skills: Snowflake, Data Modeling, and architecture solutions

Secondary Skills: PySpark, SQL

Location: Pune, Hyderabad

Experience: 12+ years

About the job:

Are you someone with a strong data engineering background and a passion for tackling complex enterprise business challenges by designing data-driven solutions? Then read on!

ValueMomentum is seeking a Snowflake Architect to join our team and lead technology innovation in cloud technology. You will be part of a highly collaborative and growing team, solving complex business challenges with modern data and analytics technologies. In this role, you will be responsible for designing, building, and maintaining data warehousing solutions on Snowflake’s cloud-based data platform. You will collaborate with data engineers, data analysts, and other stakeholders to ensure efficient data storage, retrieval, and processing for the organization’s analytical and reporting needs.

Know your team:

At ValueMomentum’s Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain Expertise.

Responsibilities:

In presales, the Snowflake Architect will engage with clients to understand their data challenges, tailor technical solutions, and present compelling proposals; bridge the gap between technical intricacies and client business needs, showcasing the value of our data architecture and domain expertise; and drive presales success by applying a deep understanding of data systems and strategic thinking.

Solution Development and Strategy

  • Develop and implement enterprise data architecture strategies aligned with organizational goals and industry best practices, tailoring solutions for presales engagements.
  • Provide reference architectures and guidelines for technology adoption, emphasizing cloud migration and data modernization.

Data Solutions Delivery

  • Collaborate with business stakeholders to understand data needs and design solutions.
  • Create a unified data model showcasing the organization’s assets, relationships, and attributes.
  • Design and oversee the implementation of data integration, storage, and retrieval solutions, ensuring data accessibility, reliability, and performance.
  • Implement solutions that leverage AI and ML to improve productivity.

Leadership and Innovation

  • Provide technical guidance to data architects, engineers, and other data-related roles.
  • Drive data governance initiatives and ensure compliance.
  • Research, architect, and deliver an expanding portfolio of solutions on the data analytics stack.
  • Work with cross-functional project teams to provide data strategy and architecture, resolving performance and technical issues during the proposal and delivery stages.

Building and Growth of Practice

  • Customize solutions to fit client needs and industry requirements.
  • Cultivate partnerships, emphasizing the long-term benefits of robust data architecture.
  • Drive innovation by integrating cutting-edge technologies into our practices.
  • Establish industry thought leadership through active contributions.
  • Develop scalable frameworks adaptable to evolving client needs and growth.
  • Foster a collaborative team culture, encouraging knowledge sharing and skill development.
  • Manage the Center of Excellence for Data Engineering and Analytics.

Marketing Support

  • Contribute to white papers, case studies, and technical blogs.
  • Participate as a panelist in technical sessions.
  • Support activities to establish or strengthen partnerships with tech product companies.

What we need in you:

  • Proven experience in designing and implementing complex data solutions aligned with business objectives.
  • Expertise in data modeling, integration, security, and governance.
  • Hands-on experience guiding virtual data model definition and defining data virtualization architecture and deployment, with a focus on Azure/AWS, Snowflake, and PySpark technologies.
  • Prior experience establishing best practices for business optimization.
  • Experience with relational and non-relational data stores (Hadoop, SQL, MongoDB), ETL/ELT tools (SSIS, Informatica, Matillion, dbt), DevOps, and Data Lake and Data Fabric concepts.
  • In-depth experience with data governance, data integration, and related technologies.
  • Proficiency in a variety of database technologies, both relational and non-relational.
  • Knowledge of cloud-based data solutions (e.g., AWS, Azure).
  • Excellent collaboration and communication skills.
  • Experience in the insurance domain is a differentiator.
  • Data architecture certifications (e.g., TOGAF, DAMA) are a plus.

Requirements:

Candidates are required to have the following skills:

  • At least 12 years’ overall experience in Data Modeling/Data Warehousing.
  • 5 years’ experience in Snowflake, Data Modeling, and Architecture, including expertise in Cloning, Data Sharing, and Search Optimization.
  • Proficient in Python and PySpark.
  • Ability to write complex SQL for analysis and troubleshooting of data anomalies.
  • Experience with performance management and tuning in Snowflake.
  • Working knowledge of cloud platforms (AWS, Azure, Google Cloud Platform, etc.).
  • Experience with IAM and role management across Snowflake and cloud-based databases.
  • Familiarity with Git for development, deployment, and support of data processes and procedures.
  • Detailed understanding of AWS, Google Cloud Platform, Azure, and Snowflake resources (e.g., S3, Glue, Blob Storage, EC2, containers).
  • Experience with Linux and Windows environments.
  • Experience with Snowpark, Snowflake Data Sharing, Cloning, and database replication for disaster recovery (DR).
  • Expertise in high-volume data processing using Snowflake, Redshift, Databricks, and other relevant cloud technologies.
  • Excellent communication skills, both verbal and written.
  • Experience with BI tool and database integration.

About the Company:

ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.

Our culture – Our fuel

At ValueMomentum, we believe in helping employees win by nurturing them from within, collaborating, and looking out for each other.

  • People first – Empower employees to succeed.
  • Nurture leaders – Nurture from within.
  • Enjoy wins – Recognize and celebrate wins.
  • Collaboration – Foster a culture of collaboration and people-centricity.
  • Diversity – Committed to diversity, equity, and inclusion.
  • Fun – Create a fun and engaging work environment.
  • Warm welcome – Provide a personalized onboarding experience.

Company Benefits:

  • Compensation – Competitive compensation package comparable to the best in the industry.
  • Career Growth – Career development, comprehensive training & certification programs, and fast track growth for high potential associates.
  • Benefits – Comprehensive health benefits and life insurance.
