Primary Roles & Responsibilities:
• Architect and implement end-to-end data solutions using Snowflake.
• Design modern data platforms integrating tools such as dbt, Fivetran, Airflow, and cloud services (AWS, Azure, GCP).
• Optimize Snowflake performance through clustering, caching, and query tuning.
• Lead data modeling efforts (star/snowflake schemas) and ETL/ELT pipeline design.
• Provide technical leadership and mentorship to engineering teams.
• Collaborate with stakeholders to gather requirements and deliver tailored solutions.
• Support pre-sales activities, including solutioning and technical demos.
Requirements:
• Experience in data architecture, data warehousing, and cloud technologies.
• Strong expertise in Snowflake architecture, data modeling, and optimization.
• Solid hands-on experience with cloud platforms: AWS, Azure, and GCP.
• In-depth knowledge of SQL, Python, PySpark, and related data engineering tools.
• Expertise in data modeling (both dimensional and normalized models).
• Strong experience with data integration, ETL processes, and pipeline development.
• Certification in Snowflake, AWS, Azure, or related cloud technologies.
• Experience working with large-scale data processing frameworks and platforms.
• Experience in data visualization tools and BI platforms (e.g., Tableau, Power BI).
• Strong experience with client communication and requirements gathering.
• Experience in Agile methodologies and project management.
• Strong problem-solving skills with the ability to address complex technical challenges.
• Excellent communication skills and ability to work collaboratively with cross-functional teams.
