Responsibilities
· Participate in the migration of DataStage jobs to Snowpark. Reverse-engineer the logic of existing DataStage jobs and translate it accurately into Snowpark.
· Perform thorough unit testing of developed Snowpark code to ensure functionality and data quality.
· Orchestrate Snowpark jobs using Airflow.
· Perform parallel run testing, comparing the output of migrated Snowpark jobs against that of the original DataStage jobs using production data in the development environment.
· Document developed code and processes as required.
· Work remotely and effectively within the India time zone (IST).
Qualifications
Required Skills and Experience
· Bachelor's degree in Computer Science, Engineering, or a related field.
· 5 years of experience with Snowflake and 2 years of experience with Snowpark.
· Experience with Apache Airflow.
· Experience with Snowflake on Azure.
· Experience with at least one ETL/ELT tool (DataStage experience is a significant plus).
· Solid understanding of SQL and data warehousing concepts.
· Familiarity with scripting languages such as Python.
· Ability to understand and translate existing data processing logic.
· Strong problem-solving and analytical skills.
· Excellent communication and collaboration skills, comfortable working in a remote team environment.
· Ability to work independently and manage tasks effectively within deadlines.