Collaborate with data and technical architects, integration teams, and engineering teams to capture inbound/outbound data pipeline requirements, and conceptualize and develop solutions.
Support the evaluation and implementation of current and future data applications and technologies to meet evolving Zscaler business needs.
Collaborate with IT business engagement & applications engineer teams, enterprise data engineering and business data partner teams to identify data source requirements.
Profile data sources and quantify their quality; develop tools to prepare data and build pipelines for integration into Zscaler's data warehouse in Snowflake.
Continuously optimize existing data integrations, data models, and views while developing new features and capabilities to meet our business partners' needs.
Work with Data Platform Lead to design and implement data management standards and best practices.
Continue to learn and develop next-generation technology and data capabilities that enhance our data engineering solutions.
Develop large-scale, mission-critical data pipelines using modern cloud and big data architectures.
Qualifications/Your Background:
4+ years of experience in data warehouse design & development.
Proficiency in building data pipelines to integrate business applications (Salesforce, NetSuite, Google Analytics, etc.) with Snowflake.
Must have proficiency in dimensional data modeling techniques and be able to write structured, efficient queries on large data sets.
Must have hands-on experience using Python to extract data from APIs and build data pipelines.
Proficient in advanced SQL; Python, Snowpark (PySpark), or Scala (object-oriented language concepts); and ML libraries.
Strong hands-on experience with ELT tools like Matillion, Fivetran, Talend, and IDMC (Matillion preferred); the data transformation tool dbt; and AWS services such as EC2, S3, Lambda, and Glue.
Solid understanding of the CI/CD process, Git versioning, and advanced Snowflake concepts such as warehouse optimization and SQL tuning/pruning.
Experience building data orchestration workflows with open-source tools like Apache Airflow and Prefect.
Knowledge of data visualization tools such as Tableau and/or Power BI.
Must demonstrate strong analytical skills, be detail-oriented, be a team player, and be able to manage multiple projects simultaneously.