Komodo Health

San Francisco
501-1,000 employees
In our mission to reduce the burden of disease, our Healthcare Map™ powers software to answer healthcare's most complicated questions.

Data Engineer

This job is no longer open

The Opportunity at Komodo Health

At Komodo, we refine healthcare data to build a comprehensive Healthcare Map, integrating patient treatments and observations over time. Despite challenges posed by duplicative, inconsistent, and missing data from various vendors, we enrich the information with context data detailing attributes like drugs, procedures, and insurance coverage, derived from event data and external feeds. Our goal is to offer valuable insights beyond traditional claims reselling, ensuring accessibility to all customers, regardless of their level of expertise.

Reporting to the Engineering Manager, you will be solving complex data challenges while designing and implementing data processing and transformation at a scale that powers state-of-the-art interactive product experiences. You will enable smarter, more innovative uses of healthcare data by building robust data pipelines and implementing data best practices.

Looking back on your first 12 months at Komodo Health, you will have…

  • Gained an understanding of the broader Komodo Health data landscape and taken part in architectural decisions for the Healthcare Analytics and Platform as a Service offerings.
  • Played a pivotal role in building foundational elements of the data platform architecture and pipelines, collaborating closely with cross-functional teams.

You will accomplish these outcomes through the following responsibilities…

  • Partnering with Engineering team members, Product Managers, Data Scientists, and customer-facing teams to understand complex health data use cases and business logic.
  • Being curious about our data.
  • Building foundational pieces of our data platform architecture, pipelines, analytics, and services underlying our platform.
  • Architecting and developing reliable data pipelines that transform data at scale, orchestrating jobs via Airflow using SQL and Python in Snowflake (a minimal sketch follows this list).
  • Contributing to Python packages and APIs in GitHub, using current best practices.
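
For illustration only, here is a minimal sketch of what such an orchestrated transform could look like. It assumes Airflow 2.4+ with the apache-airflow-providers-snowflake package and a preconfigured "snowflake_default" connection; the DAG, schema, and table names are hypothetical examples, not Komodo's actual pipeline.

from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="claims_dedupe_example",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Deduplicate raw claim events into a curated table (hypothetical SQL).
    dedupe_claims = SnowflakeOperator(
        task_id="dedupe_claims",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE curated.claims AS
            SELECT *
            FROM raw.claim_events
            QUALIFY ROW_NUMBER() OVER (
                PARTITION BY claim_id
                ORDER BY loaded_at DESC
            ) = 1
        """,
    )

In practice a DAG like this would typically chain several SQL and Python tasks, with data quality checks between steps.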

What you bring to Komodo Health:

  • Expertise in writing enterprise-level code and contributing to large data pipelining and API processing with Python
  • Experience with SQL and query design on large, complex datasets 
  • Ability to use a variety of relational, NoSQL, and/or MPP databases such as Postgres (ideally Snowflake on AWS), and to lead data modeling, schema design, and data storage best practices
  • Demonstrated proficiency in designing and developing with distributed data processing platforms like Spark and pipeline orchestration tools like Airflow 
  • A thirst for knowledge, willingness to learn, and a growth-oriented mindset
  • A commitment to fostering an inclusive environment where your teammates feel motivated to succeed
  • Excellent cross-team communication and collaboration skills

Additional skills and experience we’ll prioritize…

  • Experience enhancing CI/CD build tooling in a containerized environment, including deployment pipelines (Jenkins, etc.), infrastructure as code (Terraform, CloudFormation), and configuration management via Docker and Kubernetes
  • US healthcare data experience is not required, but it is a strong plus

#LI-Remote
