The Business Data Platform team is looking for a skilled data engineer with prior experience designing and implementing data pipelines, data warehouses, and business intelligence systems that integrate transactional, analytical, and big data components. Together with the team, you will drive the design and implementation of the data platform to meet the data needs of the organization. This is a highly visible role in which you will be instrumental in designing and building out the data platform and its features. The team comprises data engineers, data analysts, and data scientists focused on enabling Smartsheet to take action and gain insights that help sustain the company's high growth.

If you like big data challenges at a fun, fast-growing company, contact us. This role is based at our Bellevue, WA headquarters and reports to the Manager, Business Data Platform.


Responsibilities:

  • Design and code data pipeline features and data processing jobs that deliver innovative business intelligence and analysis to support Smartsheet's growth trajectory
  • Lead the development of capabilities across all facets, from strategy to tactical implementation and from conception to post-deployment
  • Ensure smooth ongoing operation of the data platform with high availability while making continuous improvements
  • Help key users across the organization understand and consume the data sets and platform for better decision making and analytics
  • Advance the data architecture and platform, ensuring adherence to key architectural tenets and best practices
  • Design, implement, and maintain data models


Qualifications:

  • A love of learning, combined with an open-minded, action-oriented attitude
  • 6+ years of development experience in schema design, data architecture, and data pipelines and processing
  • Experience designing and delivering large-scale, 24/7, mission-critical data pipelines and features using modern big data architectures
  • Data engineering experience with one or more programming languages beyond SQL, such as Python, Scala, or Java
  • Strong data modeling skills (relational, dimensional, and flattened), plus strong analytical and SQL skills with attention to detail
  • A deft problem solver and strong collaborator
  • Self-driven and highly dependable in an agile and results-oriented environment
  • Familiarity with both established and emerging data technologies, and the ability to evaluate their applicability
  • Experience with big data and data pipeline technologies in a cloud environment; AWS, Airflow, and Snowflake a plus
  • Expert knowledge of ETL and data integration techniques
  • Legally eligible to work in the U.S. on an ongoing basis
  • BS or MS in Computer Science, or equivalent experience