Senior Data Engineer (Remote) // British Columbia


Vancouver, BC, Canada (Remote)

Full time

Sep 26

This job is no longer accepting applications.

SSENSE, pronounced [es-uhns], is a Montreal-based fashion platform with global reach. Founded in 2003, SSENSE is pacing the vanguard of directional retail with a mix of luxury, streetwear, and avant-garde labels. We produce industry-leading original content and take pride in building our own technology solutions and systems from scratch. Our field of focus has grown beyond that of a typical e-commerce entity as we explore the nexus of content, commerce, and culture. Currently serving 150 countries, generating an average of 76 million monthly page views, and achieving high double-digit annual growth since inception, SSENSE is becoming a cultural protagonist in its own right.

Job Description

*This is a remote position within Canada

SSENSE is looking for a Senior Data Engineer to join our rapidly growing technology team. The Senior Data Engineer will take complex features of the product roadmap, break them down into their required technical components, and develop them independently. They own at least one component of the SSENSE technical stack and hold accountability for its SLAs. The ideal candidate will actively contribute to knowledge dissemination within the organization, participate in the recruiting and onboarding of new employees, and mentor Junior Developers on the team.


Product delivery

  • Build, test, and operate stable, scalable data pipelines that cleanse, structure, and integrate disparate data sets into a readable, accessible format for end-user-facing reports, data science, and ad-hoc analyses
  • Develop a deep understanding of the product roadmap for the squad, including future features to be developed
  • Contribute to high-level estimation and participate in laying out the development sequences, challenging the product roadmap and identifying areas where technical debt can be reduced or avoided
  • Independently complete complex development tasks and actively contribute to pushing code to production
  • Write testable, efficient, and reusable code suitable for continuous integration and deployment, respecting best practices and SSENSE development standards
  • Review Unified Modeling Language (UML) diagrams and technical documentation
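To illustrate the kind of cleanse-structure-integrate work described above, here is a minimal Pandas sketch. The data frames and column names are hypothetical sample data, not anything from SSENSE's systems:

```python
import pandas as pd

# Hypothetical raw order events (duplicates, string-typed amounts)
# and a product reference table standing in for two disparate sources.
orders = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "sku": ["A1", "A1", "B2", "A1"],
    "amount": ["19.99", "19.99", "5.00", "19.99"],
})
products = pd.DataFrame({
    "sku": ["A1", "B2"],
    "category": ["outerwear", "accessories"],
})

# Cleanse: drop exact duplicates and coerce amounts to a numeric type.
cleaned = (
    orders.drop_duplicates()
          .assign(amount=lambda df: pd.to_numeric(df["amount"]))
)

# Integrate: join the reference data, then aggregate into a report-ready shape.
report = (
    cleaned.merge(products, on="sku", how="left")
           .groupby("category", as_index=False)["amount"].sum()
)
print(report)
```

A production pipeline would of course read from real sources, validate schemas, and write to a warehouse, but the cleanse/join/aggregate shape is the same.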


Ownership and accountability

  • Be accountable for code quality by conducting adequate testing
  • Be accountable for performance, reliability, scalability and resilience of at least one technical component owned by the squad through SLAs and monitoring
  • Solve complex technical problems and mentor/support other technical staff on data modeling and ETL-related issues
  • Contribute to cross-squad initiatives, acting as a change agent amongst peers to foster adoption of new processes or technical solutions


Knowledge sharing and coaching

  • Review Pull Requests with the objective of guiding and upskilling other Data Engineers on various technical topics
  • Actively contribute to SSENSE University (the internal peer learning platform) to promote continuous learning
  • Participate in the onboarding of new Data Engineers
  • Contribute to solution designs, challenging other members on technical decisions and explaining the technical design to junior developers so they can write documentation for the rest of the team
  • Participate in HR recruiting events, helping to identify and recruit top developers

Qualifications


  • Bachelor’s degree in Computer Science, Engineering, or a related technical field; Master’s degree an asset
  • A minimum of 5 years of Functional Programming and/or Object-Oriented Programming (OOP) experience
  • A minimum of 3 years of experience writing and optimizing SQL queries
  • A minimum of 3 years of experience with Apache Spark for big data processing
  • Extensive knowledge of the Python programming language and its data manipulation libraries (Pandas and NumPy)
  • Expertise in data modeling and an advanced understanding of data architecture
  • Expertise with RDBMS and NoSQL databases at scale
  • Experience with Apache Airflow or a similar data pipelining and workflow scheduling framework (Luigi, Azkaban)
  • Ability to use containers, orchestration frameworks, and other DevOps tools (Kubernetes, Terraform, Giant Swarm, etc.)
  • Proficiency with cloud resources (AWS/Google Cloud/Azure) and the ability to operate them for the components owned; certification is an asset
  • Knowledge of AWS services (Redshift, Glue, Athena, S3, etc.) an asset
  • Knowledge of big data technologies (Databricks, Hadoop, Hive, Pig, Presto) an asset
  • Familiarity with continuous integration and automated pipeline tools (Jenkins, Travis, etc.)
  • Proficiency in Git
  • Strong written and verbal communication skills in both English and French
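For the workflow-scheduling side of the stack mentioned above, here is a sketch of how an ETL pipeline might be declared as an Apache Airflow DAG. The DAG id, schedule, and task callables are illustrative placeholders (requires Airflow 2.x to run), not a description of any real SSENSE pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables; a real pipeline would extract from source systems,
# transform the data (e.g. with Spark or Pandas), and load it to a warehouse.
def extract():
    print("extracting raw data")

def transform():
    print("cleansing and structuring data")

def load():
    print("loading to the warehouse")

with DAG(
    dag_id="example_etl",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the three tasks sequentially.
    t_extract >> t_transform >> t_load
```

Luigi and Azkaban express the same dependency graph with their own task/job abstractions; the common idea is declaring tasks and their ordering rather than imperatively running them.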

Additional Information

  • Highly analytical and detail oriented
  • Ability to coach and mentor junior employees to achieve personal and professional goals
  • Team player with a high sense of accountability and ownership
  • Ability to influence and drive change
  • Solution-oriented mindset and can-do attitude to overcome challenges
  • Ability to thrive in a fast-paced environment and master frequently changing technologies and techniques
