Company & Role:
A global mobile platform that sells a wide selection of curated luxury fashion from the world’s best boutiques. Its omnichannel business model enables retailers to reach customers directly worldwide, particularly in the Middle East. Customers gain unique access to shop the most coveted luxury brands, new labels, and one-of-a-kind vintage gems from around the world, delivered directly to their doorstep.
In this role, you will be responsible for:
– Translating business goals into technical solutions and driving business innovation through technology
– Designing, building, and owning a multi-tenant data pipeline, and taking responsibility for catalogue updates
– Implementing computer vision / machine learning algorithms for feature extraction and classification tasks
– Building ad-hoc and production data pipelines for financial and operational reporting, and supporting other teams with structured data streams
– Creating a state-of-the-art multi-platform event tracking pipeline from event definitions to ETL and data visualisation
– Building forecasting models for catalogue and stock management and for financial reporting
– Maintaining ownership of all the company’s data assets and initiating integrations driven by business goals
The ideal candidate for this role will have strong, proven experience in software or data engineering and in scaling large data sets.
You can design and optimise queries, data sets, and data pipelines to organise, collect, and standardise data that generates insights and addresses reporting needs. You also understand the challenges of reliable data replication, optimising for a data warehouse, and maintaining the integrity of a data lake.
– Have experience reliably integrating and handling data from multiple APIs
– Have experience building applications at scale on any major cloud provider (MS Azure, AWS, GCP, etc.)
– Have experience with version control, shell scripting, the Unix filesystem, and automating deployments
– Have production experience with Python, PostgreSQL, event brokering (e.g. Kafka), and Unix Makefiles
– Have experience with BI tools and managing data sets for BI tools
A basic understanding of statistics and sampling, as well as experience in machine learning, deep learning, and computer vision, will be a significant advantage.
Please note that only successful candidates will be contacted by a specialist Consultant.