Search video content like a human, not a system
NATURAL LANGUAGE SEARCH - MAKING VIDEO ADS SEARCHABLE THROUGH AI
This work focused on enabling users to find video adverts based on what actually happens within them. By combining AI-driven understanding with a natural language interface, users can search a large corpus of data in seconds using simple human questions rather than structured queries.
THE CHALLENGE
Video content is not easily searchable: its value sits inside the video, not in metadata or predefined fields.
THE APPROACH
Bridging AI capability with product and delivery reality, working across product, engineering, and an offshore AI supplier to make it work in practice.
LARGE CORPUS
Thousands of video adverts made searchable through content understanding
NATURAL LANGUAGE
Users search with everyday questions, not filters or structured queries
REAL OUTCOME
Relevant results returned in seconds across a large dataset
CASE STUDY
MAKING VIDEO ADS SEARCHABLE THROUGH AI
CONTEXT
Making video ads searchable through content understanding
Test Your Ad contains a large corpus of video adverts. Traditionally, these are not easily searchable because their value sits inside the content of the video, not just metadata. The opportunity was to enable users to find adverts based on what actually happens within the video, using natural language rather than filters or predefined fields.
CHALLENGE
Search the unsearchable and do it instantly
The core challenge was enabling users to search video content in a human way. This meant interpreting intent, understanding what happens within an advert, and returning relevant results quickly across a large dataset. The experience needed to feel natural and fast, while the underlying system handled complex interpretation, indexing, and ranking.
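The interpretation, indexing, and ranking described above follow a common pattern in semantic search: represent each advert's content and each user query as vectors, then rank adverts by similarity. The actual system's internals are not disclosed here, so the sketch below is purely illustrative. It uses a toy bag-of-words embedding over hypothetical machine-generated descriptions of each advert (the ad IDs, descriptions, and `embed` function are all assumptions); a production system would use a learned multimodal model over the video itself.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a learned embedding: a bag-of-words count vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical index: one machine-generated description per advert.
index = {
    "ad-001": "a dog runs along a beach chasing a frisbee",
    "ad-002": "two people drink coffee in a busy city cafe",
    "ad-003": "a family unpacks shopping in a bright kitchen",
}

def search(query: str, k: int = 2) -> list[str]:
    # Rank all adverts by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(index, key=lambda ad: cosine(q, embed(index[ad])),
                    reverse=True)
    return ranked[:k]

print(search("dog playing on the beach"))  # "ad-001" ranks first
```

The key design point this illustrates is that ranking happens over what the content *says*, not over tags: "playing on the beach" matches an advert whose metadata never contains those words.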
APPROACH
Bridge AI capability with product and delivery reality
I worked at the centre of product, engineering, and business during UAT to make this capability work in practice. This involved working closely with the offshore AI supplier to refine how video content was interpreted and indexed, shaping feedback loops to improve relevance, and ensuring results aligned with real user expectations. I also translated the technical behaviour into product language to support positioning and understanding.
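Feedback loops like the one described above are typically grounded in a simple relevance metric so that each supplier iteration can be compared against the last. A minimal sketch, assuming UAT testers label which adverts are relevant for a given query (the ad IDs and judgments here are hypothetical), is precision@k:

```python
def precision_at_k(results: list[str], relevant: set[str], k: int = 3) -> float:
    """Fraction of the top-k results that testers judged relevant."""
    top = results[:k]
    return sum(1 for ad in top if ad in relevant) / k

# Hypothetical UAT judgments for one test query.
relevant = {"ad-001", "ad-007"}

# Rankings returned before and after a model refinement iteration.
before = precision_at_k(["ad-003", "ad-001", "ad-009"], relevant)  # 1 of 3
after  = precision_at_k(["ad-001", "ad-007", "ad-009"], relevant)  # 2 of 3
print(f"{before:.2f} -> {after:.2f}")  # prints "0.33 -> 0.67"
```

Tracking a number like this per iteration turns "does it feel better?" into a structured conversation between product, testers, and the supplier.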
OUTCOME
Natural language access to video insight at speed
The result is a capability that allows users to identify adverts based on what actually happens within them, not just tags or categories. Users can ask questions in natural language and retrieve relevant ads within seconds. This transforms how the dataset can be explored and positions the feature as a strong example of applied AI within customer experience.
DELIVERABLES
WHAT I DELIVERED IN THIS WORK
VIDEO UNDERSTANDING LAYER
Support for interpreting and indexing what happens within adverts rather than relying on metadata alone.
MODEL REFINEMENT
Iterative improvements to how queries are interpreted and results are ranked.
NATURAL LANGUAGE INTERFACE
A way for users to search using everyday language rather than structured filters.
UAT STRUCTURE
A clear process for testing, feedback, and continuous improvement of the capability.
CONTRIBUTION
HOW I SHAPED THIS WORK
Facilitated UAT across product, engineering, and business
Improved model outputs through structured feedback loops
Translated technical AI behaviour into clear product messaging
Worked with offshore AI supplier on video understanding and indexing
Ensured relevance and speed of search across large datasets
Aligned stakeholder expectations with real system behaviour
PRINCIPLES
HOW IT WAS DELIVERED
SEARCH SHOULD FEEL HUMAN
Users should be able to ask questions naturally, without needing to understand how the system works.
CONTENT OVER METADATA
The real value sits in what happens inside the video, not just how it is labelled.
SPEED BUILDS TRUST
Returning relevant results within seconds is critical to making the experience usable.
BRIDGE COMPLEXITY
The system can be complex underneath, but the experience must remain clear and intuitive.