Computer vision, machine learning, and some other stuff I still don't understand.
How can we combine our existing and emerging search technology to save users’ time and allow them to find the right content in new ways?
Searching for media assets can be a long and uninspiring process. Users don't feel in control of the results they're given. They're often disappointed with the quality, composition, and style of the imagery available. There are limited resources for finding content that matches a designer's vision.
- Save time.
- Have more control over their results.
- Find content specific to their creative concept.
- Differentiate search features from competing services.
- Display Adobe's advanced and emerging technology.
A large part of this work was collaborating with engineers and data scientists to understand what was technically possible for us and where any overlaps with user needs existed. Some new technology can be exciting but may not be applicable to actually helping people search for content for a creative project.
As a designer on search, it was often my role to be an intermediary between the user and emerging technology. Usability testing and interviews helped us decide what features were useful, what needed tweaking, and what should be abandoned. The features below are a direct result of a long, ongoing process of collaboration with data science, research, and our product management team.
It's important to note as well that what we didn't ship is just as important as what we did ship. Through our testing, both internally amongst our team and with users, we were also able to identify which features were interesting but just not quite ready for prime time.
Tested and launched in 2017
Aesthetic filters provide new ways to control the visual style of the imagery on a search results page. Whether you're interested in something that pops, has a blurry background, or something more gloomy, you can find it with aesthetic search filters.
The Vivid Color and Depth of Field filters were launched in 2017. Copy Space was added in 2018. The Copy Space feature surfaces content that has areas suitable for text to be overlaid on top of the image.
Depth of Field
Tested and launched in 2018
The 'find similar' feature on Stock was (and probably still is?) a key interaction of the site. While it's a simple feature, users often expressed appreciation for this functionality in usability testing, and our usage data supported this too.
However, users often wondered, "What exactly does the site think I like about this image?" People might be interested in a few aspects of an image, for example, the colors within it, the specific content, or the layout. In general, it's a combination of these factors, but sometimes users have a specific reason.
Similarity intent lets users communicate whether there's a particular aspect of an image they like. The options we made available were a result of technical feasibility and user research. Similarity intent allows a user to indicate if they're interested in the color, composition, content, or all aspects of an image.
This feature was launched in fall 2018.
Unfortunately, this one's a secret.
All I will say is that this work involved extensive, complex interaction design and high fidelity prototyping with Framer.
I'll also take this opportunity to mention our use of the Adobe Stock API.
We often used real data (actual search results) from the Adobe Stock API within Framer prototypes to evaluate the quality of content that a particular feature surfaces. A feature can hypothetically be 'a cool idea,' but if it ultimately provides poor content for the user, it's probably time to pivot to something else.
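As a rough illustration of how a prototype can pull real results, here is a minimal sketch of querying the Adobe Stock Search API. The endpoint, header names, and `search_parameters` keys follow Adobe's public Stock Search API documentation, but treat them as assumptions and check the current docs before relying on them; the API key and product name are hypothetical placeholders.

```typescript
// Sketch: fetching real search results to feed into a prototype.
// Endpoint and parameter names are assumptions based on Adobe's
// public Stock Search API docs; verify against the current docs.

const STOCK_SEARCH_ENDPOINT =
  "https://stock.adobe.io/Rest/Media/1/Search/Files";

// Build a keyword-search URL, capped at `limit` results.
function buildSearchUrl(words: string, limit = 10): string {
  const params = new URLSearchParams({
    "search_parameters[words]": words,
    "search_parameters[limit]": String(limit),
  });
  return `${STOCK_SEARCH_ENDPOINT}?${params.toString()}`;
}

// Fetch results as JSON. `apiKey` comes from an Adobe I/O project;
// "MyPrototype/1.0" is a hypothetical product identifier.
async function searchStock(words: string, apiKey: string): Promise<unknown> {
  const response = await fetch(buildSearchUrl(words), {
    headers: {
      "x-api-key": apiKey,
      "x-Product": "MyPrototype/1.0",
    },
  });
  return response.json();
}
```

In a prototype, the JSON results (thumbnail URLs, titles, and so on) can be mapped straight onto the mocked-up results grid, so design reviews happen against the content users would actually see rather than hand-picked placeholder imagery.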