Tamebay Position Paper:
Search used to be all about keywords and was text based. In the 1990s, when the Internet first appeared, most sites were text only, with images little more than clipart. This was due to several reasons: server space was expensive, download speeds were slow (often as little as a 9.6 or 14.4 kbit/s modem), and browsing images simply wasn’t practical.
Today the Internet is a very different place, rich with images and video, and services have become possible which weren’t even dreamt of in the 1990s. However, many approaches to search and SEO have changed little in the intervening years, with publishers still obsessed with keyword density and SEO backlinks to attract traffic.
Search engines today, whether Internet search engines or marketplace search engines, are much more sophisticated, relying on many more signals than text alone. To be successful in today’s world, ecommerce merchants need to adjust their strategies and take advantage of the new technologies that search has to offer.
In this article we will explore how search has changed. Although text search will continue to underpin the Internet for many years into the future, new technologies such as voice and image search, along with virtual reality and augmented reality, will impact the way consumers buy products and hence how retailers should tailor their product presentation.
Text search is relatively easy to understand, but even here there are several strategies that can be employed in ecommerce to ensure that your products are surfaced, many of which you’ll doubtless already be familiar with.
Titles and product descriptions are the de facto fundamentals, but structured data and product identifiers are increasingly important. Both marketplace and Internet search engines are starting to demand GTINs to identify a product with certainty, and connected to the GTIN is a known set of product attributes in a catalogue database that can assist in surfacing relevant search results.
There are two key steps you can take to make your products easier to index:
- Consider tabulating key product information in a list. Search engines understand both ordered (lettered or numbered) and unordered (bullet point) lists and can interpret this as a set of structured data about the product.
This is taken to the next step with GS1 SmartSearch, an external extension to Schema.org in which structured data is related to the GTIN. It converts your product data from a description such as “Green Wellington Boots Size 4 £29.99” to “Product Name: Wellington Boots; Colour: Green; Size: 4; Currency: £; Price: 29.99”. Generally, there will be many more attributes, but structured data makes it much easier for a search engine to understand the product information and in turn provide more accurate search results to consumers.
- Consider using FAQs in your product descriptions – Increasingly consumers type queries into search engines and for certain products having the question and answer in your description can pay dividends when consumers search the web for general information.
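The structured-data idea described above can be sketched in code. The snippet below builds a minimal Schema.org Product record as JSON-LD, the format search engines commonly read; the product name, GTIN, and price values are purely illustrative, not real catalogue data.

```python
import json

# A minimal Schema.org Product record, echoing the Wellington boots
# example above. The GTIN and price here are invented for illustration.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Wellington Boots",
    "color": "Green",
    "size": "4",
    "gtin13": "5012345678900",  # hypothetical GTIN
    "offers": {
        "@type": "Offer",
        "priceCurrency": "GBP",
        "price": "29.99",
    },
}

# Embedded in a page inside a <script type="application/ld+json"> tag,
# this lets a search engine read the attributes directly rather than
# inferring them from free text.
json_ld = json.dumps(product, indent=2)
print(json_ld)
```

In practice the record would carry many more attributes (brand, material, availability and so on), but even this small set turns a free-text description into machine-readable fields.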
Conversational language in descriptions will become increasingly important as consumers use voice search more frequently, driven by the rise of Virtual Assistants.
Voice search is closely related to text search but has subtle differences. You might think that voice is just a different way of entering a text search via speech-to-text, but consumers are somewhat more polite and verbose when speaking a command.
Rather than typing in a couple of keywords and hitting return, voice searches tend to be much more conversational. It’s also possible that Schema.org style mark-up may have little or no impact on search results and even titles have less relevance, but long form content performs better indicating that longer product descriptions could bear fruit. Many voice searches are in the form of question-based queries which is why FAQ style sections in product descriptions make sense.
It’s also worth remembering that search engines have hitherto returned multiple results per page (10 on the first page of Google search results, 25 on eBay, 24 on Amazon), but when performing a voice search only a single result will be presented.
Although it’s not possible on marketplaces, long form content with conversational language is likely to perform well in search results so consider landing pages for your main product sets with in-depth long form content around the subject and highlighting your key products – don’t forget that whilst voice searches on Virtual Assistants will return a single search result, voice searches on devices such as mobiles and tablets may well load an entire webpage. Even voice assistants will often send a webpage to the relevant Virtual Assistant app for the consumer to browse at their leisure.
Voice search will increasingly tie devices to platforms. For instance, Amazon Echo and other Alexa-powered devices will only return products listed on Amazon. Google Assistant will (so far in the US only) return Google Express shopping results, although (currently in the US and Australia) you can also query eBay. Samsung don’t have their own shopping solution and have instead chosen for voice searches through Samsung Bixby to return results from Amazon. Only by having your products listed on the relevant platform will specific shopping voice searches surface your products.
For many years, platforms asked merchants to provide ever larger, clearer product shots, preferably on a white background. Initially this was because such shots are clearer and easier for consumers to evaluate visually, but today it’s essential for image search.
The major advantage of image search is the ability to find products which are difficult to describe, for example ornaments and art. Image search removes the need to describe the artefact and simply taking a photo enables the consumer to search and buy similar products.
There are two distinct types of image search: the older form relied on inferring the subject from related information, whilst the newer form is much more sophisticated, analysing the actual image itself.
Image search started by using signals within or around the image. Common advice has in the past been to rename the file name to relate to the image, add alt tags and image descriptions, and place contextual information around the image. Whilst this enabled search engines to infer the content of the image it was limited when finding similar images based on similar tagging and did little more than reduce images into a subset of text search.
Today, Image search is based on the image itself and whilst adding tags and using relevant file names still makes sense, when an image is uploaded to a shopping search engine the image itself is analysed. Details such as shapes, lines, colours and proportions are converted into a mathematical model from which an image database can be searched to find other images with similar attributes.
Two key components of Artificial Intelligence are utilised to power image search results: computer vision and deep learning. Consumers may take a photo, use an image already on their device, or pick one from any webpage; the image is then analysed and compared to the image database to present visually similar images and, for ecommerce, products for the consumer to purchase.
The aim in image analysis is to understand the content of the image and infer enough unique attributes to be able to compare it to the image database and surface similar images and for ecommerce this means similar products.
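The comparison step described above can be illustrated with a toy sketch. In this hypothetical example each image has already been reduced to a small feature vector (hand-made numbers standing in for the shapes, colours and proportions a real computer-vision model would extract), and candidate products are ranked by cosine similarity to the query image. All product names and vectors are invented for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "image database": product name -> invented feature vector.
# A real system would derive these vectors from the images themselves
# using computer vision and deep learning.
catalogue = {
    "green wellington boots": [0.1, 0.8, 0.1, 0.6],
    "red wellington boots":   [0.8, 0.1, 0.1, 0.6],
    "green garden hose":      [0.1, 0.7, 0.2, 0.1],
}

def find_similar(query_vector, top_n=2):
    """Rank catalogue items by similarity to the query image's vector."""
    scored = [(name, cosine_similarity(query_vector, vec))
              for name, vec in catalogue.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_n]

# A shopper's photo of green boots, reduced to the same feature space.
query = [0.12, 0.78, 0.1, 0.58]
results = find_similar(query)
```

The nearest match wins, which is why distinctive, well-lit product shots matter: the more faithfully the photograph captures the product's attributes, the closer its vector sits to the shopper's query.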
In marketplace terms, eBay are leading the field with their Image Search and Find It on eBay solutions which have rolled out in the US.
- Find It On eBay is a feature in the eBay app and mobile platform that lets you share images from any social platform or web browser. All you have to do is “share” the image with eBay and their mobile app will find listings of the item in that image or others like it.
- With Image Search, you can take a photo of something you want to buy—or use an existing photo from your camera roll—and put it into the eBay Search bar on eBay native apps. Then, they’ll show you listings that match the item you are looking for.
Where this becomes key for retailers is understanding how product shots from different angles can assist search engines in analysing their product. For instance, something as simple as a shirt, if photographed in its retail packaging, may not reveal whether it’s a long- or short-sleeved shirt. By shooting photographs on a mannequin (or even laid flat on a surface), image search will immediately be able to analyse the image and infer the sleeve length, enabling it to be better compared to similar products.
Virtual Reality has been promised as the future of shopping, but the reality is that consumers aren’t ready to take the plunge.
There are some applications where Virtual Reality has its place, for instance the ability to walk around a virtual show house when buying off plan or when buying a new kitchen and being able to do a virtual walk through. Where virtual reality fails are solutions that envisage a consumer donning a headset and walking through a virtual supermarket to do their weekly shop.
Wearing a virtual reality headset and using a couple of Wii-style controllers to select products from the virtual shelves of a supermarket has two major failings. In practical terms, few consumers own the equipment required; more importantly, wearing a virtual reality headset can be an incredibly disorientating experience and can cause nausea.
Virtual Reality has seen huge success in gaming as it gives the ability to be fully immersed in a fantasy world but there are few real-world applications for ecommerce.
Whilst many Virtual Reality implementations are fun marketing gimmicks, one of the few successes has been eBay-owned StubHub, the ticketing site, where virtual reality can assist fans in choosing their seats. It offers an immersive, 360-degree view from any seat in a stadium. Even here, however, it’s acknowledged that many users won’t have a headset, and so the virtual view can be seen either through a headset or on a regular computer screen.
The reality of Virtual Reality is that the technology is ahead of consumers and it is still only utilised in limited scenarios.
Augmented Reality can be considered a specialised subset of Virtual Reality. Augmented Reality takes a view of the real world and overlays additional information on top. The big advantage of Augmented Reality is that it requires nothing more than a smartphone to add an overlay to the real world, compared to Virtual Reality, which will generally require a headset and potentially handheld controllers to navigate and select options.
Augmented Reality has been popularised by services such as Snapchat filters, which add overlays to photos. Want to look like a barking dog or don some flowery spectacles? It can all be done with Snapchat filters and a smartphone.
One of the best-known and most successful implementations of Augmented Reality was Pokémon Go, which, when launched in 2016, saw adults and children around the world chasing virtual creatures in the real world through the lens of their smartphones. The smartphone displayed the real world as seen through its camera, and the Pokémon were overlaid by the Augmented Reality app.
There are many real-world applications of Augmented Reality, such as make-up apps like Makeup Genius from L’Oreal. Makeup Genius enables consumers to pick preselected ‘looks’ or pick and choose from a range of L’Oreal products to create their own look. As you select different products, the app virtually applies them to your face, and you can even look from side to side to see the effect from different angles. Naturally, once you’re happy with your look you can purchase the selected products in a single transaction.
In a similar manner “see it on” technology enables consumers to virtually try on products ranging from clothing to visualising how a new sofa will fit in your front room.
Both Augmented Reality and Virtual Reality are largely beyond the reach of smaller online retailers and best left to third parties such as marketplaces and search engines to implement. Larger retailers are experimenting with their own implementations, and as early as 2011 eBay were showcasing an Augmented Reality sunglasses feature in their iPhone fashion app, enabling consumers to virtually try on sunglasses available on the marketplace.
The evolution of search
The way that consumers search will impact how online retailers present their products to the world, but it’s important not to get side-tracked by the glamorous future of Virtual and Augmented Reality. Getting the basics right for text search is still critical, as this is how most consumers will habitually enter queries. With the rise of voice search, however, tailoring descriptions with natural language and lists, and adding landing pages to provide rich content, will pay dividends.
Image search will increasingly become mainstream, enabling consumers to search without having to second-guess the keywords a retailer may have used. Multiple images from different angles on clear backgrounds therefore do more than power visual search; they replace the keywords and titles retailers relied upon in the past.
As has always been the case, the retailers who invest the time in tailoring their product listings will ultimately be the most successful. However, what worked a decade ago may not necessarily still work today, and if you’re wondering why traffic is falling, it may well be because your product listings lack the structured data, the conversational keywords used in voice search, or the images that can be analysed to best effect.