Voice Search In The Metaverse

by Laurie Sullivan, Staff Writer @lauriesullivan, April 15, 2022

Voice-based digital assistants such as Apple Siri, Amazon Alexa, and Google Assistant have brought voice interfaces into homes and automobiles, and advances in artificial intelligence and machine learning will let them mimic human interactions and bring synthesized conversations into the metaverse.

Voice is becoming the primary mode of interaction in the metaverse.

Meta Platforms, Facebook’s parent company, announced in February that it is building a digital voice assistant to help people interact hands-free with physical devices.

Chief Executive Officer Mark Zuckerberg said during a February presentation that digital assistants will need to learn and anticipate human behavior the way people do in order to help users navigate the new online world.

Meta’s voice assistant will pick up on contextual clues in conversations and read physical body language such as facial expressions and hand gestures.

BrightEdge focuses on using data to support search engine optimization, including voice search, to help brands increase revenue through technology and data. Inside Performance (IP) connected with Jim Yu, CEO and founder of BrightEdge, to talk about voice’s role in the metaverse and ways to optimize for this type of search.

Inside Performance:  What is the role of voice search in the metaverse, and how do you describe the interaction?

Jim Yu:  It’s still early days, but people have started to think about the interaction and the role of voice search in the metaverse. It will be a hands-free virtual experience. Think about how you interact with Google Home: it is more integrated into your physical world, but it is still a separate command interface.

You can interact in a voice mode with a voice command, but these are short searches. You search for something using your voice — set an alarm, complete a task. It’s an informational search.

The user looks for a fast response. The voice search is separate from the interaction. In the metaverse, that interaction will be continuous and integrated.

IP:  How will companies become proficient with voice in the metaverse?

Yu:  The building blocks start with rich content around information. When there is context in the interaction, the algorithms don’t need a lot of structure to understand the information. Marketers will need to create a much richer context around the information they create, so the AI can make sense of voice queries.

IP:  How should brands think about optimizing for search?

Yu:  The basic building blocks also start with understanding the areas in which you need to create schema.

Formats will evolve to provide a richer set of information for brands to participate in these experiences. We’ve seen an increase in marketers paying attention to location to get more context, which is a good starting point.
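
Schema markup of this kind is typically published as JSON-LD embedded in a page. As a minimal sketch of the structured context Yu describes, assuming a hypothetical store page with placeholder question-and-answer and address details (none of which come from BrightEdge), a script could generate the markup like this:

import json

# Minimal schema.org structured data of the kind search engines can draw on to
# answer spoken, question-style queries. All names and addresses below are
# placeholders for illustration.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What time does the downtown store open?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "The downtown store opens at 9 a.m., Monday through Saturday.",
            },
        }
    ],
}

# Location context, which Yu calls a good starting point for voice queries.
local_markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Store",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
}

# Emit the <script> blocks a page template would place in its <head>.
for markup in (faq_markup, local_markup):
    print('<script type="application/ld+json">')
    print(json.dumps(markup, indent=2))
    print("</script>")

Markup along these lines gives a crawler explicit question-and-answer pairs and location details, the kind of context that conversational and location-driven voice queries tend to require.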

IP:  Is voice search still being used more on mobile than desktop?

Yu:  Yes, it’s still used more on mobile, driven by location searches, directional questions, and intentional queries.

IP:  Do voice systems still have challenges around words that have more than one meaning?

Yu:  Natural-language processing is getting smarter. Location is a great example: the system knows your location for the searches it performs, and sometimes that’s very important. It’s about adding data and information around context, so the system can provide accurate answers.

IP:  Do you have recent data that speaks to the use of voice search?

Yu:  We recently surveyed 600 enterprise brands — our customers, many in the Fortune 500 — and saw that 41% are not putting any emphasis on conversational search, 52% are doing something, and 7% are putting a lot of emphasis on it.
