How the IoT interacts with human language
    Swedish Natural Language Interaction (NLI) specialist Artificial Solutions has launched its Teneo Analytics suite. The software aims to provide predictive metrics on conversations between humans and the technology that they use.

    The suite comprises three modules: Teneo Insight, Teneo Discovery and the Teneo Analytics API. Together they crunch raw, unstructured conversational data from sources such as virtual assistants, live chat, call transcripts and emails.

    Artificial Solutions hopes that access to this data will give companies deeper insight into customer preferences and decision-making, allowing them to fully personalize the way they interact with each individual consumer.

    The ability to summarize conversational data alongside context, metadata and other criteria is a tough task. Teneo Analytics aims to enable companies to use conversational data to create data-driven applications and automatically train Artificial Intelligence (AI) systems to anticipate customer needs.

    Teneo Discovery is the suite's text-mining tool. The Teneo Analytics API allows organizations to automatically tailor communication based on each unique interaction. Teneo Insight is designed to track and analyze the conversations that users have with their Teneo applications, providing the insight and information needed to track and refine KPIs and business targets.

    How NLI affects the IoT

    Andy Peart, chief strategy officer at Artificial Solutions, spoke to Internet of Business to clarify the connection between NLI and the IoT.

    Peart claims that Teneo Analytics is the first tool able to interpret highly complex conversational data. This matters for the IoT because Artificial Solutions believes that in the future there will be three main levels of interaction with the wider IoT:

    1. A master controlling device such as a smartphone or wearable personalized to the individual.
    2. A few shared devices in a private eco-system such as house management or automotive.
    3. A multitude of open domain devices such as speed sensors on public highways.

    “It’s highly unlikely that washing machines, door locks and thermostats are going to have the computing power to be controlled individually by voice, except for maybe some basic commands. Users are most likely to control their surroundings by talking to their smartphone. However, the free format, unstructured data that is generated from these conversations makes it a challenge for many organisations to understand the user’s true intent, which is required to react intelligently,” said Peart.

    For example, “It’s hot” might refer to the weather, the stove or the new iPhone; it needs to be put into context with the other parts of the conversation. Then there is the multitude of different ways humans say the same thing, and the need to understand that “great” isn’t always meant in a positive way.

    Peart explains that if the previous question was “What’s the weather like tomorrow?”, then analytics can tie the meaning of “It’s hot” to other things the system knows about the user, such as their thermostat setting, and ask if they’d like it turned down, or offer to set the alarm an hour earlier so that the user’s morning run can be taken in cooler air.
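    The kind of context-based disambiguation Peart describes can be illustrated with a minimal sketch. This is not Teneo's actual API; the function, keyword lists and topic names below are invented for illustration only — they simply show how the prior turn in a conversation can anchor an ambiguous utterance to a topic.

```python
# Hypothetical sketch: resolving an ambiguous utterance ("It's hot")
# using the previous conversational turn as context.

def resolve_intent(utterance: str, previous_utterance: str) -> str:
    """Pick a topic for an ambiguous utterance based on the prior turn."""
    # Invented keyword lists; a real NLI system would use far richer models.
    context_keywords = {
        "weather": ["weather", "forecast", "temperature", "rain"],
        "stove": ["stove", "oven", "cooking", "preheat"],
        "iphone": ["iphone", "phone", "release"],
    }
    prev = previous_utterance.lower()
    for topic, keywords in context_keywords.items():
        if any(word in prev for word in keywords):
            return topic
    return "unknown"

print(resolve_intent("It's hot", "What's the weather like tomorrow?"))  # weather
```

    A real system would also carry the resolved topic forward, so that a follow-up action (such as offering to lower the thermostat) can be proposed from the same context.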

    Talking to devices

    “Users are already expecting to be able to interact with devices, services and businesses using natural language and regardless of whether the input is text or voice, it all ends up as unstructured data. Masses of it. But if businesses can understand and interpret it, the data holds a wealth of information direct from their own customers and users,” said Peart.

    This can then be appended with knowledge from other services and IoT devices to give a more complete picture of the conversation: for example, where the user was at the time, what action they took next, and what input they were commenting on. All of this can then be linked, via the API, with existing business intelligence tools for further analysis.
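    The enrichment step described above can be sketched as a simple data structure. This is an illustrative assumption, not Teneo's data model: the class, field names and example values are hypothetical, showing only how contextual metadata might be attached to a raw conversational event before it is handed to a BI tool.

```python
# Hypothetical sketch: attaching IoT/context metadata to a raw
# conversational log entry before exporting it for BI analysis.
from dataclasses import dataclass, field, asdict


@dataclass
class ConversationEvent:
    utterance: str
    channel: str                       # e.g. "virtual assistant", "live chat"
    metadata: dict = field(default_factory=dict)

    def enrich(self, **context) -> "ConversationEvent":
        """Append context such as location or the user's next action."""
        self.metadata.update(context)
        return self


event = ConversationEvent("It's hot", "virtual assistant")
event.enrich(location="kitchen", next_action="lowered thermostat")
print(asdict(event))  # a plain dict, ready for a BI pipeline
```

    Serializing to a plain dictionary at the end mirrors the article's point: once structured, the data can flow into existing business-intelligence tooling.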

    Peart concludes by saying that although natural language analytics is not new, until now organisations were only able to look at individual words and then make assumptions.

    “Teneo Analytics changes this by enabling enterprises to fully understand the nuance of every conversation in real-time and to use that information to further its interaction with the user, as well as developing the insight required to grow the business. Furthermore, this understanding can be fed back into the natural language applications built using Teneo and used to personalize the responses in near real-time, increasing the conversational ability of the Teneo based apps and making them more humanlike and intuitive,” he said.