Conversational assistants such as Alexa, Google Assistant, Siri, and Cortana have come a long way since Siri first appeared in 2011. Most of the time, these assistants recognize spoken words correctly, and they can reliably answer simple everyday questions about topics such as the weather. They now work so well that it’s easy to forget they still have a long way to go. What they can do is roughly what you would expect if you asked the same question of a 10-year-old with a mobile phone: very helpful, but not always enough.

A simple example of what generic assistants (or, really, any search engine) can’t handle is enterprise-internal information that isn’t on the web. This information could be entirely internal, such as human resources policies, or it could be accessible only to customers with an account, such as order tracking. Generic assistants are not going to be able to answer enterprise-specific questions such as “How long do I have to work at my company to get 4 weeks of vacation?” or “When will my new laptop be delivered?” With thousands of organizations holding vast amounts of enterprise-specific information, generic assistants can’t possibly make it all available to their users. Organizations that want to offer this kind of helpful information have to build the capability themselves.

Fortunately, this can be done, and there are good tools available to do it. Probably the best known is the Alexa Skills Kit, provided by Amazon. Developing the voice user interface part of an application with the Alexa Skills Kit is not difficult: the kit has been used to create more than 70,000 conversational Alexa skills, by developers at every level of experience. At least one sixth grader has created and published several Alexa skills.
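To give a sense of what that development involves, here is a minimal sketch of a skill request handler, assuming the ASK SDK for Python. The intent name VacationPolicyIntent and the canned answer are hypothetical placeholders, not part of any published skill.

```python
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name


class VacationPolicyIntentHandler(AbstractRequestHandler):
    """Responds to a hypothetical VacationPolicyIntent defined in the skill's interaction model."""

    def can_handle(self, handler_input):
        return is_intent_name("VacationPolicyIntent")(handler_input)

    def handle(self, handler_input):
        # Canned answer for illustration; a real skill would pull this from back-end data.
        speech = "You earn four weeks of vacation after ten years of service."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(VacationPolicyIntentHandler())

# Entry point when the skill is deployed as an AWS Lambda function.
lambda_handler = sb.lambda_handler()
```

The interaction model (the sample utterances that map a spoken request to VacationPolicyIntent) is defined separately in the Alexa developer console, which is a big part of why the voice user interface side is the approachable part.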

Many Alexa skills are simple applications that developers built just to learn the Skills Kit, but some of them are working enterprise-level applications. For example, the Best Buy Alexa Skill can list items on sale, track orders, or purchase items. It needs work, as evidenced by its three-and-a-half-star rating, but it is a first step. Microsoft’s LUIS and Google’s Dialogflow are similar tools that can be used to develop custom applications.

These enterprise assistants hold a lot of potential in areas such as customer support, shopping, and access to enterprise data. Like the generic assistants, custom applications must be easy to use: they have to give the user a frictionless voice user interface, recognize speech correctly, and understand what’s being asked of them. These basic capabilities are all extremely important. Nevertheless, it’s easy to overlook what happens after that: the integration step that actually gets results from the enterprise’s back-end information resources. Those resources can include webpages, databases, and documents, and combinations of these sources supply the data that’s needed to actually answer users’ questions.
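As a sketch of what that integration step might look like, the function below fetches an order record from a hypothetical enterprise web service and turns it into a sentence the assistant can speak. The URL, parameters, and JSON fields are illustrative assumptions, not a real API.

```python
import requests

# Hypothetical internal order-tracking endpoint; a real enterprise would expose its own API.
ORDER_API = "https://internal.example.com/api/orders"


def order_status_speech(customer_id: str, order_id: str) -> str:
    """Turn a back-end order record into a spoken answer for the assistant."""
    resp = requests.get(
        f"{ORDER_API}/{order_id}",
        params={"customer": customer_id},
        timeout=5,
    )
    resp.raise_for_status()
    order = resp.json()  # assumed shape: {"status": "...", "eta": "..."}
    return f"Your order is {order['status']} and should arrive on {order['eta']}."
```

A skill’s intent handler would call a function like this and hand the returned sentence to its response builder, so the hard part is usually not the handler itself but standing up and securing the back-end service it calls.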

In fact, one reason we aren’t yet seeing many enterprise assistants is the difficulty and expense of getting at this back-end information. Fortunately, there are tools that can help. Web services already provide streamlined web-based access to enterprise information for websites, and conversational assistants can use them as well. Another option for accessing back-end information is to use natural language technology to extract data from existing documents or webpages. This application of natural language technology is called information extraction. Information extraction systems analyze existing written text (for example, user manuals, technical documentation, and employee handbooks) and convert it into a machine-readable structure. For example, “How long do I have to work here to get 4 weeks of vacation?” could be answered with information extracted from human resources manuals, and “How do I change the toner cartridge in my printer?” could be answered with information extracted from user manuals. Of course, users could also find this information by perusing a several-hundred-page HR document, but that’s not very convenient.
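As a toy illustration of the idea, the snippet below pulls vacation-policy rules out of a few invented handbook sentences with a regular expression and stores them as machine-readable records. Production information extraction systems rely on trained natural language models rather than hand-written patterns, and the sample text here is made up for the example.

```python
import re

# Invented sample text standing in for a much longer HR manual.
HANDBOOK = """
Employees receive 2 weeks of vacation after 1 year of service.
Employees receive 3 weeks of vacation after 5 years of service.
Employees receive 4 weeks of vacation after 10 years of service.
"""

RULE = re.compile(r"(\d+) weeks of vacation after (\d+) years?")

# Extract each rule into a structured, machine-readable record.
policy = [{"weeks": int(w), "years_required": int(y)} for w, y in RULE.findall(HANDBOOK)]


def years_needed_for(weeks):
    """Answer 'How long do I have to work here to get N weeks of vacation?'"""
    for rule in policy:
        if rule["weeks"] == weeks:
            return rule["years_required"]
    return None


print(years_needed_for(4))  # prints 10 for the sample text above
```

Once the rules are in a structured form like this, an assistant can answer the question with a single lookup instead of sending the user off to read the handbook.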

We’re getting more and more used to the convenience of Alexa, Siri, Google Assistant, and Cortana. Enterprise conversational assistants are starting to extend this convenience to the vast amount of information that’s currently hidden in enterprise resources.