At last year's E3 gaming convention, Ubisoft unveiled the trailer for the new Star Trek: Bridge Crew virtual reality experience, which is slated for release on May 30. For all of VR's shortcomings, Bridge Crew actually does look promising: players will assume the roles of Federation officers aboard the starship Aegis, where they will bark out orders that determine the fate of the ship and its crew. And as IBM announced today, if you're not playing with other humans (and to be fair, at this point it's going to be hard to find friends with VR headsets), you will be able to play with Watson, the famous Jeopardy!-playing and disease-diagnosing supercomputer.
Following the release of Bridge Crew later this month, players will gain access to IBM's interactive speech capabilities during an experimental beta period this summer. This will allow players to put Watson's natural language processing abilities to use while commanding the Aegis.
The announcement coincides with the release of IBM's VR Speech Sandbox, a GitHub repository where developers can take advantage of Watson Speech to Text and Watson Conversation (that is, its natural language processing ability) in their own VR games.
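To give a flavor of the pattern such a sandbox enables — a speech-to-text transcript matched against known command intents — here's a minimal, purely hypothetical sketch. The intent names and trigger phrases are invented for illustration and aren't taken from IBM's actual sandbox code or Watson's APIs.

```python
import re
from typing import Optional

# Invented intent names mapped to example trigger phrases.
INTENTS = {
    "engage_engines": ["engage", "take us out"],
    "raise_shields": ["raise shields", "shields up"],
    "transport": ["beam me up", "energize"],
}

def match_intent(transcript: str) -> Optional[str]:
    """Return the first intent whose trigger phrase appears in the transcript."""
    # Normalize: lowercase and strip punctuation before matching.
    text = re.sub(r"[^a-z\s]", "", transcript.lower())
    for intent, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None

print(match_intent("Watson, beam me up!"))  # transport
print(match_intent("Bean me up, Watson"))   # None
```

In practice, a service like Watson Conversation handles this matching with trained language models rather than literal phrase lookup, which is what lets it cope (in theory) with the messier ways players actually talk.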
When I tried out the demo pages for Watson's Speech to Text and Conversation apps, the supercomputer struggled to accurately understand what was being said outside of a pretty narrow range.
In fairness, these are only demo versions of the technology, and the version released with Star Trek will likely be more finely tuned and tailored to the needs of the game. But given the difficulties that plague natural language processing algorithms, there's always the possibility that playing Bridge Crew with Watson will be worse than just playing by yourself. Any chance players will get tired of screaming "THAT'S BEAM ME UP, WATSON, NOT BEAN ME UP" at home, alone? Probably, but I just want to know how Watson will interpret my KHAAAAAAAN!