DURHAM, NORTH CAROLINA — IBM Watson came to Moogfest 2016, but there were no Jeopardy! questions this time around. If you’ve been following ExtremeTech, you already know that IBM Watson, an artificially intelligent system capable of answering questions in natural language, has been up to much more than that recently. At Moogfest, IBM Watson team spokesperson Ally Schneider was on hand to outline all of the latest developments.
Everyone remembers Watson from its Jeopardy! performance on television in 2011. But work on the project began much earlier — not just in 2006, when three researchers at IBM first got the idea to build a system for the game show, but decades before that, when IBM began working on natural language processing and cognitive computing in the 1970s.
The Jeopardy! Watson system in 2011 had three main abilities, as Schneider explained. First, it could understand unstructured text. “[Normally] we don’t have to think about it, but we inherently understand what sentences are, and how verbs, nouns, etc. come together to produce text,” Schneider said. Watson could read through human-generated content and parse it in a way that other systems hadn’t been able to do before. Next, Watson could come up with its own hypotheses and then return the one with the highest confidence. Finally, there was a machine-learning component — one that wasn’t hard-coded or programmed, but that really learned as it went. “When you were back in school, not too long ago for some, how did your teachers test you to see if you understood what you were reading?” Schneider asked. “They would give you feedback on your answers. [For example], yes, full credit… maybe you got partial credit… or no, incorrect, here’s what you should have done instead.” Watson is able to “reason” in the same manner.
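To make that hypothesize-and-rank pattern concrete, here is a minimal Python sketch of the idea: generate candidate answers from a corpus, score each one, and return the highest-confidence result. To be clear, this is our own toy illustration, not IBM's DeepQA pipeline; the functions and the scoring rule are invented for demonstration.

```python
# A toy version of the hypothesize-and-rank pattern described above.
# Not IBM's actual pipeline; the generation and scoring are invented.

def generate_hypotheses(question, corpus):
    """Toy candidate generation: any passage sharing a word with the question."""
    keywords = set(question.lower().split())
    return [p for p in corpus if keywords & set(p.lower().split())]

def confidence(question, candidate):
    """Toy scoring: fraction of question words the candidate covers."""
    q = set(question.lower().split())
    return len(q & set(candidate.lower().split())) / len(q)

def answer(question, corpus):
    """Return the highest-confidence hypothesis, plus its score."""
    hypotheses = generate_hypotheses(question, corpus)
    if not hypotheses:
        return None, 0.0
    best = max(hypotheses, key=lambda h: confidence(question, h))
    return best, confidence(question, best)

corpus = [
    "Watson won Jeopardy in 2011",
    "IBM began natural language processing research in the 1970s",
]
print(answer("when did Watson win Jeopardy", corpus))
```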
Today, after continuous improvements, Watson consists of 30 open APIs across four categories: language, speech, vision, and data insights. “Watson [today] has the ability to read through and understand unstructured data like a human and pull out the relevant answers and insights and now images,” Schneider said. She then illustrated some recent examples of Watson’s power. The first and arguably most significant was a joint effort with Memorial Sloan Kettering Cancer Center. The goal was to train Watson to think like a doctor, in order to assist oncologists working with breast and colon cancers. IBM’s team fed Watson a steady diet of medical journals, clinical trial results, encyclopedias, and textbooks to teach it the language of medicine.
From there, Watson could take a patient’s individual information, compare it against what the system knows about medicine, and come back with recommended treatment options. Schneider said it’s still up to the doctor to decide how to use that information; it’s not a question of man versus machine, but of how machines can enhance what humans already do. In this case, the goal was to empower doctors who would otherwise have to read an impossible 160 hours’ worth of material each week — an actual estimate of how much new medical research is published weekly.
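As a rough illustration of that workflow (and nothing like Watson for Oncology's actual models), here is a toy sketch: match a patient's profile against a small knowledge base and return ranked treatment options with confidence scores. The knowledge base, features, and scoring here are entirely made up.

```python
# Hypothetical sketch: rank treatments by how well a patient's features
# match each entry's indications. All data and scoring are invented.

knowledge_base = [
    {"treatment": "Regimen A", "indications": {"breast", "her2-positive"}},
    {"treatment": "Regimen B", "indications": {"breast", "her2-negative"}},
    {"treatment": "Regimen C", "indications": {"colon", "stage-3"}},
]

def recommend(patient_features):
    """Score each treatment by the fraction of its indications the patient matches."""
    scored = []
    for entry in knowledge_base:
        overlap = patient_features & entry["indications"]
        scored.append((len(overlap) / len(entry["indications"]), entry["treatment"]))
    return sorted(scored, reverse=True)

# The oncologist, not the system, makes the final call on these options.
print(recommend({"breast", "her2-positive", "stage-2"}))
```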
Next up was an application for the music industry. Quantone delivers in-depth data on music consumption. It not only leverages structured metadata the way Pandora, Spotify, and other music services do (the genre of a track, its tempo, and so on), but, using IBM Watson technologies, it can also process unstructured data such as album reviews and artist-curated content, and handle queries through natural language classification. Using Quantone, as Schneider put it, an end user can say, “I’m looking for a playlist reminiscent of Michael Jackson from a certain time period,” and get an answer that also pulls in and considers that unstructured data.
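Here is a toy Python sketch of what such a structured-plus-unstructured query might look like: filter on structured metadata first, then rank the survivors against unstructured review text. The catalog, fields, and matching logic are all invented for illustration and don't reflect Quantone's or Watson's actual data model.

```python
# Invented mini-catalog mixing structured fields with unstructured reviews.
catalog = [
    {"artist": "Artist X", "year": 1983, "genre": "pop", "bpm": 116,
     "reviews": "funk-inflected pop with crisp falsetto hooks"},
    {"artist": "Artist Y", "year": 1995, "genre": "rock", "bpm": 128,
     "reviews": "grunge-era guitars and shouted choruses"},
]

def playlist_query(genre, year_range, description):
    """Filter on structured metadata, then rank on unstructured review text."""
    wanted = set(description.lower().split())
    hits = [t for t in catalog
            if t["genre"] == genre and year_range[0] <= t["year"] <= year_range[1]]
    return sorted(hits, key=lambda t: -len(wanted & set(t["reviews"].split())))

# "A playlist reminiscent of Michael Jackson from a certain time period"
print(playlist_query("pop", (1980, 1990), "funk pop falsetto"))
```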
Content creators can also benefit from AI-infused programming. Sampack offers algorithmically and artistically generated samples that are royalty-free; it’s essentially an automated, license-free music sample generator. It takes in descriptions of tones (such as “dark” or “mellow”) and, using Watson’s Tone Analyzer capability, translates those descriptions and emotions into music effects, sounds, and filters.
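To illustrate the idea, here is a hypothetical sketch that takes tone scores of the general shape Watson's Tone Analyzer returns (label/score pairs) and maps them onto synthesis parameters. The specific tone-to-sound mappings below are our own inventions, not Sampack's.

```python
# Hypothetical mapping from Tone Analyzer-style scores to synth settings.

def tones_to_patch(tones):
    """Map tone scores (0..1) onto rough synth/filter parameters."""
    patch = {"cutoff_hz": 8000, "reverb_mix": 0.2, "waveform": "saw"}
    for tone in tones:
        if tone["tone_id"] == "sadness":      # darker -> close the filter
            patch["cutoff_hz"] -= int(6000 * tone["score"])
        if tone["tone_id"] == "joy":          # brighter -> open it up
            patch["cutoff_hz"] += int(4000 * tone["score"])
        if tone["tone_id"] == "tentative":    # mellow -> more reverb
            patch["reverb_mix"] += 0.5 * tone["score"]
    return patch

# A "dark" description scoring high on sadness closes the filter down.
print(tones_to_patch([{"tone_id": "sadness", "score": 0.8}]))
```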
IBM also published a cookbook recently, which, as Schneider pointed out, isn’t something you would have expected to hear before it happened. The book is called Cognitive Cooking with Chef Watson: Recipes for Innovation from IBM & the Institute of Culinary Education. Watson analyzes the molecular composition of foods and figures out what goes well together; it takes in inputs such as specific ingredients to include and what to exclude (such as gluten or other allergy triggers), and then creates 100 new recipes from that query. It doesn’t search through an existing recipe database, either; every recipe is newly generated from your inputs. The first recipe is usually pretty normal; by the time it gets to recipe 100, it’s “a little out there,” as Schneider put it.
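That generate-don't-retrieve idea is easy to sketch. The toy below composes new ingredient combinations from pairwise pairing scores, honors exclusions, and ranks the results so that the tail of the list drifts furthest from conventional pairings. The pairing table and scoring are invented; Chef Watson's real chemistry-informed models are far more sophisticated.

```python
# Toy recipe generator: compose combinations, don't retrieve them.
import itertools

pairing_score = {                      # higher = more conventional pairing
    ("apple", "basil"): 0.55, ("apple", "chocolate"): 0.70,
    ("apple", "pork"): 0.90, ("basil", "tomato"): 0.95,
    ("chocolate", "pork"): 0.40, ("chocolate", "tomato"): 0.30,
}

def score(combo):
    """Sum pairwise pairing scores; unknown pairs get a neutral 0.5."""
    return sum(pairing_score.get(tuple(sorted(p)), 0.5)
               for p in itertools.combinations(combo, 2))

def recipes(include, exclude, n=5):
    """Generate new ingredient combos honoring inclusions and exclusions."""
    pantry = {"tomato", "basil", "pork", "apple", "chocolate"} - set(exclude)
    combos = [set(c) | set(include)
              for c in itertools.combinations(pantry - set(include), 2)]
    # Most conventional first; the tail of the list is "a little out there."
    return sorted(combos, key=score, reverse=True)[:n]

print(recipes(include=["apple"], exclude=["gluten"]))
```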
In the art world, World of Watson was a recent exhibit (pictured below) by Stephen Holding in Brooklyn, created in collaboration with IBM Watson using a variation of a color API. Watson mined Watson-specific brand imagery and came up with a suggested color palette for Holding to use. The goal was to evoke innovation, passion, and creativity with an original piece of art.
Pictured: Stephen Holding’s World of Watson artwork, created in collaboration with IBM Watson.
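IBM hasn't published the internals of that color API, but the general technique is familiar: cluster an image's pixels and report the dominant colors as a palette. Here is a generic k-means sketch of that idea in NumPy, run on fake "brand imagery"; it is an illustration of the concept, not Watson's method.

```python
# Generic dominant-color extraction via plain k-means (not IBM's API).
import numpy as np

def dominant_colors(pixels, k=5, iters=20, seed=0):
    """pixels: (N, 3) float array of RGB values in 0..255."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest center, then recompute centers.
        d = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = pixels[labels == j].mean(0)
    return centers.astype(int)

# Fake "brand imagery": random pixels clustered around two brand colors.
img = np.vstack([np.random.normal((0, 60, 120), 10, (500, 3)),
                 np.random.normal((230, 200, 40), 10, (500, 3))])
print(dominant_colors(img, k=2))
```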
Finally, IBM Watson partnered with fashion label Marchesa for the recent Metropolitan Museum of Art gala, where model Karolina Kurkova wore the result. Watson was tasked with coming up with a new dress design that was “inherently Marchesa and true to the brand,” and it was involved every step of the way. First, using another variation of the color API, Watson mined hundreds of images from Marchesa, including model photos, to get a feel for the brand’s color palette, Schneider said. Then Inno360 (an IBM Watson ecosystem partner) used several APIs and considered 40,000 options for fabric; with inputs from Marchesa to stay consistent with the brand, and while also evaluating fabrics that would work with embedded LEDs, Watson narrowed those down to 35 distinct choices. The third step was embedding the LED technology into the dress and driving it with the Tone Analyzer, so that specific colors lit up through the dress’s flowers in response to fans’ emotions during the gala.
All lit up. How @MarchesaFashion + IBM’s #CognitiveDress brought fans’ emotions to life at the #MetGala. Worn by the amazing @karolinakurkova. Check the link in our profile for all the details!
A video posted by IBM (@ibm) on May 3, 2016 at 12:10pm PDT
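As a hypothetical sketch of that last step, here is how emotion scores of the kind the Tone Analyzer produces might be mapped onto LED colors. The emotion labels and color assignments below are our own inventions, not the palette IBM and Marchesa actually used.

```python
# Hypothetical emotion -> RGB mapping for the dress's LEDs.
EMOTION_COLORS = {
    "joy": (255, 200, 0),
    "excitement": (255, 0, 120),
    "curiosity": (0, 120, 255),
}

def led_color(tones):
    """Light the dress with the color of the strongest detected emotion."""
    top = max(tones, key=lambda t: t["score"])
    return EMOTION_COLORS.get(top["tone_id"], (255, 255, 255))

# Fans' posts scoring highest on "excitement" turn the flowers pink.
print(led_color([{"tone_id": "joy", "score": 0.7},
                 {"tone_id": "excitement", "score": 0.9}]))
```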
Today, anyone can get started working with IBM Watson by heading to IBM Bluemix and signing up for a Watson Developer Cloud account. Back in February 2015, IBM boosted the Watson Developer Cloud with speech-to-text, image analysis, visual recognition, and the ability to analyze tradeoffs between different drug candidates. In July last year, Watson gained a new Tone Analyzer that can scan a piece of text and critique the tone of your writing. We’ve also interviewed IBM’s Jerome Pesenti on many of the latest Watson developments.
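If you want to try it yourself, a call to the Tone Analyzer from the Watson Developer Cloud looked roughly like the Python below as of mid-2016. Treat the endpoint URL, version string, and username/password authentication as assumptions; they reflect the Bluemix-era service and may well have changed since.

```python
# Minimal sketch of a 2016-era Tone Analyzer call; URL, version date, and
# basic-auth credentials are assumptions based on the Bluemix-era service.
import requests

URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"

def analyze_tone(text, username, password):
    resp = requests.post(
        URL,
        params={"version": "2016-05-19"},   # API version date (assumed)
        auth=(username, password),          # Bluemix service credentials
        json={"text": text},
    )
    resp.raise_for_status()
    return resp.json()   # document-level tone scores

# tones = analyze_tone("I love how this paragraph reads!", "user", "pass")
# print(tones["document_tone"])
```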