It was a pleasure to be interviewed by Wired's Cade Metz for a piece he was writing about AI at Google, specifically Google's Magenta project. Magenta is an attempt on Google's part to make the tools of machine learning accessible to people who make art and music. Metz was researching NSynth, a "neural audio synthesis" technology the Magenta folks were debuting this week at Moogfest in Durham, North Carolina (see: moogfest.com). AI is one of Metz's main beats, and he's soon to move from Wired to the New York Times, where he'll continue to report on the subject.
Part of the promise of NSynth is the ability to merge the sonic, timbral qualities of multiple existing instruments in the pursuit of previously unheard sounds. On the NSynth website it’s described as follows:
Learning directly from data, NSynth provides artists with intuitive control over timbre and dynamics and the ability to explore new sounds that would be difficult or impossible to produce with a hand-tuned synthesizer.
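At its core, as the Magenta team describes it, NSynth works by encoding notes from different instruments into learned numerical representations (embeddings) and blending those representations rather than the raw audio. The sketch below illustrates just that blending step. It is a hypothetical illustration, not NSynth's actual code: the embeddings here are stand-in vectors, and in the real system a neural decoder would turn the blended embedding back into sound.

```python
# Hypothetical sketch of NSynth-style timbre blending: mix two latent
# embeddings rather than mixing the audio signals themselves.
# The 16-dimensional vectors below are stand-ins, not real NSynth output.
import numpy as np

def interpolate_embeddings(emb_a, emb_b, mix=0.5):
    """Linearly blend two embeddings; mix=0.0 is all A, mix=1.0 is all B."""
    return (1.0 - mix) * emb_a + mix * emb_b

# Pretend embeddings for a flute note and an organ note.
rng = np.random.default_rng(0)
flute = rng.normal(size=16)
organ = rng.normal(size=16)

# A 50/50 hybrid sits midway between the two sources in latent space;
# a decoder network would then render this blend as audio.
hybrid = interpolate_embeddings(flute, organ, mix=0.5)
assert np.allclose(hybrid, (flute + organ) / 2)
```

The point of operating in the embedding space is that the result is a genuinely new timbre, not merely two recordings layered on top of each other.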
Here, for example, is what a flute and an organ combined might sound like:
Metz and I had a good chat about the promises and potentials of the technology. In brief, I think it's helpful to consider NSynth in the context of what conductors and composers have done for centuries: creating unidentifiable sounds by combining instrumentation. Someone from Google does contradict this point directly in Metz's article. I just think there's a grey area worth exploring when comparing and contrasting orchestral chimeras with algorithmic ones.
To experience NSynth check out the Sound Maker (at withgoogle.com), an online instrument that lets you play with the instrument-merging tools:
I also think it's especially exciting that Google is up to this sort of work, because where Google leads, others generally follow. Furthermore, whereas it's hard these days for a company to compete with Gmail, with Android (Apple's iOS aside), or with many of Google's other entrenched accomplishments, music provides a lot more opportunity for a new technology to distinguish itself, especially this early on in the realm of the AI arts.
You can read Metz’s full piece, which was published on May 15, at wired.com.