2x11: Alexa, play the Bad Voltage podcast

Stuart Langridge, Jono Bacon, and Jeremy Garcia present Bad Voltage, in which we command Alexa to play our podcast and it works, Jono turns off his phone so it doesn’t explode with helpful comments every time we say “OK Google” (you may wish to do the same), and:

  • [00:01:55] The news... Verizon spends $4.5 billion on Yahoo for reasons we don't understand... Pinboard buy Delicious, hilariously... Apple add podcast analytics with possible big implications for the advertising industry... and Apple also confirm that they're working on self-driving cars... while losing a bunch of value in their share price, along with everyone else in tech...
  • [00:18:35] Voice controlled "assistants", on phones ("OK Google", Siri, Cortana) or on in-home devices (Amazon's Echo, Google's Home, the upcoming Mycroft and Apple devices) are a whole new platform in its early stages. We've been testing out some of what's on the market, and experimenting with how they fit into our lives and with how one might use them in the future. Are they a revolutionary new thing like smartphones? Or a flash in the pan like smartwatches? The Bad Voltage verdict...
  • [00:51:15] You can win an Endless Mission One computer, which the Endless people have given to us to give away! Simply enter our competition: think up the best idea you can for a "skill" or plugin for voice-controlled assistants like Alexa or OK Google. You can aim to inspire us with a great idea, or make us laugh with a daft one. (And you don't have to _write_ it, just think of the idea.) Email your suggestions to [email protected] by Monday 26th June 2017 and we'll choose the best suggestion from somewhere in Europe and they'll win the Endless Mission One! (You have to be in Europe to win, here. Loose definition; if you think you're in Europe, you probably are. But this competition is only open to people in Europe, because the previous one was only US and Canada.)
  • and finally, if you want to hear Stuart answer a particular question, he's doing a live "mashup" show as part of FOSS Talk Live in London on June 24th 2017; Stuart, Joe Ressington, Dave Megins-Nichols, and Marius Quabeck will be answering your questions, so fill in a question you want their views on!

Download the show now!

But where does Europe end? What about Eurasia, or even Afro-Eurasia? It’s all connected (mostly). Still not me; I will continue to sit on my island.

I want a kind of flashcard-type thing, but I don’t want to have to write the cards myself. Something smart enough that I can share my notes with it and it will generate questions about the subject material provided.

Me: Alexa, give me a question on vectors.
Alexa/Google/whatever: Find the position vector that splits (1,2,3) and (5,2,4) in a 2:3 ratio.

I agree that there is no need for a ‘killer app.’ Content access and services are the ‘killer’.

Can I ask you to drop us an email with this suggestion? I don’t want to lose track of it…

Can I make a suggestion and enter my German friend to win the competition if my idea is picked?

Listening to this episode inspired me with an idea, and I sort of want to see if it can be made a reality, so if one of my best friends wins in my place I would be happy.

The assistants today are nothing more than toys. They feel unnatural and have lots of problems when it comes to understanding context, sloppy language, and various accents. Siri in my case has problems with my ts and ds, because I speak English as a second language, so she mixes up “thumb” and “dumb”. :slight_smile:

However, I could really see them in the future playing a huge role in the house especially when we start integrating IoT enabled devices with them. Imagine the following dialogue:

  • Hey Siri, I think I’m going to have fried eggs and a salad for breakfast tomorrow.
  • What do I need to get from the store?
  • Hey Norm. You need to get eggs and tomatoes for the salad.
  • Excellent Siri, remind me to do that 45 minutes before leaving work.
    … (45 minutes before leaving work)
  • Hey Norm, don’t forget to pass by the store and get eggs and tomatoes for breakfast tomorrow.
    … (on coming home)
  • Hey Norm, did you get the groceries for tomorrow’s breakfast?
  • Nah Siri, I forgot.
  • Okay Norm, should I remind you tomorrow?
  • Nah Siri, it’s fine.

Today this dialogue is kinda possible, but not really. And this is what I want: just a helpful voice around the house that could help me with old-fashioned household chores. What I don’t really need or want is an assistant that can answer the question “Who is the Queen of England?”

p.s. You also mixed up Tesla’s and Apple’s primary goals when it comes to cars. Tesla wants to build an electric car for the 1st world; Apple wants to expand its ecosystem to self-driving cars. My impression is that they’re going for augmented reality, where the AI is just a helpful hand rather than something that drives you everywhere while you read the news.

Tesla is not the biggest electric car manufacturer in the world; that’s BYD, a Chinese company that produces li-ion batteries and electric cars for the Chinese market. Whether Tesla succeeds as a mainstream car manufacturer is a different question. I don’t know. They’ve been pretty successful so far, but whether they will continue that success is a tough question to answer.

Yep! No problem with that.

Picture the scene, a remake of 2001: A Space Odyssey, as directed by Jono Bacon:

Dave Bowman: Open the pod bay doors, HAL.
HAL: I’m sorry, Dave. I’m afraid I can’t do that. You haven’t got your finger on the voice recognition fingerprint authenticator.

If the Amazon Echo shows us anything, it is how many years away we truly are from having actual AI. Don’t get me wrong, I like my Echo devices. They are very useful for controlling the brightness/colour of my lights and the temperature of the heating zones in my house, as well as streaming music (and podcasts such as this one) via Bluetooth. But they are also a pain in the ass for anything else, as can be seen by this tweet I made in January:

Alexa is still a glorified stack of if/else statements with a voice-to-text parser. In fact, with certain queries you can even tell which words in your sentence it completely ignores. I’ve got it to ignore the key words in my queries a few times (not recorded). I suspect other assistants are the same. I use Google’s to set timers and alarms or to run web searches that would take me too long to type on a phone’s virtual keyboard; it is good for those things, but I haven’t found a use for it beyond that yet. Google’s voice recognition is an order of magnitude better than Amazon’s, though.
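The “glorified stack of if/else statements” point can be made concrete with a toy sketch. This is a made-up illustration, not Alexa’s actual implementation: a keyword-driven intent matcher that simply ignores every word it doesn’t recognise, which is exactly why negations and qualifiers get dropped.

```python
# Toy keyword-based intent matcher, illustrating the "stack of if/else
# statements" complaint. Any word not on a keyword list is ignored, so
# "do not set a timer" behaves exactly like "set a timer".

def handle(utterance: str) -> str:
    words = set(utterance.lower().split())
    if "timer" in words:
        return "Setting a timer."
    elif "alarm" in words:
        return "Setting an alarm."
    elif {"play", "music"} & words:
        return "Playing music."
    else:
        return "Sorry, I didn't understand."

print(handle("please set a timer for ten minutes"))
print(handle("do not under any circumstances set a timer"))  # negation ignored!
```

Both queries trigger the timer intent, because the matcher only sees the keyword “timer” and throws the rest of the sentence away.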


I’m guessing that voice-controlled “assistants” in their current incarnation won’t see much market penetration (beyond tech geeks) and won’t reach a global market until they get better at being “assistants”. The call for them to be used to take appointments could be one of the great features that the next generation of these devices brings. Unfortunately, to reach this level, we either have to give away all of our personal information to the big tech companies (while hoping they won’t do evil stuff), or they could improve to the point of being completely local without relying on “the cloud” for their operation. I would definitely be more on board if the latter option turns out to be possible.

Also, the previous contest was not open to Canadian citizens. The “fruit challenge” was only open to US citizens.


@jonobacon you laughed at Cortana, but according to various figures there are 1.5 billion Windows 10 devices, all of which have Cortana on them… That’s a f**k ton of Cortanas out there!!!


Indeed, but I doubt that many are using Cortana actively. I bet it is a fraction of that number.

I think it’s more accurate to say many don’t know they’re using it… Cortana is now the “search provider” for the majority of Windows devices, which gives MS a pretty decent data set to analyse (if you don’t go to the lengths of switching telemetry and online/web searches off)

And looking at machine learning and bots and intelligent assistant tech, I would actually bet on MS instead of Google or Apple because they seem to have a much better grasp of what you’re actually asking for. Their natural language processing has always seemed a little more “intelligent” than either Siri or Google Now/Assistant. I haven’t used an Amazon device so I can’t comment on that front, but having used the others Cortana does feel a little more like you’re talking to a person (for better or worse).


That number sounds improbable, given Microsoft announced there were 400 million active devices in September, and that their target of 1bn devices by FY2018 was going to slip.

Ya. I think that number might be 0.5 billion, not 1.5 billion. (Source: http://www.computerworld.com/article/3195802/microsoft-windows/microsoft-now-claims-half-a-billion-windows-10-devices.amp.html from a month ago.)

Another interesting stat is that around 12% of Windows devices are running Windows 10. As @jonobacon says, how many are actively using Cortana? Compare that to the 86% of iOS devices that are running iOS 10 (as of a fortnight ago), and then wonder how many are actively using Siri.

We don’t have to guess about the number of Cortana users, as Microsoft announced there were “more than 141 million monthly users” at Build last month. The problem is, that’s a pretty disingenuous number as it includes any interaction with Cortana on any platform. How many people use it in the context of the segment in this episode? There are no public numbers available, but I’d wager it’s a pretty small fraction of that number. If there were more active monthly voice users than other platforms, I’d contend Microsoft would have made quite a bit of noise about that.

From https://www.windowscentral.com/cortana-now-has-over-141-million-users-every-month

On stage at the Build 2017 developer conference today, Microsoft announced 141 million monthly active users for Cortana. This number is slightly off from the 145 million number that was rumored to be sent to prominent Alexa developers in order to lure them over for the “Cortana Skills Kit” developer tools launch. But it’s an impressive figure all the same.

That 141 million monthly user number surpasses Amazon’s Alexa by a considerable margin, considering only 3 million Echo units have reportedly been sold as of last month. The big question here, which in an early briefing Microsoft was unable to answer, is how many of those monthly active users actually speak to Cortana. Microsoft’s figures are for any kind of Cortana use, which includes text, voice, and assistant notifications based on user input. That information may prove useful for developers looking to target Cortana Skills to the largest portion of their users’ activity type. For now, those details aren’t available.

I’d also be curious what the breakdown is between Windows, Xbox, Android, and iOS usage but I wasn’t able to find any numbers.



My apologies… The 1.5 billion is an estimate of the number of Windows devices in active use - serves me right for not reading the article properly…

It would be quite interesting to know what percentage of the total user base of a platform actually uses a voice assistant. Although it would need to be broken down a little further in the MS case, since they have Windows 10 on a wider variety of devices (phones, Xbox, PC/laptop/tablet/Raspberry Pi/HoloLens), compared to Siri, which is only available on iOS and macOS, or Alexa, which is slowly becoming available on more devices (currently TV sticks/tablets/speakers). It would also be interesting to know which voice assistants people use by platform (for example, Android users using Alexa or Cortana).

Just a random thought, but since all these assistants provide an API, I’m wondering if there’s an opportunity here to write an app that can combine data from Cortana, Siri, Google Assistant, and Alexa (and it will be called CAGS or something) and display it all in the one place. Maybe add something into the app that can obfuscate who’s sending the request, so you can still get some use out of the assistants without the privacy implications, although that probably takes away some of their more useful features that require access to your personal data (like appointments and reading your email and the like). You could even do something like “CAGS ask Alexa/Siri/Google/Cortana [something]” to use a specific one! That way you could still take advantage of the fact that Google knows your calendar, I suppose.

Similarly, I’ve found Siri utterly worthless because it isn’t set up for my manner of speaking. I don’t think through what I’m going to say word for word before opening my gob. I have an idea of what I want to express and let other parts of my brain sort out the details!

As a result I might say something like:

Siri, when was, uh, Pride and Prejudice published?

That pause and “uh” won’t be very long but it’s usually long enough for it to decide that the input was “Siri when does the new series”. More to the point though it doesn’t learn my speech patterns or have a way to register my frustration. If I keep asking similarly phrased questions (followed by grumpy noises) it should figure out that its speech recognition isn’t registering properly and fire up a different recognition routine. Perhaps ask for some context, for example.

  • Siri, what’s Anna’s address?
  • -> “What’s Anaz address”
  • I’m not sure. I believe you’re near (address).
  • What’s Anna’s address?
  • -> “What time is address”
  • I’m not sure. I believe you’re near (address).
  • What’s Anna’s address?
  • -> “What’s Anaz address”
  • Sorry, I don’t see ‘Anaz’ in your contacts.

Surely it should realise, due to the similarities and the repeated question, that the information it’s giving me isn’t what I want? From there it should run alternate possibilities for what I said and try a different interpretation. I have several Annas and Hannahs on my contact list, so I could understand there being confusion over which address I’m after, but surely there’s enough information there to ask me a follow-up question?

  • What’s Anna’s address?
  • Sorry about this. You’re asking after a contact, right?
  • Yes. Anna.
  • There are several people that could be; which one is it?

A similar principle as used in things like Akinator, I guess!
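The fallback being described here can be sketched with ordinary fuzzy string matching: when a transcribed name doesn’t match any contact exactly, look for near matches and ask a follow-up question instead of failing outright. The contacts below are made-up examples, and real assistants presumably do something far more sophisticated; this just shows the principle.

```python
# Sketch of the suggested fallback: instead of "Sorry, I don't see 'Anaz'",
# find contacts whose first names are close to what was heard and ask a
# clarifying question when several could match.
import difflib

CONTACTS = ["Anna Smith", "Hannah Jones", "Anna Lee", "Norm Field"]

def lookup(heard_name: str) -> str:
    first_names = {c.split()[0] for c in CONTACTS}
    # Near matches to the (possibly misrecognised) name.
    matches = difflib.get_close_matches(heard_name, first_names, n=3, cutoff=0.6)
    if not matches:
        return "Sorry, I don't see anyone like that in your contacts."
    candidates = [c for c in CONTACTS if c.split()[0] in matches]
    if len(candidates) == 1:
        return f"Looking up {candidates[0]}."
    return "That could be several people: " + ", ".join(candidates) + ". Which one?"

print(lookup("Anaz"))  # misheard "Anna" -> offers the close matches
```

With this contact list, “Anaz” is close enough to “Anna” that the assistant asks which Anna is meant, rather than reporting that ‘Anaz’ isn’t in the contacts.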


I’m curious about how we’ll walk the line between humanising these assistants to the point of supporting true natural language, and going too far, to the point where they express enough personality to cause issues for children growing up around them.

I’m thinking about this in terms of parasocial relationships.

I like to use the model from The Sims to describe this sort of thing, imagining myself to have a “social meter” that fills up by engaging in satisfying social activities and results in low mood if allowed to fall too low. Should a well built digital assistant/companion aim to push the social buttons that fill up the meter?


“Haw, Siri, gonnae gi’us the nearest shoap fir a carry oot?”
"…? Steve? Help me, Steve. My mind is going …"

Please respect our code of conduct which is simple: don't be a dick.