Sunday, October 30, 2011

Speech interfaces: UI revolution or intelligent evolution?

Speech interfaces have received a lot of attention recently, especially with the marketing blitz for Siri, the new speech interface for the iPhone.

After watching some of the TV commercials you might conclude that you can simply talk to your phone as if it were your friend, and it will figure out what you want. For example, in one scenario the actor asks the phone, “Do I need a raincoat?”, and the phone responds with weather information.

A colleague commented that if he wanted weather information he would just ask for it. As in “What is the weather going to be like in Seattle?” or “Is it going to rain in Seattle?”.

Without more conversational context, if a friend were to ask me, “Do I need a raincoat?”, I would probably respond, “I don’t know, do you?” — jokingly, of course.

Evo or revo?
Are we ready to converse
with our phones and cars?
Kidding aside, systems like Siri raise an important question: Are we about to see a paradigm shift in user interfaces?

Possibly. But I think it will be more of a UI evolution than a UI revolution. In other words, speech interfaces will play a bigger role in UI designs, but that doesn't mean you're about to start talking to your phone — or any other device — as if it’s your best friend.

Currently, speech interfaces are underutilized. The reasons for this aren't yet clear, though they seem to encompass both technical and user issues. Traditionally, speech recognition accuracy rates have been less than perfect. Poor user interface design (for instance, reprompting strategies) has contributed to the overall problem and to increased user frustration.

Also, people simply aren't used to speech interfaces. For example, many phones support voice-dialing, yet most people don't use this feature. And user interface designers seem reluctant to leverage speech interfaces, possibly because of the additional cost and complexity, lack of awareness, or some other reason.


"Relying heavily on speech can lead
to a suboptimal user experience..."

As a further complication, relying heavily on speech as an interface can lead to a suboptimal user experience. Speech interfaces pose some real challenges, including recognition accuracy rates, natural language understanding, error recovery dialogs, UI design, and testing. They aren't the flawless wonders that some marketers would lead you to believe.

Still, I believe there is a happy medium for leveraging speech as part of a multi-modal interface — one that uses speech where it makes sense. Some tasks are well suited to a speech interface, while others are not. For example, speech is an ideal way to provide input to an application when you can capitalize on information stored in the user’s head. But it’s much less successful when dealing with large lists of unfamiliar items.

Talkin' to your ride
Other factors, besides Apple, are driving the growing role of speech interfaces — particularly in automotive. Speech interfaces can, for example, help address the issue of driver distraction. They allow drivers to keep their “eyes on the road and hands on the wheel,” to quote an oft-used phrase.

So, will we see a paradigm shift towards speech interfaces? It's unlikely. I'm hoping, though, that we'll see a UI evolution that makes better use of them.

Think of it more as a paradigm nudge than a paradigm shift.


Recommended reading

Situation Awareness: a Holistic Approach to the Driver Distraction Problem
Wideband Speech Communications for Automotive: the Good, the Bad, and the Ugly

 

Thursday, October 27, 2011

Enabling the next generation of cool

IMHO, you can't capture the QNX presence in automotive without a nod to our experience in other markets. Take, for example, the extreme reliability required for the International Space Station and the Space Shuttle. This is the selfsame reliability that automakers rely on when building digital instrument clusters that cannot fail. The same goes for the impressive graphics on the BlackBerry PlayBook. As a result, Tier 1s and OEMs can now bring consumer-level functionality into the vehicle.

Multicore is another example. The automotive market is just starting to take note while QNX has been enabling multi-processing for more than 25 years.

So I figure that keeping our hand in other industries means we actually have more to offer than other vendors who specialize.

I tried to capture this in a short video. It had to be done overnight so it’s a bit of a throw-away but (of course) I'd like to think it works. :-)

Tuesday, October 25, 2011

QNX and Freescale talk future of in-car infotainment

Paul Leroux
QNX and Freescale enjoy one of the longest technology partnerships in the field of automotive infotainment. The roots of their relationship reach back to 1999, when QNX became a founding member of MobileGT, an industry alliance formed by Motorola (Freescale's parent company) to drive the development of infotainment systems.

If you've read any of my blog posts on the QNX concept car (see here, here, and here), you've seen an example of how mixing QNX and Freescale technologies can yield some very cool results.

So it's no surprise that when Jennifer Hesse of Embedded Computing Design wanted to publish an article on the challenges of in-car infotainment, she approached both companies. The resulting interview, which features Andy Gryc of QNX and Paul Sykes of Freescale, runs the gamut — from mobile-device integration and multicore processors to graphical user interfaces and upgradeable architectures. You can read it here.
 

Monday, October 24, 2011

When will I get apps in my car?

I read the other day that Samsung’s TV application store has surpassed 10 million app downloads. That got me thinking: When will the 10 millionth app download occur in the auto industry as a whole? (Let’s not even consider 10 million apps for a single automaker.)

There’s been much talk about the car as the fourth screen in a person’s connected life, behind the TV, computer, and smartphone. The car rates so high because of the large amount of time people spend in it. While driving to work, you may want to listen to your personal flavor of news, have critical email read aloud by a safe text-to-speech reader, or get up to speed on your daily schedule. When returning home, you likely want to unwind by tapping into your favorite online music service. Given the current norm of using apps to access online content (even if the apps are a thin disguise for a web browser), this raises the question — when can I get apps in my car?

Entune takes a hands-free
approach to accessing apps.
A few automotive examples exist today, such as GM MyLink, Ford Sync, and Toyota Entune. But app deployment to vehicles is still in its infancy. What conditions, then, must exist for apps to flourish in cars? A few stand out:

Cars need to be upgradeable to accept new applications — This is a no-brainer. However, recognizing that the lifespan of a car is 10+ years, it would seem that a thin client application strategy is appropriate.

Established rules and best practices to reduce driver distraction — These must be made available to, and understood by, the development community. Remember that people drive cars at high speeds and cannot fiddle with unintuitive, hard-to-manipulate controls. Apps that consumers can use while driving will become the most popular. Apps that can be used only when the car is stopped will hold little appeal.

A large, unfragmented platform to attract a development community — Developers are more willing to create apps for a platform when they don't have to create multiple variants. That's why Apple maintains a consistent development environment and Google/Android tries to prevent fragmentation. Problem is, fragmentation could occur almost overnight in the automotive industry — imagine 10 different automakers with 10 different brands, each wanting a branded experience. To combat this, a common set of technologies for connected automotive application development (think web technologies) is essential. Current efforts to bring applications into cars all rely on proprietary SDKs, ensuring fragmentation.

Other barriers undoubtedly exist, but these are the most obvious.

By the way, don’t ask me for my prediction of when the 10 millionth app will ship in auto. There’s lots of work to be done first.

 

Wednesday, October 19, 2011

Marking over 5 years of putting HTML in production cars

Think back to when you realized the Internet was reaching beyond the desktop. Or better yet, when you realized it would touch every facet of your life. If you haven’t had that second revelation yet, perhaps you should read my post about the Twittering toilet.

For me, the realization occurred 11 years ago, when I signed up with QNX Software Systems. QNX was already connecting devices to the web, using technology that was light years ahead of anything else on the market. For instance, in the late 90s, QNX engineers created the “QNX 1.44M Floppy,” a self-booting promotional diskette that showcased how the QNX OS could deliver a complete web experience in a tiny footprint. It was an enormous hit, with more than 1 million downloads.

Embedding the web,
dot com style:

The QNX-powered Audrey
Also ahead of its time was the concept of a tablet computer that provided full web access. When I started at QNX, I was responsible for tablets, thin clients, and set-top boxes. The most successful of these pioneering devices was the 3Com Audrey kitchen tablet. It could send and receive email, browse the web, and sync to portable devices — incredibly sophisticated for the year 2000.

At the time, Don Fotsch, one of Audrey’s creators, coined the term “Internet Snacking” to describe the device’s browsing environment. The dot com crash in 2001 cut Audrey’s life short, but QNX maintained its focus on enabling a rich Internet experience in embedded devices, particularly those within the car.

The point of these stories is simple: Embedding the web is part of the QNX DNA. At one point, we even had multiple browser engines in production vehicles, including the ACCESS NetFront engine, the QNX Voyager engine, and the Openwave WAP browser. In fact, we have had cars on the road with web technologies since model year 2006.

With that pedigree in enabling HTML in automotive, we continue to push the envelope. We already enable unlimited web access with full browsers in BMW and other vehicles, but HTML in automotive is changing from a pure browsing experience to a full user experience encompassing applications and HMIs. With HTML5, this experience extends even to speech recognition, AV entertainment, rich animations, and full application environments — Angry Birds anyone?

People now often talk about “App Snacking,” but in the next phase of HTML5 in the car, it will be “What’s for dinner?”!

 

Tuesday, October 18, 2011

BBDevCon — Apps on BlackBerry couldn't be better

Unfortunately I joined the BBDevCon live broadcast a little too late to capture some of the absolutely amazing TAT Cascades video. RIM announced that TAT will be fully supported as a new HMI framework on BBX (yes, the new name of the QNX-based OS for the PlayBook and phones is now official). The video was mesmerizing — a picture album with slightly folded pictures falling down in an array, shaded and lit, with tags flying in from the side. It looked stunning, and it was created with simple code that configured the TAT framework "list" class with some standard properties. And there was another very cool TAT demo that showed an email filter with an active touch mesh, letting you filter your email in a very visual way. Super cool looking.

HTML5 support is huge, too — RIM has had WebWorks and Torch for a while, but their importance continues to grow. HTML5 apps provide the way to unify older BB devices and any of the new BBX-based PlayBooks and phones. That's a beautiful tie-in to automotive, where we're building our next generation QNX CAR software using HTML5. The same apps running on desktops, phones, tablets, and cars? And on every mobile device, not just one flavor like iOS or Android? Sounds like the winning technology to me.

Finally, they talked about the success of App World. There were some really nice facts to contrast with the negative press RIM has received on "apps". First, some interesting comparisons: 1% of Apple developers made more than $1,000, but 13% of BlackBerry developers made more than $100,000. Whoa. And App World generates the second-highest revenue of any app store, ahead of Android Market. Also very interesting!

I can't do better than the presenters, so I'll finish up with some pics for the rest of the stats...

New release of QNX acoustic processing suite means less noise, less tuning for hands-free systems

Paul Leroux
This just in: QNX has released version 2.0 of its acoustic processing suite, a modular software library designed to maximize the quality and clarity of automotive hands-free systems.

The suite, used by 18 automakers on over 100 vehicle platforms, provides modules for both the receive side and the send side of hands-free calls. The modules include acoustic echo cancellation, noise reduction, wind blocking, dynamic parametric equalization, bandwidth extension, high frequency encoding, and many others. Together, they enable high-quality voice communication, even in a noisy automotive interior.

Highlights of version 2.0 include:

Enhanced noise reduction — Minimizes audio distortions and significantly improves call clarity. Can also reconstruct speech masked by low-frequency road and engine noise.

Automatic delay calculation and compensation — Eliminates almost all product tuning, enabling automakers to save significant deployment time and expense.

Off-axis noise rejection — Rejects sound that doesn't come from directly in front of a microphone, allowing dual-microphone solutions to home in on the person speaking for greater intelligibility.
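As an aside for the technically curious: acoustic echo cancellation, the first module listed above, is classically built around an adaptive filter that learns the loudspeaker-to-microphone echo path and subtracts the predicted echo from the microphone signal. Here is a minimal NLMS (normalized least mean squares) sketch. It is purely illustrative, not the QNX implementation; the tap count, step size, and toy echo path are all made up:

```python
import math

def nlms_echo_canceller(far_end, mic, taps=16, mu=0.5, eps=1e-8):
    """Estimate the loudspeaker-to-microphone echo path with an NLMS
    adaptive filter and subtract the predicted echo from the mic signal."""
    w = [0.0] * taps               # adaptive estimate of the echo path
    x_buf = [0.0] * taps           # most recent far-end samples
    send = []                      # echo-reduced signal sent to the far end
    for x, d in zip(far_end, mic):
        x_buf = [x] + x_buf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, x_buf))   # predicted echo
        e = d - y                                      # residual after cancellation
        norm = sum(xi * xi for xi in x_buf) + eps
        w = [wi + mu * e * xi / norm for wi, xi in zip(w, x_buf)]
        send.append(e)
    return send

# Toy demo: a 3-tap echo path, far-end "speech" made of two sinusoids,
# and a mic signal containing only the echo (no near-end talker)
echo_path = [0.5, -0.3, 0.2]
far = [math.sin(0.05 * n) + 0.5 * math.sin(0.31 * n) for n in range(3000)]
mic = [sum(h * far[n - i] for i, h in enumerate(echo_path) if n - i >= 0)
       for n in range(3000)]
clean = nlms_echo_canceller(far, mic)
```

A production hands-free chain also needs double-talk detection, residual echo suppression, and the noise-reduction stages described above; the sketch shows only the core adaptation loop.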

To read the press release, click here. To learn more about the acoustic processing suite, visit the QNX website.


The QNX Aviage Acoustic Processing Suite can run on the general purpose processor,
saving the cost of a DSP.


 

Sunday, October 16, 2011

Wanted: Haunted Vehicles

Halloween is just around the corner, and that reminds me of the haunted room at Lucent Bell Labs. Mind you, it wasn’t really haunted. But for a moment, I was convinced.

Let me explain. As I entered the room, I could hear two of my colleagues talking to each other, and by the sound of their voices, they were both sitting right in front of me. But when I looked, I could see only one person. Creepy, to say the least.

It took a few seconds, but I finally realized what was happening: The other colleague was in a different room, talking over a perfectly tuned prototype of a conference phone. The sense of presence was so real that I couldn’t help but feel we were all in the same room — even after I became aware of the “trick” being played!

It was then that I realized it: We don’t know what we’re missing until we experience it.

Making it real
Current telephone calls don’t sound like face-to-face conversations because the telephone network and terminals band-limit speech from about 50-10000 Hz down to 300-3400 Hz. To make matters worse, the phone’s single channel of audio eliminates spatial information about the sound source. As a result, we perceive most sounds as coming from the same point in space.
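To make those numbers concrete, here is a toy Python sketch (my own illustration, not code from any real telephony stack) that applies an ideal brick-wall telephone band of 300-3400 Hz to a signal containing a 200 Hz component. The low component simply vanishes, which is part of why phone voices sound thin:

```python
import cmath
import math

FS = 8000          # sample rate (Hz)
N = 400            # 50 ms of audio; bin spacing = FS/N = 20 Hz

def dft(x):
    """Naive DFT; fine for a short illustrative signal."""
    n_pts = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / n_pts)
                for n in range(n_pts)) for k in range(n_pts)]

def idft(X):
    n_pts = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / n_pts)
                for k in range(n_pts)).real / n_pts for n in range(n_pts)]

def bandpass(x, lo_hz, hi_hz, fs):
    """Ideal brick-wall band-pass: zero every DFT bin outside [lo_hz, hi_hz]."""
    X = dft(x)
    n_pts = len(x)
    for k in range(n_pts):
        f = min(k, n_pts - k) * fs / n_pts   # bin frequency (mirrored half)
        if not (lo_hz <= f <= hi_hz):
            X[k] = 0
    return idft(X)

def tone_amplitude(sig, f_hz, fs):
    """Amplitude of the sinusoidal component at f_hz (correlation estimate)."""
    n_pts = len(sig)
    c = sum(sig[n] * cmath.exp(-2j * math.pi * f_hz * n / fs)
            for n in range(n_pts))
    return 2 * abs(c) / n_pts

# A toy "voice" signal: 200 Hz (below the telephone band) plus 1 kHz
x = [math.sin(2 * math.pi * 200 * n / FS) + math.sin(2 * math.pi * 1000 * n / FS)
     for n in range(N)]

telephone = bandpass(x, 300, 3400, FS)   # narrowband telephone channel
```

Real networks use speech codecs rather than ideal DFT filters, but the effect on out-of-band energy is the same: anything below 300 Hz or above 3400 Hz never reaches the listener.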

But here's the thing: The historical reasons for transmitting these single-channel narrowband speech signals no longer apply. Current technologies — such as wideband speech coders, spatial audio, and VoIP — are enabling speech communications with wider bandwidth speech and greater spatial information.

Many in the industry refer to these next-generation telecommunications systems as telepresence systems. “Telepresence” refers to the degree of realism created by a telecommunications system. Traditional systems have low telepresence while newer systems that use wider bandwidth speech and spatial audio have high telepresence.

Some people believe that a visual display is a must-have for a telepresence system. In reality, a display can decrease telepresence if its quality is poor. Experience shows that an audio-only system can have such high telepresence that people can't distinguish it from face-to-face communications — witness my haunting experience at Lucent Bell Labs.

Until recently, widespread deployment of telepresence systems has hit a roadblock: lack of standardization. Fortunately, the IETF CLUE Working Group and ITU-T Study Groups 16 and 12 are actively developing standards to remedy this situation.

Pimp my ride with telepresence
Telepresence systems have a lot to offer in an automotive environment. For instance, they could:

  • reduce driver distraction
  • make it easier to understand speech in the presence of vehicle noise
  • reduce the fatigue that comes from trying to understand a degraded voice signal

Moreover, a telepresence system makes the talker on the far end of the phone connection sound more like they are in the vehicle; it also makes the talker easier to identify.

Successful deployment of telepresence in an automotive environment depends on several factors:

  • attention to the design of vehicle platforms
  • use of high-performance acoustic processing algorithms (AEC, NR, etc.), such as those provided by the QNX acoustic processing suite
  • the ability to transport telepresence signals between telephony terminals — this is being enabled by increased VoIP availability (via LTE, for instance)

I don't know about you, but I'm looking forward to the day when my vehicle is haunted like that lab in New Jersey!

For additional reading on this topic, download the whitepaper, "Wideband Speech Communications for Automotive: the Good, the Bad, and the Ugly".

 

Tuesday, October 11, 2011

Harman infotainment systems gear up with QNX technology

Paul Leroux
If you've ever driven an Audi, BMW, Ferrari, Mercedes, or Porsche, chances are it used a sound system or infotainment unit from Harman International.

Mind you, Harman isn't just about the high end. They also offer a scalable infotainment platform that can target both higher-end and lower-end vehicles. And they aren't just about European cars, either. Earlier this year, they became the first non-Japanese supplier to provide an infotainment system (the QNX-based Entune system) to Toyota. They also supply systems to Hyundai, Lexus, Subaru, and Ssangyong.

Since 2003, Harman has used the QNX OS as the software platform for its infotainment products. (In fact, Harman owned QNX Software Systems for about 5 years, before QNX became a subsidiary of RIM.) In this video, Rick Kreifeldt, Harman's VP of global automotive research and innovation, discusses how QNX technology and expertise help Harman cut time-to-market and create greener products. Check it out:



A version of this post originally appeared on the On Q blog.
 

Friday, October 7, 2011

QNX Auto Summit Japan 2011

How many car guys does it take to change a light bulb?  Three normally, but only one if you've already lifted  the engine block out of the way!

There's an actual reason for this joke, which I'll explain in the epilogue. But let me shamelessly segue: there's a whole room full of car folks here in Nagoya, and they're working on something that needs a good deal more heavy lifting than changing that troublesome light bulb. They're building tomorrow's car systems. To help, QNX is hosting a full-day event here in beautiful Nagoya, covering the latest goings-on in the automotive space that's near and dear to our hearts--in-cab vehicle electronics.

Getting seated before the day begins
Our first presenter was Dr. Motoyuki Akamatsu, who broke the ice with a very entertaining video of a 1966 study on driver distraction. The driver wears a device that looks like a giant eyelid, which closes over his face at regular intervals, occluding his vision. The driver is also, coincidentally, the narrator, calmly describing the whole experiment as his view of the road is completely blocked every couple of seconds. All this while normal traffic flows around his car, totally unaware that this test driver could run into them at any moment. What they got away with in the sixties! The rest of the talk was equally informative; Akamatsu-san described how modern testing for driver distraction is done, and how mobile devices can affect it.

I gave a talk about picking the right HMI (or UX, if you prefer) framework for automotive infotainment. There's a ton of choice out there--HTML5, Adobe AIR, Qt, Android, MeeGo, EB GUIDE, OpenGL ES--I could go on and on. There are a lot of things to consider. Given that I didn't have an abundance of time and that it was all being dynamically translated into Japanese, I couldn't cover as much as I might have wanted. Look for a future blog post from me where I can give the topic a bit more space. (Mini-spoiler alert: I've listed my favorite first.)

The president of ARM Japan gave a talk about ARM use in vehicles. A short summary: ARM processors are on the rise everywhere in the car, and trending upwards. ARM partners shipped 6 billion CPUs last year, and ARM predicts 100 billion devices by 2020. Japan is probably the only area where they aren't dominating (yet). He then presented a short roadmap of the Cortex-A family--the A8, A9, A15, and A5--and also talked about the ARM Cortex-M series. I must apologize: my jet lag affected my attention span during the Cortex-M portion (that architecture is almost irrelevant in infotainment, if that's an excuse). He also covered ARM's Mali GPU, built on the Midgard architecture and supporting both OpenGL ES 2.0 and DirectX 9.

Alex Kinsella of RIM gave a talk about separation of personal and enterprise use of devices, in a way that gives simultaneously more freedom for the user and more options and security for the enterprise.  All very cool stuff for enabling more OEM options to the vehicle.

Our very own Andrew Poliak talked about the various connectivity options between cars and mobile devices--both where we are today (MirrorLink, iPod Out, Remote Skin/Bluetooth SPP+A2DP) and where we see them going (HTML5, HDMI/MDL, USB 3.0).
Andrew explains OEMs' evolving needs and timelines
Roger Lanctot of Strategy Analytics had a lot of interesting things to share about their in-depth research, covering current industry trends and some of Roger's predictions.
  • Global smartphones are right now 38% of the mobile market, with all signs of growth.
  • You need good traffic info that's predictive. If it isn't, it's worthless. Navigation and traffic services are still the biggest customer desire for an in-car system.
  • Solve distraction problems before they are regulated out by governments.
  • Apps in the car will be very important. Their research shows that over 55% of buyers (worldwide average) want them. OEMs, take note--this will sell cars, but there's no profit in it for the OEMs themselves.
  • Many different connectivity options exist. Nobody has the HMI nailed yet, so there's a big opportunity to get it right.
  • HMI solutions are converging on mobile device communications.
Roger lists every mobile connectivity solution known to man
Probably the most emphasized facts of Roger's presentation were about China.
  • China is important. If you don't play there, you're consigning yourself to irrelevance. It's the fastest-growing automotive market, with a rich aftermarket space.
  • China infotainment solutions are like the wild west right now, and include some crazy displays that Roger showed us with dozens of touch buttons.  They've even got systems that have the ability to create and edit docs while driving! Microsoft Word at 70mph, here I come!  I can't wait for the next IDE to have in-vehicle recognition so I can program while driving.  
  • China has exceedingly complex HMIs with apparent disregard for any regulations that might exist. 
  • China is not the safest place. They've got tons of new drivers, new infrastructure, and growth rates that exceed their experience. The World Health Organization estimates 200,000 vehicle fatalities a year (significantly higher than China's officially reported numbers). That's around 18% of the world's total vehicular fatalities. Wow.

Finally, our VP of sales and marketing, Derek Kuhn, wrapped up with a description of where QNX is headed with its future automotive platforms and QNX CAR 2. In a word? Awesomeness. (Coincidentally, this is Derek's favourite word.) In ten? Full support for almost everything car makers ever dreamed of.

We wrapped up the day with a cocktail hour for all our guests and some Formula 1 race day ticket give-aways to some lucky attendees.  

Guests having caught their taxis or trains, and the show nicely wrapped, the QNX staff gave secret surprise birthday wishes to our Alison Canavan, the world's best event coordinator, and to the world's most thoughtful person and our Auto Summit Japan emcee, Kosuke Abe.

All in all, a very busy and successful day.  I'm pooped.  And that's a wrap.

You've probably forgotten about the car guy lightbulb joke already, but I'll finish explaining it anyway. My girlfriend had one of her car's headlights burn out, and she asked me if I could fix it. My male chivalry and handyman pride made me jump at the opportunity to help! I naively went out to the car with a screwdriver, expecting to maybe loosen the screws around the light enclosure, pop out the bulb, put in the new one, and dust off my hands for a well deserved beer in one minute flat. It became immediately obvious that Honda had something much more nefarious in mind when they built the Civic. No screws.  You had to remove the bulb from the back, but there wasn't any obvious way for a human hand (well, no adult human, anyway) to fit in the allotted space.  I went back into the house, grabbed a handful of tools this time, and spent the next 20 minutes trying to figure out what parts of the car needed to be disassembled to get at the light bulb. This wasn't immediately fruitful either, so I went back in the house, consulted the Internet, and lo and behold--I wasn't just an idiot.  I found many other posts from many other delighted Civic owners.  It looked like the most popular solution was to remove the battery, battery cage and power steering pump mounts, lift the power steering out of the way, and then you could get at the bulb.  More of a challenge than I was really looking for, I'm afraid, so I went back to my girlfriend, tail between my legs, and shamefully recommended that she take it to the dealer.

I couldn't help smiling at her retelling of the dealer visit.  The first mechanic came out, all confident with a line something like "well, many guys don't really know how to do car stuff, so we'll take care of it."  Then he spent about 15 minutes digging around, trying to discover how on earth you get the stupid bulb out.  He finally had to call over his boss to get assistance.  They did end up replacing the bulb, but it was a little more complicated than he expected too! Tally it up--me and two mechanics--three guys to replace a light bulb. 

Good thing that building automotive software is so much easier.

Thursday, October 6, 2011

QNX Auto Summit Japan 2011 – ‘Automotive at the speed of Mobility’

The QNX Auto Summit Japan 2011 will be starting soon, here in Nagoya, Japan.  We’re expecting a good crowd of Tier 1s and OEMs from across Japan.  QNX technology is being showcased in a BMW X5 and Audi A8 - as seen below - as well as in many of our partner demos.  The theme of the event is ‘Automotive at the speed of Mobility’!  Industry speakers will focus on how the worlds of automotive and mobility are converging at a break-neck speed, and what Tier 1s and OEMs will need to know to design for these next-generation systems.  Should be interesting!  Stay tuned for more.


Countdown to the Auto Summit Japan

I’m here at the QNX Auto Summit Japan 2011  in Nagoya, Japan and really energized about our event!  We’re being supported by some of our key partners including ARM, Redbend, Freescale, A&W, Elektrobit, Texas Instruments, Renesas and TCS.  

There’s presentations, amazing car demos based on QNX and serious networking planned!  Looking forward to meeting many of the Japanese Tier 1s and OEMs.  More to come once the event starts. 

Wednesday, October 5, 2011

Car Connectivity Consortium (CCC) MirrorLink meeting, Chicago, September 29, 2011

For those who aren't yet up to speed on "MirrorLink", it's the new name for what was previously called Terminal Mode. The name change is a definite improvement. I informally polled people to ask what they thought when they first heard "Terminal Mode". Basically, the answers fell into two camps: either a telnet replacement or a disease you really don't want your doctor telling you that you have. Neither sounds like a ringing endorsement! MirrorLink as a term makes sense. Good job, CCC.

Here are some of my observations and notes from the CCC show in Chicago last week.

  • MirrorLink will not be going away any time soon--there is enough industry momentum to keep it alive for a while.  Sounds like roughly 60% of the car makers and 60% of the mobile makers are behind it to some extent or another.
  • QNX is very bullish on HTML5 as a replacement for MirrorLink-like features, but it doesn’t look like HTML5 is part of the future MirrorLink strategy at all. Instead, they’re looking at HDMI or MDL—direct video from the mobile with a control channel. This is a generic replacement for iPod Out, and it's an approach that we've considered as well and will likely support, so this is a good alignment, at least in direct video technology, even though they don't see the wisdom of the HTML5 path yet (patience--they'll get there :-).
  • OEMs don’t seem to realize how badly this will impact their revenue chain, or they're taking the "cross your fingers" approach. Certainly many seemed focused solely on the value MirrorLink provides by enabling customers and building new markets. I think it's somewhat Pollyanna-ish not to admit that MirrorLink has the potential to completely decimate in-vehicle navigation uptake. If I can bring my phone in and get a halfway decent navigation experience on the built-in screen, who's going to spend $3000 on a nav-only solution?
  • MirrorLink isn’t so much focused on enabling third-party apps (although they did talk about it) as on mirroring custom-built phone apps into the car. Everything demoed in the demo room breakout was a customized app that provided an integrated experience. This is both bad and good. Bad because it definitely reduces the short-term promise of opening up a huge third-party ecosystem. Good because I think it's the only reasonable way to go--there's really no other way OEMs can justify the liability of phone apps in the car, unless they have some measure of control.
  • I still think there's a significant amount of work needed to address safety concerns around driver distraction. The MirrorLink specification, and the CCC's general communications, contain "driver-safe" messaging. However, my take is that the actual participants, especially on the mobile side, seem to discount their accountability when third-party apps are brought into the car, and nothing in the specification really makes it possible for an automotive outsider to build a car-safe app. I highly doubt this approach will fly. The application-level certification planned for a future MirrorLink 1.1 release seems almost a mandatory requirement before this issue can be put to bed.
Proposed app certification process
  • Interestingly, almost every car company I talked to had a different take on how MirrorLink will impact their strategy—some see it as a low-end only play, but others see it as a high-end play.  There's still a lot of confusion as to where it slots into product lines.  I didn’t talk to anyone there who isn’t going to do it at all (not surprising given that the show was completely MirrorLink-focused), but some didn't seem to put a lot of weight behind it.  The perception I had was that some were doing it to "keep up with the Joneses."
  • I give the CCC credit for realizing that MirrorLink risks the same fragmentation whirlpool that has plagued Android releases and that makes Bluetooth interop the biggest nightmare for those who implement and test it. To that end, they're really trying to take this one head-on. It's still too early to know whether they will succeed, with the first MirrorLink 1.0.1 systems only now coming into production. (Alpine's aftermarket ICS-X8 earns the "first to market" distinction.) I hold out hope that the CCC can keep MirrorLink interop from becoming a quagmire, but this is a bugger of a problem to fix in an area that tries to tie slow-moving car tech to the fast-moving mobile space, so keep your eyes peeled...

Tuesday, October 4, 2011

What’s next for the connected car?

It’s been almost three years since QNX Software Systems launched its connected car concept, and I thought it would be an interesting exercise to look at what has been accomplished in the automotive industry around the connected car and how some of the concepts are evolving. When the QNX CAR Application Platform was introduced, we provided a simple way to look at a connected car, using four “dimensions” of connectivity:
  • Connected to portable consumer devices for brought-in media and handsfree communications
  • Connected to the cloud for obvious reasons
  • Connected within the car for sharing information and media between front and rear seats, between the center stack and the cluster, and for other similar functions
  • Connected around the car for providing feedback to the driver about the environment surrounding the car, be it pedestrians, other cars, or vehicle to infrastructure communications
We’ve seen significant advances and evolutionary thinking on all fronts. Although QNX is not (and cannot be) at the forefront of all of these, our primary emphasis has been on cloud and consumer device connectivity. Nonetheless, it is interesting to look at each area.

Connected to consumer devices, connected to the cloud
Why lump these two together? There is no clear line between the two, since consumer devices are often just extensions of the cloud. If my car connects to a smartphone that, in turn, draws information from the cloud, is there much point in distinguishing between consumer device and cloud connections? It made sense to differentiate between the two when phones provided only handsfree calling and simple music playback, but today the situation is quite different.

Device integration into the car has been a beehive of activity over the last few years. Smartphones, superphones, and tablets are providing entertainment, social networking, news, and access to a myriad of other content and applications to consumers anywhere, anytime. Automakers want to take advantage of many of these capabilities in a responsible, non-distracting way.

The primary issue here is how to marry the fast-paced consumer electronics world to the lifecycle of the car. At present, the available solutions sit at opposite ends of the spectrum: standardized Bluetooth interfaces that let the car control the smartphone, and screen replication technologies (iPod Out, VNC/Terminal Mode/MirrorLink) where the smartphone takes control and uses the car as a dumb display.

Neither of these scenarios takes full advantage of the combined processing power and resources of the car and the brought-in device. This, to me, is the next phase of car and cloud connectivity. How can the power of the cloud, brought-in devices, and the in-car systems be combined into a cooperative, distributed system that provides a better driver and passenger experience? (If the notion of a distributed system seems a bit of a stretch, consider the comments made by Audi Chairman Rupert Stadler at CES 2011.)

When looking for technologies that can bring the cloud, devices, and car together, you need look no further than the web itself. The software technologies (HTML5, JavaScript, AJAX, peer-to-peer protocols, tooling, etc.) that drive the web provide a common ground for building the future in-car experience. These technologies are open, low cost, widely known, and widely accessible. What are we waiting for?
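To make the idea concrete, here is a minimal sketch of how web technologies might combine cloud data with the car's own sensors in a head-unit widget. The endpoint URL, field names, and helper functions are illustrative assumptions, not a real QNX or automotive API:

```javascript
// Illustrative sketch only: a hypothetical in-car HTML5/JavaScript app
// that merges cloud data with locally sensed vehicle state.

// Pure function: decide what the head-unit widget should display,
// given cloud weather data and the car's own sensor readings.
function buildWidgetModel(cloudWeather, vehicleState) {
  const freezing = vehicleState.outsideTempC <= 0;
  return {
    headline: cloudWeather.summary,
    // Prefer the car's own temperature sensor over the cloud forecast
    tempC: vehicleState.outsideTempC,
    // Combine both sources into a single driver alert
    icyRoadWarning: freezing && /rain|snow/i.test(cloudWeather.summary),
  };
}

// In a browser-based head unit, the data would be fetched and rendered
// with standard web APIs, e.g. (render() is a hypothetical helper):
//   const weather = await (await fetch("https://example.com/weather")).json();
//   render(buildWidgetModel(weather, readVehicleSensors()));

const model = buildWidgetModel(
  { summary: "Light snow" },
  { outsideTempC: -3 }
);
console.log(model.icyRoadWarning); // true
```

The point is not the weather logic itself, but that plain JavaScript functions and standard web APIs are enough to fuse cloud and vehicle data, with no proprietary middleware in between.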

Connected within the car
The integrated cockpit has emerged as a prevalent automotive design concept. It is now commonplace in higher-end vehicles to see seamless integration between center stack functions and the instrument cluster and driver information displays. For example, turn-by-turn directions, radio tuning, and the currently playing song are all available to the driver on the cluster, reducing the need to constantly glance over at the main display. One such example is the Audi cluster:

Connected around the car
Three years ago, systems in this category were already emerging, so not much of a crystal ball was required here. Adaptive cruise control has become one of the most common features illustrating how a car can connect to its surroundings: it detects vehicles ahead and adjusts your speed accordingly. Other examples include pedestrian detection (offered in the Volvo S60 and other models), automatic parking, lane departure warning, and blind spot detection/warning systems.
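The core control idea behind adaptive cruise control can be sketched in a few lines. This is purely illustrative: real ADAS systems use radar/lidar sensor fusion and far more sophisticated controllers, and the gain and gap values below are made-up numbers, not figures from any production system:

```javascript
// Illustrative sketch: adaptive cruise control reduced to a simple
// proportional rule. Speeds in km/h, distances in metres.
function accTargetSpeed(setSpeed, leadCarSpeed, gapMeters, desiredGapMeters) {
  if (gapMeters >= desiredGapMeters) {
    // No vehicle close ahead: cruise at the driver's set speed
    return setSpeed;
  }
  // Too close: match the lead vehicle's speed, minus a proportional
  // correction to open the gap back up
  const gain = 0.5; // km/h of correction per metre of gap error
  const correction = gain * (desiredGapMeters - gapMeters);
  return Math.max(0, Math.min(setSpeed, leadCarSpeed - correction));
}

console.log(accTargetSpeed(120, 100, 60, 50)); // 120: gap is comfortable
console.log(accTargetSpeed(120, 100, 40, 50)); // 95: slow down to reopen the gap
```

Even this toy version shows why the feature matters for connectivity: the decision depends entirely on continuously sensed information about the world around the car.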

These Advanced Driver Assist Systems (ADAS) will become more common as cost reductions take place and the technology is provided in lower-end vehicles.

In contrast, vehicle-to-infrastructure communication, which requires industry-wide collaboration, is proceeding at the pace you'd expect of an internationally standardized effort.

Monday, October 3, 2011

Ready to Rock and Roll!

Paul Leroux
What single word best describes a perfectly executed piece of art? More often than not, I would choose the word seamless. Because when a painting, song, or story is done right, nothing feels out of place. Everything fits together so well that, almost magically, the whole becomes greater than the sum of the parts.

The same principle applies when you design a vehicle. Whether it's a dune buggy or an ultra-luxury sedan, your end product should evoke a consistent vision and emotional appeal. People don't want dune buggies with walnut dashboards, nor do they want Bentleys with rollbars. They want a vehicle, and a vehicle brand, that clearly embodies what they value most, be it styling, safety, reliability, or performance.

Well, that's the theory. But putting it into practice isn't easy. For instance, people now expect their cars to integrate with smartphones, tablets, and other mobile devices. By satisfying this demand, automakers have a real opportunity to win over consumers. But how does the automaker provide this integration without losing control over the user experience, and without sacrificing their brand identity to that of a third-party device? And lest I forget, without sacrificing safety?

Consumers, conditioned by smartphones and tablets, also expect in-dash user interfaces to be slick and intuitive. To create these interfaces, automotive developers are tapping into HTML5, Adobe AIR, OpenGL ES, and specialized toolkits. Problem is, none of these is a magic bullet. None addresses every problem. But what if you could take the best of each and seamlessly blend the results on an in-dash display, without breaking a sweat?

These are the kinds of problems that QNX is tackling. To create compelling products, our automotive customers must integrate numerous mobile and UI technologies, so we've created a platform to help make that integration as clean and simple as possible. I keep thinking of the 1950s, when some innovative musicians fused multiple styles — blues, boogie-woogie, country, gospel — into a thrilling new musical form. You've probably heard of it; it's called rock and roll. That's the kind of exciting synthesis we want to enable.

Simply put, we want to make our customers rock stars. Because when you create an automotive platform, you are judged by only one criterion: your customers' success. Little else matters. And speaking of stars, we've enlisted some remarkably knowledgeable and insightful people to contribute to this new blog. I can say that with confidence, because I have the privilege of working with many of them every day.

So buckle in, because it's going to be a great ride. Our goal is to keep you up to speed on what matters in automotive infotainment. Sure, we'll talk QNX now and then, but we're going to take the high road and cover everything from data plans to driver distraction. So make sure you don't miss a single post: subscribe to our RSS feed and follow us on Twitter at @QNX_Auto.