Why AirPlay Mirroring is the Biggest Thing to Happen to User Research in 2011

by Nate Bolt. Average Reading Time: about a minute.

The problem with any mobile device observation is that there's no way to see what's on-screen without an awkward camera rig or physically looking at the device's screen. That sucks, because there's usually a participant right there trying to use their phone or device. It also means they can't walk around and interact with the world, which is often important for applications being developed for, well, mobile devices.

The iPhone 4S and iOS 5 change all that with "AirPlay Mirroring." Like the iPad 2, the iPhone 4S will mirror the entire display onto a computer, not just silly video output of a movie or game, but instead of needing an HDMI cable, you'll be able to share the screen wirelessly. Once you get that image on a desktop computer, you can share that screen with anyone in the world. Presto: mobile device remote research. It's probably the single most common question I get around the world when I'm giving talks about remote usability and user testing.

And given how desktop computers are dying off faster than the word "winning," it's important we start getting better at functional observation of mobile device interaction. You'll need WiFi available for AirPlay Mirroring, but that's not hard to set up outside with a MiFi 4G unit from Sprint or something equivalent, and easy to do indoors, so a participant is free to use their phone or iOS 5 device however the heck they want. And that's critical to natural user research.

Update, Nov 6th: The real tricky part is setting up your Mac to accept the AirPlay signal! That way you can start a GoToMeeting and share that signal with the world. Thanks to Louis Beauregard's comment below, I thought I'd add this information. The video below describes it best, but you'll basically need to install this desktop app called, accurately, AirPlay.



Check out this detailed write-up of how AirPlay mirroring works. 


  • Louis Beauregard

    Interesting post. OK, my participant needs to mirror his iPad 2 to an Apple TV, which presumably is connected to a Mac as an input fed into some acquisition tool (Final Cut Pro)? Then, participants need to share their computer screen with me through some web conferencing tool. Obviously, think-aloud is critical because I cannot see how the participant is interacting with his fingers on the touch screen. Is this the vision, or am I missing something?

    I can see using the mirroring in a lab setting to get the signal from the iPad to the observation room, though you still won't see the physical interaction with the device, like you would with a device-mounted camera (e.g. Noldus). But I don't see how I could run a remote test with just any iPad 2 owner without the complicated setup.

  • Great question, Louis. Just updated the article to show how you could stream the iPad/iPhone signal from the field to anywhere.