Device Testing TestChat

Tonight’s TestChat about Device Testing, our second of 2018, got us all talking about what our test labs look like and how we decide which devices to test on. A brief summary of the answers is below.

Q1. How do you decide what devices to test on? How do you prioritise those devices?

@deborah.lee was full of chat on this one :slight_smile: “Try and get stats to give an idea on what the website/app users are using”
“If no stats available, find general stats for popular models/OS/browser combos”
“You can’t cover everything but aim to cover popular devices/OS/browser combos. If users report issues, ask them what environment they are on and try and reproduce with real or virtual device(s)”
“It also depends in which country/countries the app is used i.e. popular phone models vary according to the country”

@ranjani read my mind “For Android it is more challenging as there are n number of devices across the globe!”

Doug sounds like he’s had a fun time “Fragmentation is still an issue, so simplifying this to a “device” without considering OS version is shortsighted. As we’ve found out the hard way!”

Q2. How does your mindset differ based on the device you’re testing on? For example, for mobile testing you need to test for recoverability after signal loss. Do you have particular heuristics you use for each?

@deborah.lee was on fire for this one!

  • LONG FUN CUP
  • SFDPOT too - not specific to desktop/mobile
  • I SLICED UP FUN
  • Mustn’t forget testing without a stable wi-fi or ethernet connection for mobile apps. One of the biggest differences between desktop/mobile. That and orientation

I’ve seen some great orientation-specific bugs, have you?
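If you want an automated safety net for those, here’s a minimal sketch using Espresso and ActivityScenario (MainActivity and R.id.noteField are just placeholders for your own screen). recreate() walks the activity through the same destroy-and-rebuild lifecycle an orientation change triggers, so lost-state bugs tend to surface here.

```kotlin
import androidx.test.core.app.ActivityScenario
import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.action.ViewActions.typeText
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.matcher.ViewMatchers.withId
import androidx.test.espresso.matcher.ViewMatchers.withText
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class OrientationStateTest {

    @Test
    fun enteredTextSurvivesOrientationChange() {
        // MainActivity and R.id.noteField are placeholders for your own app.
        ActivityScenario.launch(MainActivity::class.java).use { scenario ->
            onView(withId(R.id.noteField)).perform(typeText("draft text"))

            // recreate() destroys and rebuilds the activity, the same lifecycle
            // path an orientation change takes, so unsaved-state bugs show up.
            scenario.recreate()

            onView(withId(R.id.noteField)).check(matches(withText("draft text")))
        }
    }
}
```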

Q3. What does your device testing lab look like? Do you use physical devices? Or Emulators, cloud, simulators?

@ranjani “For automation, I use Firebase labs - mostly emulators coz physical devices are expensive.” and “The only caveat is Xamarin test labs were more expensive than Firebase.” “I have also used AWS Device Farm in the past but ran into a lot of issues with tests failing on their device farm due to popups/device updates, etc” plus tonnes more awesome insight, check out the transcript!

For me: “I had a handful of real devices that I borrowed from anyone and everyone getting rid of their old phones. We had a good variety of phones & laptops in the office too. The rest was BrowserStack & Chrome DevTools”
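As a rough illustration of the Chrome DevTools side of that, here’s a small Selenium sketch in Kotlin (the device name and URL are just examples) that switches ChromeDriver into its built-in mobile emulation for a named device profile. Handy for quick viewport and user-agent checks before reaching for BrowserStack or real hardware.

```kotlin
import org.openqa.selenium.chrome.ChromeDriver
import org.openqa.selenium.chrome.ChromeOptions

fun main() {
    // "Pixel 2" must match a device name from Chrome DevTools' device list;
    // swap in whichever profile your analytics say your users actually have.
    val options = ChromeOptions()
    options.setExperimentalOption("mobileEmulation", mapOf("deviceName" to "Pixel 2"))

    val driver = ChromeDriver(options)
    try {
        driver.get("https://example.com")
        println(driver.title) // quick sanity check that the page loaded under emulation
    } finally {
        driver.quit()
    }
}
```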

Simon Gilmurray “Mix, mainly physical as it feels like it replicates reality better, but the Simulator on iOS and Genymotion. Used to use Xamarin Test Cloud, which was a really nice way to submit automated tests and run them simultaneously against many different devices and OSes.”

@ustanisic was the first to mention a crowdsourcing solution “mostly emulators and physical devices. We’re also using a crowd testing approach between testers from different projects in order to cover more ground and more real devices”

Q4. What has been your most interesting bug discovery while device testing?

Zena experienced “crashes at boundary value analysis!” and “there is an issue with captive Wi-Fi on the current iOS update: once you try to connect to it and lock the screen, you can still see the captive Wi-Fi login screen on the UI. I have raised it, let’s see when they can fix it!”

@ustanisic “crashes when switching from wifi to 3G. When I took the device outside :)”
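If you’d rather reproduce that kind of drop in an instrumented test than by walking out the door, here’s a rough sketch using UiAutomator’s executeShellCommand to kill wi-fi mid-flow. The flow and assertions are empty placeholders for your own app, and toggling wi-fi from the shell user works on emulators and many devices, but it isn’t guaranteed everywhere.

```kotlin
import androidx.test.ext.junit.runners.AndroidJUnit4
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.uiautomator.UiDevice
import org.junit.After
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class ConnectivityDropTest {

    private val device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation())

    @Test
    fun appRecoversWhenWifiDrops() {
        startFlowThatNeedsNetwork()

        // Kill wi-fi mid-flow to mimic losing signal.
        device.executeShellCommand("svc wifi disable")
        assertAppShowsOfflineStateGracefully()

        // Bring connectivity back and check the app recovers rather than crashes.
        device.executeShellCommand("svc wifi enable")
        assertAppRecovers()
    }

    @After
    fun restoreConnectivity() {
        device.executeShellCommand("svc wifi enable")
    }

    // Placeholders: replace with your app's real Espresso steps and checks,
    // e.g. onView(withId(...)).perform(click()) / .check(matches(isDisplayed())).
    private fun startFlowThatNeedsNetwork() {}
    private fun assertAppShowsOfflineStateGracefully() {}
    private fun assertAppRecovers() {}
}
```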

@deborah.lee “One issue was actually specific to the iOS version (I think iOS 8.0), where you couldn’t upload a photo in Safari. Took a while to work out (alongside dev) that it was not the website or the phone causing the issue!”

My own one: “the number picker on Android appeared as a nice numeric keyboard; on iOS it appears as a huge scrollable wheel. Try getting to the 30th of the month using that!”

Ian Emery “a couple of rendering bugs specific to an iPhone 7 were a touch weird. The product owner also found a bug this week by pressing to one side of a button, which caused an invisible option to pop up!”

If you missed the TestChat, why not continue the conversation here?

Something on the subject from me
