Getting smart about conducting research on smartphones


ESOMAR WORKSHOP 2011 Mobile Workshop
Gord McNeill and Monique Morden
Remixed for the IAB Research Council by Andrew Grenville

Mobile is on a rapid growth curve, despite recession

Gartner predicts 800 million smartphones will ship in 2014, with the biggest growth in BRICI markets.

WHAT is Mobile?

Types of Mobile Surveys
• SMS text message
• Mobile web
• Applications

App Versus Mobile Internet
1. Mobile App
   • Mobile apps are application software developed specifically for smartphones
   • People have to download the app
   • Can be device specific
2. Mobile Internet
   • Seamless to the respondent
   • Seamless to the administrator
   • An extension of the platform
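To make the "seamless" point concrete, here is a minimal illustrative sketch (not from the workshop material) of how a survey platform might detect a smartphone from the browser's User-Agent header and serve a mobile-optimized skin automatically; the pattern list and skin names are hypothetical.

```python
import re

# Hypothetical illustration only: pick a survey skin based on the User-Agent header.
# Real platforms use more robust device detection; the pattern and skin names are invented.
MOBILE_UA = re.compile(r"iPhone|iPad|iPod|Android|BlackBerry|Windows Phone", re.IGNORECASE)

def choose_survey_skin(user_agent: str) -> str:
    """Return which survey layout to serve to this respondent."""
    if MOBILE_UA.search(user_agent or ""):
        return "mobile_optimized"    # larger targets, less scrolling
    return "desktop_standard"

if __name__ == "__main__":
    ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 4_3 like Mac OS X) AppleWebKit/533.17.9"
    print(choose_survey_skin(ua))    # -> mobile_optimized
```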

Devices – Touch Screen (Android)

Devices – QWERTY Smartphones

Old School Devices = SMS

MOBILE: REACHING PEOPLE WHERE THEY ARE

Benefits of Using Mobile
1. BETTER EXPERIENCE
   • We know some panelists complete surveys via mobile despite the poor experience; now they will have a good mobile experience
2. INCREASED RESPONSE RATES
   • Encourage the more elusive segments to participate: travelers, younger demographics, the tech savvy
3. UNIQUE BENEFIT: IN-SITUATION RESEARCH
   • Reach them where they are, while they are experiencing the situation or event

5% of Canadians were already doing omnibus surveys on mobile, even though it was difficult. They are already doing it; it behooves us to help them.

Research on Research: Response to a Mobile Survey
We conducted two studies with smartphone owners. In one, we forced them to respond on their mobile; in the other, we gave them the choice of mobile or desktop.

GIVEN A CHOICE, 11% OF SMARTPHONE USERS COMPLETED THE SURVEY ON A MOBILE DEVICE
[Chart: 11% of respondents used their mobile phone; 89% used a desktop]

GIVEN A CHOICE: WHO ARE MOBILE RESPONDENTS?
• Women and those aged 18-34 are most likely to respond using their mobile device
[Chart: demographic breakdown of mobile vs. desktop respondents]

MOBILE respondents are more likely to complete surveys out of the house
Where the survey was completed (Mobile vs. Desktop): At Home 74% vs. 85%; At Work 11% vs. 12%; On the Road 9% vs. 1%; In a public place 4% vs. 1%; Other 3% vs. 1%

COMPLETES PER DAY
• Across all panels, MOBILE respondents answer more promptly than DESKTOP respondents
• MOBILE respondents averaged 88%* of their completes on the day of the full launch, compared to 57%* for DESKTOP
[Chart: Day 1 vs. Day 2+ completes for Mobile and Desktop across the ARF, SBA and SBUK panels]
* Average = ([ARF Day 1 / (Day 1 and onward)] + [SBA Day 1 / (Day 1 and onward)] + [SBUK Day 1 / (Day 1 and onward)]) / 3
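To make the starred footnote concrete, here is a minimal sketch of the averaging it describes, using invented per-panel counts rather than the study's underlying data.

```python
# Illustrative sketch of the starred average: for each panel,
# day-one share = Day 1 completes / (Day 1 + Day 2+ completes),
# then take the simple mean across the three panels. Counts are invented.
panels = {
    "ARF":  {"day1": 690, "day2_plus": 100},
    "SBA":  {"day1": 840, "day2_plus": 150},
    "SBUK": {"day1": 670, "day2_plus": 140},
}

def day_one_share(counts: dict) -> float:
    return counts["day1"] / (counts["day1"] + counts["day2_plus"])

average = sum(day_one_share(c) for c in panels.values()) / len(panels)
print(f"Average day-one share: {average:.0%}")
```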

GIVEN A CHOICE: MOBILE DEVICE USAGE
• Smartphone ownership differs between MOBILE respondents and DESKTOP respondents
• MOBILE respondents skew towards the iPhone over Android devices
Handset owned (Mobile vs. Desktop): Apple iPhone 63% vs. 44%; Android/Google phone 18% vs. 13%; HTC 9% vs. 12%; Samsung 5% vs. 9%; LG 1% vs. 4%

WHAT HAPPENS WHEN THE CHOICE OF METHOD OF COMPLETION IS FORCED?

Methodology:

All respondents were owners of an iPhone or Android handset. Those assigned to the mobile arms were told they must answer the survey using their handset. Those assigned to the desktop arm were told they must complete the survey on a desktop.

Response Rates
The response to the invite was basically the same across all arms, but the actual completion rate was much lower for the mobile arms.
Successfully completed the survey: Short 16%, Medium 15%, Long 16%, Desktop 39%
Responded to the invite but dropped out: Short 22%, Medium 23%, Long 23%, Desktop 2%

Where they fell by the wayside
Only 4 in 10 of those who responded to the invite to do the study on a mobile device actually completed it. Many tried to do it on a desktop, despite our instructions, and then gave up. Most of the others who were non-compliant simply gave up after being asked to go mobile.
[Chart: breakdown of those who responded to the invite (completed survey; dropped out when asked to go mobile; dropped out in the survey; tried to do the survey on the wrong device and gave up), by arm. Roughly 40-41% of each mobile arm completed, versus 96% of the desktop arm.]

Who responded on mobile
Those who persevered and completed the study on a mobile device turned out to be quite different from the smartphone owners who did the study on a desktop. The people who completed the study on a mobile device were more likely to be hardcore mobile phone users. For example, they:
• Were more likely to own an iPhone (65% vs. 49%)
• Were more likely to use their smartphone for music, GPS/maps, internet, and pictures or video at least once a week
• Spent less of their smartphone time actually talking on the phone (28% vs. 37%)
• Were more likely to be “extremely satisfied” with their smartphone (39% vs. 30%)

Using mobile is, for the most part, about reaching people where they are, and offering them a convenient option

Respondent quotes: “I was just checking my email, and I'm sitting on the toilet.” “I'll be honest, I had time to kill.”

MOBILE: FINDING PEOPLE
On-location research with QR codes and geolocation

QR Codes: Common Uses
• Shortcut to a URL or website
• Sharing information such as contact info
• Links to maps for easy directions
Requires a smartphone with a QR code reader application; free readers are available for all of the major platforms.

MR Use Cases for QR Codes
• In-store feedback via signage
• Purchase-intent questions when people scan for more information in store
• Customer satisfaction with a service encounter
• Anything in store, at an event or on the go that you can think of…
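As an illustrative sketch (not part of the workshop material), a QR code pointing at a feedback survey can be generated with the third-party Python qrcode package; the URL below is a placeholder.

```python
# Minimal sketch: generate a QR code that links to an in-store feedback survey.
# Requires the third-party package "qrcode" (pip install qrcode[pil]).
import qrcode

survey_url = "https://example.com/in-store-feedback"  # placeholder URL
img = qrcode.make(survey_url)       # build the QR code image
img.save("store_signage_qr.png")    # ready to print on in-store signage
```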

MR Use of Geolocation
• Send surveys to visitors of a store
• Requires an application approach: the respondent downloads the retailer's app
• The app sends coupons or surveys when the person is in range
• Ideal for exit surveys, onsite surveys, customer satisfaction, etc.
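As a hedged sketch of the "sends coupons or surveys when in range" idea above (not any vendor's actual implementation), an app could compare the respondent's reported coordinates against a store's location with a great-circle distance check; the coordinates and radius below are hypothetical.

```python
# Hypothetical "in range" check for triggering a survey or coupon push.
# Uses the haversine formula for the great-circle distance between two points.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

STORE_LAT, STORE_LON = 49.2827, -123.1207  # hypothetical store location
TRIGGER_RADIUS_M = 150                     # hypothetical geofence radius

def should_send_survey(user_lat, user_lon):
    return distance_m(user_lat, user_lon, STORE_LAT, STORE_LON) <= TRIGGER_RADIUS_M

print(should_send_survey(49.2830, -123.1210))  # True: inside the geofence
```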

Geolocation Challenges
• Requires people to download an app
• Need to know the population
• They need to agree to download the app
• They need to agree to terms and conditions, including that they will be sharing their location in real time
• Need a mechanism to push/send them surveys
(There are 500,000+ iPhone apps competing for attention.)

Geolocation Challenges (continued)
• GPS drains battery power
• Data charges for respondents
• Can be perceived as an invasion of privacy
• Accuracy varies, e.g. in two-storey malls or small storefronts


Let’s try a mobile survey. If you have an iPhone, iPad or Android OS phone, email me at [email protected] and Colin will reply and email you a link to a short (5 minute) survey. https://www.visioncriticalsurveys.com/rm.aspx?a=2169&t=1

DESIGN CONSIDERATIONS FOR MOBILE: Usability Considerations

What is Usability?
[Diagram: usability comprises learnability, errors, memorability, accessibility and efficiency]

Why Usability in Survey Design?
1. Improved data quality
2. Increased response rates
3. Engaged participants
4. Focus on responses
5. The UI becomes invisible

“Visual” Questions
• All questions have a “visual” component, of course!
• Standard HTML widgets:
  – Small targets
  – Variable and unpredictable visual appearance (rendered by the browser and operating system, changing over time and differing by respondent)

Target Size and Fitts’ Law
FITTS’ LAW: “The time required to rapidly move to a target area is a function of the distance to the target and the size of the target.”

Single Choice Questions: Target Sizes
Which target is easier to hit? [Image: two answer buttons of different sizes, Option A and Option B]
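The deck does not give a formula, but the commonly used Shannon formulation of Fitts’ law makes the point concrete; the coefficients below are made up for illustration.

```python
# Fitts' law, Shannon formulation: MT = a + b * log2(D / W + 1)
# a and b are empirically fitted constants; the values here are illustrative only.
from math import log2

A_SECONDS = 0.1   # hypothetical intercept
B_SECONDS = 0.15  # hypothetical slope

def movement_time(distance_px: float, width_px: float) -> float:
    """Predicted time (seconds) to hit a target of width W at distance D."""
    return A_SECONDS + B_SECONDS * log2(distance_px / width_px + 1)

# Small radio button vs. a large, full-width answer button at the same distance:
print(round(movement_time(distance_px=300, width_px=20), 2))   # ~0.70 s, small target
print(round(movement_time(distance_px=300, width_px=120), 2))  # ~0.37 s, large target
```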

QUESTION TYPES Optimized for Mobile

Question Types
All of your favorite question types, redesigned from the ground up:
• Single Choice
• Multi Choice
• Numeric (open end and sliders)
• Date
• Single Choice Grids: grids, card sorts and sliders
• Multi Choice Grids
• Allocation (grids and sliders)

Everything Else
• Open Ends
• Text Instructions
• Quotas
• Redirects
• Conditional logic

BE SURE TO…
• Rethink image use: both the size of images and the frequency of using them

Question Types Supported
• Single choice
• Multiple choice
• Date
• Open end
• Numeric (verbatim and slider)
• Allocation (slider and grids)
• Single choice grids (including slider and card sort)
• Multiple choice grids

If a question isn’t mobile ready…..

ENGAGEMENT AND VALIDITY OF MOBILE

“This survey was more enjoyable than most”
Agree strongly: Mobile 45%, Desktop 35%

“This survey was fun to complete”
Agree strongly: Mobile 50%, Desktop 38%

Mobile and Desktop respondents do not differ in their strong agreement that the survey was easy or hard to complete
“…was easy to complete” (agree strongly): Mobile 76%, Desktop 81%
“…was hard to complete” (agree strongly): Mobile 3%, Desktop 2%

The interface can influence how we respond, so a good question is: does a survey using this smartphone interface produce the same answers as the desktop study? We controlled for differences between the mobile and desktop groups in terms of age, gender and type of smartphone owned. We found that for single choice, multi-choice, numeric and rank order questions there was no difference in the pattern of answers beyond what you would expect to see by chance.

We did see an apparent difference with the multi-choice grid: overall, more words were associated with each country on the mobile survey. The differences appear greater than chance and will be the subject of subsequent research.
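One way such a comparison could be run (not necessarily the authors' analysis) is a chi-square test on the answer-by-device contingency table for each question; the counts below are invented.

```python
# Hypothetical example: compare the answer distribution for one single-choice
# question between mobile and desktop respondents with a chi-square test.
# Requires scipy (pip install scipy). Counts below are invented.
from scipy.stats import chi2_contingency

observed = [
    [120, 80, 50],   # mobile respondents choosing options A, B, C
    [115, 85, 55],   # desktop respondents choosing options A, B, C
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
# A large p-value suggests any difference in answer patterns is within chance.
```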

2011 ROR Survey Length and Response Rates

Research Objectives
The objectives of this study were to determine:
Length
• How long the study took to complete on mobile, compared to a computer
• How long a survey we could conduct on a mobile device before respondents began to dislike the experience and/or before data quality began to decline
Response Rates and Drop-Out Rates
• How responsive people were to a request to conduct the study on a mobile device

Research Methodology
• There were four arms to the study: three mobile arms of varying length (short, medium and long), and a fourth conducted as a desktop study with as many questions as the “long” mobile survey
• Each survey utilized text pages, single choice, multi-choice, numeric, rank order, open ended and date entry questions
• The sample, for all arms, was owners of iPhones and Android handsets from across ARF, SBUK and SBA

Total Completes by arm and panel:
Short (11 Q): n = 500 (ARF 175, SBA 196, SBUK 129)
Medium (16 Q): n = 501 (ARF 176, SBA 196, SBUK 128)
Long (21 Q): n = 501 (ARF 187, SBA 198, SBUK 116)
Desktop (21 Q): n = 503 (ARF 168, SBA 167, SBUK 168)

Survey Length
• The questions took, on average, about 50% longer to complete on the mobile device than on the desktop.
• Assuming roughly a minute was spent on the intro, it took an average of 21 seconds per question on mobile vs. 14 seconds on the desktop.
Average survey length: Short (11 Q) 4:43 minutes; Medium (16 Q) 5:52 minutes; Long (21 Q) 9:25 minutes; Desktop (21 Q) 6:03 minutes
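A quick back-of-the-envelope check of the per-question timing, using the reported average lengths above and the slide's assumption of roughly a one-minute intro; this is an illustration, not the authors' exact calculation.

```python
# Rough check of the per-question timing: subtract an assumed 60-second intro
# from each reported average length and divide by the number of questions.
INTRO_SECONDS = 60  # the slide assumes "roughly a minute" for the intro

arms = {
    "Short (11 Q, mobile)":  ("4:43", 11),
    "Medium (16 Q, mobile)": ("5:52", 16),
    "Long (21 Q, mobile)":   ("9:25", 21),
    "Desktop (21 Q)":        ("6:03", 21),
}

def to_seconds(mm_ss: str) -> int:
    minutes, seconds = mm_ss.split(":")
    return int(minutes) * 60 + int(seconds)

for arm, (length, n_questions) in arms.items():
    per_q = (to_seconds(length) - INTRO_SECONDS) / n_questions
    print(f"{arm}: ~{per_q:.0f} seconds per question")
# The mobile arms work out to roughly 18-24 seconds per question vs. about 14 on desktop.
```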

Survey Length and Drop-Outs
Length had no effect on the percentage of people who dropped out mid-survey, at least not up to the 9½-minute mark.
% who dropped out in the survey: Short 4%, Medium 5%, Long 4%, Desktop 0%

Length also had no effect on the percentage of people who did not answer a question.* The percentage was 1% for all basic questions (at the beginning or end) and consistent across all mobile arms.
* We did not force an answer to any question, so respondents were able to hit “next” without answering.

Positive Feedback on Experience
“This survey was easy to complete.” (% strongly agree): Short 80%, Medium 82%, Long 74%, Desktop 86%
“This survey was hard to complete.” (% strongly disagree): Short 85%, Medium 86%, Long 77%, Desktop 88%
“This survey was fun to complete.” (% strongly agree): Short 89%, Medium 90%, Long 90%, Desktop 91%
“This survey was more enjoyable than most.” (% strongly agree): Short 44%, Medium 40%, Long 42%, Desktop 43%

ROR on Survey Length
ROR rule of thumb: a mobile study will take about 50% longer to complete than a desktop study.
Longer mobile surveys are tolerable
• A longer mobile survey (~10+ minutes) may be more difficult for some, but it certainly does not increase drop-outs, harm data quality or cause widespread disgruntlement.
• Subsequent research on research showed smartphone users would happily go up to 15 minutes (equivalent to a 10-minute desktop survey) without problems with data quality or drop-out.

Incentives – We know mobile takes longer, but we don't recommend custom incentives for those doing mobile; they are logistically hard to manage and would also increase costs.


Let’s break into small groups, review your questionnaires, and think about what you might have to do differently if this were a mobile study.

BEST PRACTICES

Best Practices: Mobile Back-End
1. Profile your panel for smartphones
   • Brand and model of device
   • Business or personal phone
   • Interest/willingness to do surveys via phone
2. Be aware of devices/models supported (or not supported) by your software
3. Communicate the capability and option for doing mobile surveys

Best Practices: Mobile Survey Design
REAL ESTATE IS AT A PREMIUM
1. Determine whether the survey is mobile-only or mobile/desktop
2. Design shorter, more concise questions
   • Less text in the questions and answer categories
   • No long lists of answer categories
   • Reduce items or attributes in grids
3. Design based on supported question types
4. Design questions with larger targets
5. Minimize scrolling

Best Practices: Survey Length
MOBILE SURVEYS TAKE LONGER
1. Keep in mind that mobile surveys take about 50% longer than desktop surveys
2. Let respondents know the approximate length of the survey
3. Surveys longer than 15 minutes are not recommended
4. Individual incentive program: incent by length as usual
5. The longer the survey, the more likely respondents are to get interrupted

Best Practices: Mobile Skin
REAL ESTATE IS AT A PREMIUM
1. Blank mobile skin
2. Custom mobile skin (name only)
3. Custom skin with logo (use caution with the size and shape of the logo)

When thinking Mobile, think…
• App versus platform
• Devices supported
• Offering a choice of device
• Question types optimized for mobile
• Survey length
• QR codes
• Location data

QUESTIONS?