Kiwis Count 2.0 Technical Report
Version 1 (April 2015)

This technical report was written for the State Services Commission by Nielsen.

Contents:

1.0 Introduction
2.0 Background
3.0 Questionnaire
    3.1 Questionnaire development
    3.2 Questionnaire review process
    3.3 A modular approach
4.0 Sampling
    4.1 Sampling frame
    4.2 Target population
    4.3 Sampling process
    4.4 Sampling for first iteration of Kiwis Count 2012
5.0 Continuous Surveying and Fieldwork
    5.1 Initial contact – invitation letter
    5.2 Second contact – first reminder postcard
    5.3 Third contact – survey pack
    5.4 Final contact – second reminder postcard
    5.5 Exceptions – February and December mailout process
6.0 Data Entry / Editing and Cleaning
7.0 Response to Survey
    7.1 Response rate
    7.2 Mode of response
    7.3 Cut off dates
8.0 Sample Composition
9.0 Weighting
10.0 Data Analysis Explanation
    10.1 Overall Service Quality Scores Calculation
    10.2 2014 Changes to Kiwis Count calculations
    10.3 The Rolling Average Calculation
    10.4 The Statistical Significance Calculation
Appendix 1: Questionnaires
Appendix 2: Service Groupings
Appendix 3: Questionnaire Change Log
Appendix 4: Data Entry Protocols

The Kiwis Count survey and this report contain material from Citizens First, used under licence and reproduced with the permission of the Executive Director of the Institute for Citizen Centred Service. This work is licensed under the Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 New Zealand licence. [In essence, you are free to copy and distribute the work (including in other media and formats) for non-commercial purposes, as long as you attribute the work to the Crown, do not adapt the work and abide by the other licence terms.] To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/3.0/nz/. Attribution to the Crown should be in written form and not by reproduction of any such emblem, logo or Coat of Arms.


1.0 Introduction

The purpose of this document is to outline the technical details of the Kiwis Count survey, including methodology, sampling, weighting and data analysis. Kiwis Count is a continuous survey, which allows the approach to be improved from month to month. This document was updated after more than three full years of surveying. The examples in this report are based on recent data where appropriate and reflect the new reporting procedures implemented from June Year End 2014 reporting onwards. The State Services Commission (SSC) publishes survey findings in separate reports on a regular basis.

2.0 Background

The Kiwis Count survey is an all-of-government national survey measuring New Zealanders' experiences of public services. Kiwis Count uses the Canadian "Citizens First" survey methodology; Canada is among the world leaders in providing citizen-centred public services. Adapting international best practice for use in New Zealand allows useful comparisons to be made between New Zealand's public services and those of an acknowledged world leader. The Kiwis Count survey was first carried out by the State Services Commission in 2007 (www.ssc.govt.nz/kiwis-count-research-survey) and then repeated in 2009 (www.ssc.govt.nz/kiwiscount-2009). The purpose of the 2009 survey was to measure New Zealanders' experiences of public services, comparing against the 2007 baseline to measure progress and identify opportunities to further improve frontline service delivery. Both the 2007 and 2009 surveys were point-in-time measures. Each was conducted via a postal survey (with an option to complete online). While providing valuable information, the outputs were periodic data releases and written reports that provided static performance measurement. In 2012, Kiwis Count changed to become a continuous survey. This change allows the capture of trend information and provides regular updates, making the survey a business-as-usual management tool rather than an occasional one-off project reviewing performance. Every month, a number of New Zealanders are invited to participate in the survey. Data is provided quarterly to SSC via SPSS and Excel tables. Each quarter SSC publishes an update outlining results, in particular any significant movements and trends.


3.0 Questionnaire

The questionnaire (included in Appendix 1) reflects the most recent questionnaire (as at February 2015) and is based on the questions from the 2007 and 2009 surveys. (Copies of all past questionnaires are available at http://www.ssc.govt.nz/kiwis-count.) The core of the Kiwis Count survey asks New Zealanders about the services they have used in the past 12 months and the quality of those services. It asks respondents about their experience of 42 government services (a list of the services included in the survey can be found in Appendix 2). Respondents are also asked in more detail about their most recent experience, with specific questions asked depending on the type of service experience (telephone, online, or face-to-face). Perceptions of public services versus the private sector are also evaluated, with the quality of service received from private sector organisations (such as banks, insurance companies and internet service providers) measured for benchmarking purposes. The survey is also used by other government agencies as a cost-effective way of gathering other information. For example, the February 2015 survey includes questions about:

• the "pain points" of dealing with government services, for the Department of Internal Affairs' Result 10 team
• the privacy and security of personal information provided to the government, for the Office of the Government's Chief Privacy Officer.

A full list of the changes to the questionnaire is included in Appendix 3. The survey concludes with an interchangeable module (see section 3.3 A modular approach below) and demographic questions.

3.1 Questionnaire development

Prior to the establishment of the project, SSC determined which parts of the questionnaire to review based on its experience of the 2007 and 2009 surveys. Where possible, question wording remained consistent with previous Kiwis Count surveys to allow comparisons. However, the sequence of questions was amended to make navigation through the questions clearer and more logical for respondents. The questionnaire was divided into four sections as follows:

Section A: Experiences of public services (the key service quality section, virtually unchanged from previous measures)
Section B: Experiences of non-government services
Section C: Contact with public services
Section D: Demographic section

These changes were tested during the pre-testing process. Pre-testing was carried out during November 2011, involving 10 interviews with a cross-section of the population as follows:

• two males and eight females
• nine of European ethnicity and one of Pacific ethnicity
• two aged under 25 years, four aged 25 to 39 years and four aged over 40 years
• a mix of employment statuses, including employed, student, beneficiary and retired
• people who had recent experience of the services that were changed between 2009 and 2012 (see Table 3.1.1 below).

In particular, the pre-testing reviewed changes to the services listed, other question wording changes, and general questionnaire order and flow. It also reviewed reaction to, and the persuasiveness of, the communication materials. The details of, and reasons behind, the questionnaire changes to the list of services respondents are asked to consider at the start of the questionnaire are outlined in the table below. A full list of all questionnaire changes can be found in Appendix 3 of this report.

TABLE 3.1.1: CHANGES TO THE LIST OF SERVICES

Changes between 2009 and 2012

2009 wording: A drivers licence, registering a vehicle or changing ownership of a vehicle
2012 wording: Obtained, renewed, changed or replaced a driver licence; and (separately) Licensed or registered a vehicle
Reason for change: In 2009 this service had very high usage, but several aspects of a service were rolled into one rating. The service has been split into two, each now rated separately (following consultation with NZTA).

2009 wording: Sickness, domestic purposes or unemployment benefit
2012 wording: Receiving a benefit such as the sickness, domestic purposes or unemployment benefit
Reason for change: Clarification for respondents.

2009 wording: State or council rental accommodation
2012 wording: Living in a Housing New Zealand home
Reason for change: The question was more useful to Housing NZ if council rental accommodation was excluded.

2009 wording: (new service added in 2012)
2012 wording: ERO (Education Review Office) school or early childhood reports
Reason for change: ERO provided information showing many New Zealanders used these services, and provided feedback on the question wording.

2009 wording: (new service added in 2012)
2012 wording: Visited sorted.org.nz for information to help manage your personal finances or retirement income
Reason for change: Previous research by the SSC on the drivers of satisfaction with online services showed that many New Zealanders use these services.

2009 wording: A visa or permit to work in New Zealand
2012 wording: (removed in 2012)
Reason for change: Low usage in the 2009 survey.

2009 wording: Citizenship
2012 wording: (removed in 2012)
Reason for change: Low usage in the 2009 survey.

2009 wording: Funding assistance for a business or a business grant
2012 wording: (removed in 2012)
Reason for change: Low usage in the 2009 survey.

Changes between 2012 and 2014/2015

2012 wording: A kindergarten that your child attends or may attend in the future
2014/2015 wording: A kindergarten, day-care, crèche, preschool, home-based service, playcentre, Kōhanga Reo, Aoga Amata, Puna Reo or playgroup etc that your child attends or may attend in the future
Reason for change: Wider range of early childhood services added to reflect all available services (following consultation with the Ministry of Education).

2012 wording: Receiving a benefit such as the sickness, domestic purposes or unemployment benefit
2014/2015 wording: Receiving a benefit such as Jobseeker Support, Sole Parent Support or a Supported Living Payment
Reason for change: Updated to reflect the new benefit categories as at 15 July 2013 (following consultation with the Ministry of Social Development).

2012 wording: Visited sorted.org.nz for information to help manage your personal finances or retirement income
2014/2015 wording: Visited sorted.org.nz for information to help manage your personal finances or planning for retirement
Reason for change: Wording made more inclusive of all retirement options, not just those related to retirement income (following consultation with the Office of the Retirement Commissioner).

2012 wording: A university, polytechnic or wananga about a course you are attending or may attend in the future
2014/2015 wording: A university, polytechnic or wānanga about a course you are attending or may attend in the future
Reason for change: Macron added to the word wānanga.

3.2 Questionnaire review process

The full questionnaire is generally reviewed every six months to ensure it continues to meet its objectives. Feedback from respondents, received via cognitive testing and via comments made by those who have completed the survey, is also monitored to ensure the questionnaire remains easy to understand and complete.

3.3 A modular approach

While the majority of questions remain unchanged each quarter to continue the time series, some questions have been changed to gather new information, and one section of the questionnaire (Section C) takes a modular approach that enables completely new information gathering. This section can be changed as frequently as quarterly, dependent on information requirements and the printing of hard-copy materials. The table below outlines the major changes to the questionnaire and the modules employed so far. (Appendix 3 is a log of all changes to the questionnaire.)

TABLE 3.3.1: QUESTIONNAIRE CHANGES AND MODULE SCHEDULE

February - December 2012: Contact with public services
January 2013 onwards: Government and the digital environment (questions amended at January 2014 and then augmented at June 2014). The question at A4 was also amended at January 2014 to enable information on relative satisfaction with single or multi-channel use to be gained.
January 2014 - June 2014: Ministry of Education questions
June 2014 onwards: New question added at A10 to gather information on the "pain points" of dealing with public services
January 2015: New question added at A13 to gather information on the privacy of personal information; new questions added at A5 and A6 to gather information on perceptions of staff effort involved in providing service

The questions about contact with public services from the 2009 survey were retained from February to December 2012 to allow comparisons to be made.


4.0 Sampling

4.1 Sampling frame

The target population for this research is New Zealanders aged 18 years and over. The sampling frame for the Kiwis Count survey is the New Zealand Electoral Roll, as was the case for the 2007 and 2009 surveys. The Electoral Roll allows a random selection of named respondents, as well as more targeted oversampling of those of Māori ethnicity (identified as being of Māori descent in the Roll). It also allows more targeted sampling by age group, as it provides age data within five-year bands. To keep the sampling frame up to date with the most recent addresses, an updated version of the Electoral Roll is used every three months. When an update of the Electoral Roll is used, steps are put in place to ensure that people selected at random and invited to take part in the survey are not re-contacted more than once within a 12-month period. As there is no unique identifier on the Electoral Roll, a special process is undertaken: a search on combinations of names, addresses, occupations and meshblock is used to track respondents who have already been contacted. It is impossible to track all respondents with this search; however, a hit rate of 88% was achieved when identifying previously selected participants in February 2015 (the most recent Electoral Roll update).
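To illustrate the matching step, the sketch below shows one way such a screen could work. It is a minimal sketch only: the field names, the exact key combination and the matching rule are illustrative assumptions, not Nielsen's production process.

```python
def match_key(elector):
    """Build a comparison key from fields available on the Electoral Roll.

    There is no unique identifier on the Roll, so a combination of name,
    address, occupation and meshblock is used as a proxy (field names here
    are assumed for illustration).
    """
    return (
        elector["surname"].strip().lower(),
        elector["first_names"].strip().lower(),
        elector["address"].strip().lower(),
        elector["occupation"].strip().lower(),
        elector["meshblock"],
    )

def remove_recent_contacts(new_sample, contacted_last_12_months):
    """Drop anyone in a fresh draw who appears to have been invited in the
    past 12 months. Matching on these fields is imperfect; the report notes
    a hit rate of 88% for the February 2015 Roll update."""
    contacted_keys = {match_key(e) for e in contacted_last_12_months}
    return [e for e in new_sample if match_key(e) not in contacted_keys]
```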

4.2 Target population

The target population for the survey is based on population estimates from Statistics New Zealand. The table below shows the target population proportions for 2012 and for the latest 2013 Census data. The 2013 Census data has been used for processing from the July-September 2014 quarter onwards; prior to that, the 2012 estimates were used.


TABLE 4.2.1: POPULATION ESTIMATES

                        2013 Census     June 2012 Population
                        Population %    Estimates(1) %
Age
18-24 years                  13              14
25-34 years                  16              17
35-44 years                  18              18
45-54 years                  19              19
55-64 years                  15              15
65 years or older            19              18
Gender
Male                         48              48
Female                       52              52
Ethnicity
NZ European                  71              69
Māori                        11              11
Pacific                       5               5
Asian                        11               9
Other                         8              12
Region
Auckland                     33              34
Upper North Island           22              20
Lower North Island           24              22
South Island                 20              24

(1) Ethnicity population data is based on 2006 Census data.

Note that respondents are able to record multiple ethnicities, hence the ethnicity breakdown adds to more than 100%.

4.3 Sampling process

Over the course of a year a minimum of 2,000 surveys are completed (500 per quarter). The sample is selected on a monthly basis (see section 5.0 Continuous Surveying and Fieldwork). Key aspects of how the sample database is selected each month are as follows:

• The sample is drawn in iterations in the following order:
  1. A random sample of Māori electors (based upon the Māori descent indicator field in the full Electoral Roll). Māori electors are oversampled at a rate which is adjusted as required, based on response rates from this group in previous months, to ensure sufficient completed surveys are received.
  2. A random sample of youth (aged 18 to 24), selected using the age-band field in the Electoral Roll. Youth are also oversampled, at a rate that is adjusted as required.
  3. The remaining sample is a stratified sample (by area) of all non-Māori descent electors.
• To ensure the sample is geographically representative of the New Zealand population, sample is drawn from each territorial local authority (TLA) in proportion to its population, using the 2013 Census data from Statistics New Zealand. A simplified sketch of this monthly draw follows.
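The sketch below illustrates the three-iteration draw described above. It is a simplified sketch: field names, the deduplication mechanism and the rounding of strata sizes are illustrative assumptions, and the oversample sizes are inputs that would be tuned month to month.

```python
import random

def draw_monthly_sample(roll, tla_share, n_total, n_maori, n_youth):
    """Illustrative monthly sample draw.

    roll: list of elector dicts with 'row_id', 'maori_descent' (bool),
          'age_band' (e.g. '18-24') and 'tla' fields (names assumed).
    tla_share: dict TLA name -> share of the adult population
          (2013 Census), summing to 1.
    """
    # 1. Random sample of Maori electors (Maori descent indicator field).
    maori = [e for e in roll if e["maori_descent"]]
    sample = random.sample(maori, n_maori)
    drawn = {e["row_id"] for e in sample}

    # 2. Random sample of youth (18-24), via the Roll's age-band field.
    youth = [e for e in roll
             if e["age_band"] == "18-24" and e["row_id"] not in drawn]
    sample += random.sample(youth, n_youth)
    drawn |= {e["row_id"] for e in sample}

    # 3. Remainder: non-Maori descent electors, stratified by TLA in
    #    proportion to population. Rounding can leave the total off by a
    #    seat or two; the real process would true this up.
    n_rest = n_total - len(sample)
    for tla, share in tla_share.items():
        stratum = [e for e in roll if e["tla"] == tla
                   and not e["maori_descent"] and e["row_id"] not in drawn]
        sample += random.sample(stratum, round(n_rest * share))
    return sample
```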

The minimum number of surveys completed each year, for each target group, is outlined below.

TABLE 4.3.1: MINIMUM NUMBER OF COMPLETED SURVEYS

Target group    Incidence in population    Minimum completed surveys each year
Māori                   15%                        300
18-24                   13%                        260
All other                -                        1440
TOTAL                                             2000

The number of invitations sent per target group is flexible and is constantly adjusted based on the responses achieved in previous months.

4.4 Sampling for first iteration of Kiwis Count 2012

In February 2012, the first month of fieldwork, the number of survey invitations sent was double that of a typical month. This acted as a 'pilot' stage for the new methodology and also ensured the first release of data in August 2012 would include a full six-month sample of approximately 1,000 (February - June 2012). This February oversampling has been repeated each year since 2012 (in 2013, 2014 and 2015), given the typically lower response to surveys in general over the summer holiday period. Initially, oversampling was also undertaken to help ensure that the achieved sample proportionately represented Pacific peoples. However, targeted sampling of Pacific people from the Electoral Roll is inefficient compared with Māori or youth oversampling because, unlike those two groups, Pacific people are not identifiable from the details contained in the Roll. It required oversampling people living in the 1,000 meshblocks across New Zealand known to contain the highest proportions of Pacific people (so most of those oversampled were still of non-Pacific ethnicity). The rate of oversampling differed over the months it was used (February - June 2012). It was discontinued from July 2012, the decision being that the negative impact of oversampling Pacific meshblocks on the response rate outweighed the benefit of a slightly increased number of Pacific respondents.


5.0 Continuous Surveying and Fieldwork

Fieldwork is continuous: each month a new sample of 432 people is chosen from the Electoral Roll and sent invitations to complete the survey. The following diagram illustrates the timing of communications to potential respondents each month. (Note that invitations are not sent to international addresses.)

FIGURE 5.0.1 TIMING OF COMMUNICATIONS TO POTENTIAL RESPONDENTS

Respondents complete the survey in their own time, which means completed surveys are received on most days of the month. More details about each communication are provided below (sections 5.1 to 5.4).

5.1 Initial contact – invitation letter

An invitation letter, which contains a link to the online survey and provides an individual login ID and password, is sent to all those selected from the Electoral Roll to take part in the survey. Since the start of 2015 the letter has also included a QR (quick response) code, which enables a respondent to access the survey quickly using a mobile device such as a cell phone or tablet. The letter gives no indication that the respondent will be given the opportunity to complete the survey on paper at a later date. This approach is taken to maximise the proportion of respondents who complete the survey online, reducing the amount of data entry required as well as printing and postage costs. The letter directs respondents to an 0800 number if they have any questions about the survey. Those without an internet connection can contact Nielsen on the 0800 number and are advised at this stage that they will receive a paper copy of the survey by post in the near future. The invitation letter is sent on the first Thursday of every month to the full sample of potential respondents.


5.2 Second contact – first reminder postcard

The invitation letter is followed up with a reminder postcard, which also carries the survey link and the individual login ID and password. The reminder postcard is sent on the Monday following the initial invitation letter, again to the full sample of potential respondents. As with the letter, no reference is made to a paper copy being posted.

5.3 Third contact – survey pack

Approximately one week later, those who have not yet completed the survey online are sent a survey pack with a cover letter, a hard copy questionnaire, a reply-paid envelope and a Kiwis Count pen. The survey link and individual login ID and password are repeated in the letter should the respondent prefer to complete the survey online. The survey pack is sent on the Tuesday of the week following the reminder postcard (i.e. approximately one week later).

5.4 Final contact – second reminder postcard

Those who have still not replied approximately one week later receive a final reminder postcard, sent on the Wednesday of the week following the survey pack. Each month this communication cycle is repeated, although data is collated and reported quarterly, as illustrated below. (A sketch of the date arithmetic follows Figure 5.4.1.)

FIGURE 5.4.1 OUTLINE OF A TYPICAL QUARTER

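The timing rules in sections 5.1 to 5.4 can be expressed as simple date arithmetic. The sketch below is illustrative only, and covers a normal month; February and December follow the modified process described in section 5.5.

```python
from datetime import date, timedelta

def first_weekday(year, month, weekday):
    """Date of the first given weekday (Mon=0 .. Sun=6) in a month."""
    d = date(year, month, 1)
    return d + timedelta(days=(weekday - d.weekday()) % 7)

def mailout_schedule(year, month):
    """Contact dates for a normal month (see Figure 5.0.1)."""
    invite = first_weekday(year, month, 3)  # first Thursday of the month
    postcard = invite + timedelta(days=4)   # the following Monday
    pack = postcard + timedelta(days=8)     # Tuesday of the next week
    final = pack + timedelta(days=8)        # Wednesday of the week after
    return {"invitation letter": invite,
            "first reminder postcard": postcard,
            "survey pack": pack,
            "final reminder postcard": final}

# e.g. mailout_schedule(2015, 3)["invitation letter"] -> 2015-03-05 (Thursday)
```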


An 0800 number (staffed by Nielsen) is provided on each of the communication pieces. Respondents can call this number to ask questions about the survey and/or to request paper copies of the questionnaire.

5.5 Exceptions – February and December mailout process

To avoid the lower response rates associated with conducting fieldwork close to Christmas and the holiday period, a different process is followed for the February and December mailouts. No fieldwork takes place in January; instead, a double sample is used for the February mailout, i.e. invitation letters are sent to 864 people rather than 432. In December, the invitation letter is sent out one week after the November mailout, with subsequent correspondence following the normal dates shown in Figure 5.0.1 above. The February mailout process has been in place since 2012; the December mailout process was put in place in December 2013, following a low response rate in December 2012.


6.0 Data Entry / Editing and Cleaning

As completed questionnaires are returned to Nielsen's Wellington office, they are entered directly into Confirmit, the same software used for the online component of the survey. Using the same software removes the chance of error when combining data sources. The data enterer has different access to the survey tool from a survey respondent; for example, the data enterer has the ability to select 'no response' for any question where a hard copy respondent has not selected a response. The data entry protocols from 2009 have been adopted again for consistency purposes. The 2009 protocols and their adaptation to the 2012 survey and beyond are included in Appendix 4. As part of Nielsen's quality control processes, 10% of data-entered surveys are verified.


7.0 Response to Survey

7.1 Response rate

To calculate the response rate, every individual sent an invitation to complete the survey is tracked and the outcome of the invitation recorded. A call-log tracks which of the letters, postcards or questionnaire packs are returned as 'gone, no address', as well as any telephone notification of refusal to participate. This log also records notifications from third parties that the nominated respondent is not available or able to complete the survey because of age, language issues, health reasons, death or other disabilities. Every effort is made to remove any such respondent from subsequent samples. If a respondent is having difficulty completing the survey, they can call the 0800 number (staffed by Nielsen) and ask for assistance. Alternatively, the respondent can ask for someone to complete the survey on their behalf; however, the respondent must be present, and it must be their experiences and opinions that are recorded, not those of the person helping them complete the questionnaire.

The response rate is calculated as follows:

    Response rate = completed surveys / (total invitations mailed out - ineligibles) x 100

Ineligibles are those whose mail was returned 'gone, no address' (GNA) and those unable to participate due to age, language issues, health or other disabilities. The table below is an example of the response rate calculation, using data from the October-December 2014 quarter of interviewing.

TABLE 7.1.1 RESPONSE RATE CALCULATION EXAMPLE, OCT-DEC 2014

Total surveys mailed out (a): 2592
Ineligibles (b): 159
  - Gone no address: 105
  - Unable to participate (age, language, health / disability): 54
Completes (c): 1209
  - Online: 652
  - Hardcopy: 557
Incomplete eligibles (d): 1292
  - Refused (via 0800 number): 15
  - Did not hear back from: 1261
  - Survey not fully completed: 16
Response rate, c/(a-b): 50%
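The calculation in Table 7.1.1 can be reproduced directly. The snippet below is a worked version of the formula, together with the mode-of-response proportions discussed in section 7.2; the function name is illustrative.

```python
def response_rate(mailed, gone_no_address, unable, completes):
    """Response rate = completes / (mailed - ineligibles) x 100, where
    ineligibles are GNA returns plus those unable to participate."""
    ineligibles = gone_no_address + unable
    return 100 * completes / (mailed - ineligibles)

# Oct-Dec 2014 figures from Table 7.1.1:
rate = response_rate(mailed=2592, gone_no_address=105, unable=54,
                     completes=1209)
print(f"{rate:.0f}%")  # -> 50% (1209 / 2433 = 49.7%)

# Mode of response for the same period (see section 7.2):
online, hardcopy = 652, 557
print(f"online {online / (online + hardcopy):.0%}, "
      f"hardcopy {hardcopy / (online + hardcopy):.0%}")  # 54%, 46%
```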

The response rate for Kiwis Count is released quarterly in the quarterly updates and is based on two quarters of data. The State Services Commission has a strong focus on increasing this response rate over the life of the survey. The continuous approach to the survey means that changes can be made to the methodology each month and the impact of these changes on the response rate can be monitored.

Below is a table of some of the initiatives that have been trialled to evaluate their impact on the response rate.

TABLE 7.1.2 RESPONSE RATE IMPROVEMENT INITIATIVES

Apr 2012: Oversampling rate decreased; thank you line added to postcard
Jun 2012: End date included on cover letter; final postcard prompts for survey to be returned the following week
Jul 2012: Pacific sample group removed
Sep 2012: Change of password to survey code
Nov 2012: FAQ document introduced
Apr 2013: New pen included in survey pack (some old stock still used)
Feb 2014: New letter layout; new (longer) questionnaire; new pen included in survey pack (some old stock still used)
Mar 2014: Change to timing of survey materials
Feb 2015: QR code added and FAQ document updated

7.2 Mode of response

On average (based on response rate results since 2012), 56% of respondents choose to complete the survey online, with the remaining 44% returning hard copy questionnaires. For all completed surveys the method of completion (online or hardcopy) is captured in the survey tool, which allows the proportions of online and hardcopy completions to be calculated. Using the Oct-Dec 2014 response rate data as an example (see Table 7.1.1 above), the proportions for that period are calculated as follows:

    Online proportion = online completes (n=652) / completed surveys (n=1209) = 54%
    Hardcopy proportion = hardcopy completes (n=557) / completed surveys (n=1209) = 46%


The pattern of response is illustrated in the following chart, based on Q4 2014 data.

FIGURE 7.2.1 RESPONSES FOR OCT-DEC 2014

7.3 Cut off dates for inclusion of data

There is limited control over when a respondent will complete the survey. For example, a respondent sent an invitation in February (Q1) may, for a number of reasons, not complete the questionnaire until May (Q2). Because reporting is carried out quarterly, protocols were agreed for the cut off dates for inclusion in a particular quarter's data and for reporting of the response rate. The approach consistently applied is that a completed questionnaire is recognised in the month in which it was completed online or received in the post (not the month the invitation was sent out). In the example above, the survey would be included in the Q2 data for processing. For simplicity, the same protocol is applied to the reporting of the response rate: any survey received after the cut off date for a quarter is counted in the following quarter's response rate calculation.


The final dates for inclusion are shown in the table below.

TABLE 7.3.1 CUT OFF DATES FOR QUARTERLY RESPONSE RATE

Months questionnaire completed: July, August, September
Quarter: Quarter 1 (financial year reporting; referred to as Q3 for processing)
Cut off date for inclusion in quarter's data and response rate: 3rd Sunday of October

Months questionnaire completed: October, November, December
Quarter: Quarter 2 (financial year reporting; referred to as Q4 for processing)
Cut off date: 3rd Sunday of January

Months questionnaire completed: January (fieldwork conducted in February), February, March
Quarter: Quarter 3 (financial year reporting; referred to as Q1 for processing). Please note there is no fieldwork in January, but February has a double sample.
Cut off date: 3rd Sunday of April

Months questionnaire completed: April, May, June
Quarter: Quarter 4 (financial year reporting; referred to as Q2 for processing)
Cut off date: 3rd Sunday of July

Imposing a cut off date allows the results to be reported quickly; however, it does mean that late respondents are missed from the report. The first reported response rate is potentially slightly lower than a typical quarter because no late respondents have been carried over. As the survey progresses, the impact of respondents carried over from the previous quarter will be cancelled out by late respondents in the current quarter.
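The cut-off rule can be sketched as date logic. The roll-forward branch below is a simplified reading of the protocol described above, and the labels follow the financial-year reporting convention in Table 7.3.1.

```python
from datetime import date, timedelta

# Financial-year reporting quarter for each calendar month (Table 7.3.1).
QUARTER_OF_MONTH = {7: "Quarter 1", 8: "Quarter 1", 9: "Quarter 1",
                    10: "Quarter 2", 11: "Quarter 2", 12: "Quarter 2",
                    1: "Quarter 3", 2: "Quarter 3", 3: "Quarter 3",
                    4: "Quarter 4", 5: "Quarter 4", 6: "Quarter 4"}

def third_sunday(year, month):
    """Cut-off dates are the 3rd Sundays of October, January, April, July."""
    first = date(year, month, 1)
    first_sunday = first + timedelta(days=(6 - first.weekday()) % 7)
    return first_sunday + timedelta(days=14)

def reporting_quarter(completed, received):
    """A survey is recognised in the quarter of the month it was completed
    online or received in the post; if it arrives after that quarter's
    cut-off it rolls into the following quarter's data and response rate."""
    quarter = QUARTER_OF_MONTH[completed.month]
    cutoff_month = {"Quarter 1": 10, "Quarter 2": 1,
                    "Quarter 3": 4, "Quarter 4": 7}[quarter]
    cutoff_year = completed.year + (1 if cutoff_month < completed.month else 0)
    if received > third_sunday(cutoff_year, cutoff_month):
        return QUARTER_OF_MONTH[received.month]
    return quarter

# e.g. completed 28 Sep 2014, received 25 Oct 2014 (after the 19 Oct
# cut-off) -> rolls forward into Quarter 2.
```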


8.0 Sample Composition

As an example, the table below outlines the unweighted sample composition for the most recent quarter (October to December 2014) and for the first quarter of surveying (February to March 2012), alongside the profiles from the 2009 and 2007 surveys and the population benchmarks.

FIGURE 8.0 UNWEIGHTED SAMPLE COMPOSITION

Columns: (a) 2013 Census Population %; (b) Q4 2014 (Oct-Dec) profile, n=573, %; (c) June 2012 Population Estimates(1) %; (d) Q1 2012 (Feb-Mar) profile, n=537, %; (e) 2009 profile, n=3354, %; (f) 2007 profile, n=3653, %.

                       (a)   (b)   (c)   (d)   (e)   (f)
Age
18-24 years             13     8    14    15    17     8
25-34 years             16     9    17    12    15    12
35-44 years             18    14    18    18    20    20
45-54 years             19    22    19    20    19    21
55-64 years             15    20    15    17    15    17
65 years or older       19    27    18    18    14    21
Gender
Male                    48    45    48    42    45    42
Female                  52    56    52    58    55    58
Ethnicity
NZ European             71    77    69    68    73    76
Māori                   11     6    11    21    12     8
Pacific                  5     3     5     8     6     3
Asian                   11     9     9     8     7     8
Other                    8     5    12     6     7     8
Region
Auckland                33    30    34    38    26    25
Upper North Island      22    22    20    20    26    25
Lower North Island      24    24    22    21    23    23
South Island            20    23    24    21    25    27

(1) Ethnicity population data is based on 2006 Census data.

9.0 Weighting

To account for factors such as sample design and non-response bias, the data is weighted each quarter before reporting. The purpose of weighting is to adjust the sample to represent the overall New Zealand population. The variables used for weighting are:

• Age
• Gender
• Location (TA)
• Ethnicity

Weighting is based on the proportions of the adult (18+) population of New Zealand in the Statistics New Zealand 2013 Census results for age, gender, location and ethnicity. Iterative Proportional Fitting is used to adjust the cell-based Age, Gender and Area weights by the Ethnicity groups. This approach is broadly consistent with that used in the previous surveys. The composition of the weighting matrices is determined by the sample obtained for the age and ethnicity groupings (within each area) for each survey quarter. As an example, the weighting distribution for the October to December 2014 quarter is shown below.

FIGURE 9.0.1 WEIGHTING DISTRIBUTION FOR Q4 2014 (OCT-DEC)
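For readers unfamiliar with Iterative Proportional Fitting (raking), a minimal sketch follows. It assumes mutually exclusive categories on each weighting dimension; in practice the multiple-response ethnicity variable and the cell-based Age x Gender x Area design require additional handling, and the names here are illustrative rather than Nielsen's production code.

```python
def rake(weights, margins, members, n_iter=50):
    """Iterative Proportional Fitting: repeatedly scale the weights so the
    weighted share of each category matches its population target.

    weights: dict respondent_id -> weight (start from design weights)
    margins: dict dimension -> {category: target population share}
    members: dict (dimension, category) -> set of respondent ids
    """
    for _ in range(n_iter):
        for dim, targets in margins.items():
            total = sum(weights.values())
            for cat, share in targets.items():
                ids = members[(dim, cat)]
                current = sum(weights[r] for r in ids)
                if current > 0:
                    # Scale this category's weights towards its target share.
                    factor = share * total / current
                    for r in ids:
                        weights[r] *= factor
    return weights
```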


10.0 Data Analysis Explanation

The Kiwis Count surveys ask New Zealanders to rate services or express opinions using a scale from 1 to 5. This is consistent with Canada's Citizens First 4 survey, on which the Kiwis Count surveys are based. In the reports from Citizens First 4 and 5, responses on the five point scales are converted to a scale ranging from 0 to 100, and the average of these converted scores is referred to as the service quality score. This allows comparison with the results of previous Citizens First surveys. To enable comparisons between Kiwis Count and Citizens First, we have adopted the Canadian approach of converting the five point rating scales to service quality scores. The two scales correspond as follows:

Rating (1 = Very poor ... 5 = Very good):   1    2    3    4    5
Converted score:                            0   25   50   75  100

Where the survey measures satisfaction, a five point scale is used where one equals very dissatisfied and five equals very satisfied. In the report, the percentage satisfied equals ratings of four or five, the percentage neutral equals a rating of three, and the percentage dissatisfied equals ratings of one or two:

Rating (1 = Very dissatisfied ... 5 = Very satisfied):   1-2 Dissatisfied   3 Neutral   4-5 Satisfied

The same approach is taken where people are asked their level of agreement with statements on various attributes of public services, where one equals strongly disagree and five equals strongly agree. Total agree equals ratings of four or five, total neutral equals a rating of three, and total disagree equals ratings of one or two:

Rating (1 = Strongly disagree ... 5 = Strongly agree):   1-2 Disagree   3 Neutral   4-5 Agree

10.1 Overall Service Quality Scores Calculation

The Overall Service Quality Score is calculated by rescaling each rating on the five point scale (1, 2, 3, 4, 5) to the 0-100 scale (0, 25, 50, 75, 100), as shown above, and then averaging these scores across all the services used. The overall average uses all service experiences, so a respondent who has used ten services contributes ten observations to the overall score and a respondent who has used one service contributes one observation. The same approach is used for service groups (e.g. Health, Local Government) and individual services. The groupings used in 2012 are shown in Appendix 2. A worked sketch of this calculation follows.
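The sketch below works through the conversion and averaging steps. It is unweighted for clarity; in practice the quarterly weights described in section 9.0 also apply.

```python
def to_score(rating):
    """Rescale a 1-5 rating to the 0-100 scale: 1->0, 2->25, ..., 5->100."""
    return (rating - 1) * 25

def overall_sqs(experiences):
    """experiences: every service rating (1-5) from every respondent, so a
    respondent who rated ten services contributes ten observations."""
    scores = [to_score(r) for r in experiences]
    return sum(scores) / len(scores)

# e.g. ratings of 4, 5 and 3 -> scores of 75, 100 and 50 -> SQS of 75.0
print(overall_sqs([4, 5, 3]))
```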


10.2 2014 Changes to Kiwis Count calculations

An opportunity arose in mid-2014 to review the calculations and reporting conventions for the Kiwis Count survey. This resulted in a change to how the rolling averages are calculated (i.e. the averaging of results across six months) and to how statistically significant differences are reported. All results since 2012 have been recalculated in line with these changes, which has resulted in some minor changes to the values for some services (some additional services are now identified as having significant changes compared with the previous significance calculation). A full explanation of the 2014 Kiwis Count calculation changes, as detailed in the June 2014 annual report (http://www.ssc.govt.nz/sites/all/files/kiwiscount-june2014-annual-report.pdf), is set out below.

10.3 The Rolling Average Calculation

In order to ensure a sufficient sample size, Kiwis Count quarterly releases report on the past two quarters of data. For example, a September quarterly release reports on data collected from April to September, and compares these results to those collected between January and June. Using two quarters of data boosts the sample size to over 1,000 per quarterly release. In the past, calculations for all quarterly results were based on averaging the aggregate results from two quarters of data, i.e.

    Result(period n) = [Result(quarter n-1) + Result(quarter n)] / 2

For example, if the SQS score for a service in the June quarter was 70 and for the September quarter was 72, then the SQS score for the September rolling-average period would be 71. Going forward, calculations for all results are based on pooling two quarters of data into one six-monthly sample, i.e.

    Result(period n) = Result(quarter n-1 sample pooled with quarter n sample)

The two methods give the same result if the same number of people answered a particular question in both Quarter (n-1) and Quarter (n). However, if there are more responses in Quarter (n-1) than in Quarter (n), averaging the quarters under the old method meant that the responses in Quarter (n-1) were worth less with respect to the final result than the responses in Quarter (n). The new method therefore has the advantage that each survey response is worth the same with respect to the final result of the period. The new method also has the advantage of being less IT intensive to calculate.
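The difference between the two methods can be seen in a small sketch, using unweighted scores and illustrative numbers:

```python
def mean(xs):
    return sum(xs) / len(xs)

def old_rolling(prev_q, curr_q):
    """Old method: average the two quarters' aggregate results."""
    return (mean(prev_q) + mean(curr_q)) / 2

def new_rolling(prev_q, curr_q):
    """New method: pool both quarters into one six-month sample, so every
    response carries the same weight in the final result."""
    return mean(prev_q + curr_q)

# Equal quarter sizes: both methods agree.
print(old_rolling([70] * 500, [72] * 500))  # 71.0
print(new_rolling([70] * 500, [72] * 500))  # 71.0
# Unequal sizes: the pooled result leans towards the larger quarter.
print(old_rolling([70] * 600, [72] * 400))  # 71.0
print(new_rolling([70] * 600, [72] * 400))  # 70.8
```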

10.4 The Statistical Significance Calculation

In the past, calculations for statistical significance of changes in service quality scores were made:

• for quarterly results, using the rolling average samples, and
• for both annual and quarterly results, at the 90% significance level (to be consistent with the methodology used to measure significant change between 2007 and 2009 in the 2009 Kiwis Count report).


Quarterly Rolling Averages

This meant the statistical significance test for differences in quarterly results was being applied to non-independent samples. With quarters Q1 (Jan-Mar), Q2 (Apr-Jun), Q3 (Jul-Sep) and Q4 (Oct-Dec), the reporting periods are:

    R1 = (Q1 + Q2) / 2
    R2 = (Q2 + Q3) / 2
    R3 = (Q3 + Q4) / 2

If significance is calculated for R2 against R1, the Q2 sample appears in both periods. Significance testing for R2 against R1 therefore reduces to an independent t-test of Q3 against Q1, with Q2 excluded because it is constant across the two periods.

Including Q2 in both R1 and R2 had two opposing effects on the significance testing. First, it increased the sample size, making a change appear more significant. Second, it averaged out the difference between the quarterly results, making a change appear less significant. The second effect is larger, meaning that the previous method was effectively calculating significance at around the 97%-98% significance level. Going forward, for results from 2012 onward:

• The calculations for statistical significance of changes in quarterly results remove the common quarter sample from both rolling samples. In this example, calculating the significance of R2 against R1 means Q2 is removed from both rolling samples and, effectively, the significance of Q3 is calculated against Q1. This is the most common way statisticians deal with non-independent samples.
• Significance is calculated at the 95% level. Because the previous method (non-independent samples tested at 90%) produced results at around the 97%-98% significance level, the new method (calculating quarterly significance on independent samples at the 95% level) means more services are identified as having significant quarterly changes than previously published. A sketch of the test appears below.
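The sketch below illustrates the new test using Welch's independent two-sample t-test from scipy on the converted 0-100 scores. The unweighted inputs and the function name are illustrative simplifications of the production calculation.

```python
from scipy import stats

def quarterly_change_significant(q1_scores, q3_scores, alpha=0.05):
    """Test R2 = (Q2 + Q3 pooled) against R1 = (Q1 + Q2 pooled). Q2 is
    common to both rolling samples, so it is dropped and the test reduces
    to Q3 vs Q1 on independent samples, at the 95% level."""
    t_stat, p_value = stats.ttest_ind(q3_scores, q1_scores, equal_var=False)
    return p_value < alpha
```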


Appendix 1: Questionnaires

Full copies of previous questionnaires can be found at the links below:

• 2007: http://www.ssc.govt.nz/sites/all/files/kiwis-count-survey-questionnaire2007.PDF
• 2009 (includes Contact Methods (Channels) Module): http://www.ssc.govt.nz/sites/all/files/kiwis-count-survey-questionnaire-2009.PDF
• 2012 (includes second Contact Methods (Channels) Module): http://www.ssc.govt.nz/sites/all/files/kiwis-count-survey-questionnaire2012.PDF
• 2013 (includes Government and the digital environment Module): http://www.ssc.govt.nz/sites/all/files/kiwis-count-questionnaire-2013.PDF
• 2014 January to June (includes Education in New Zealand Module and Government and the digital environment Module): http://www.ssc.govt.nz/sites/all/files/kiwis-countquestionnaire-2014v1-jan-jun14.PDF
• 2014 July onwards (includes new questions in Section C: Government and the digital environment Module and new questions about experiencing public services at A10): http://www.ssc.govt.nz/sites/all/files/kiwis-count-questionnaire-2014v2-jul-dec14.PDF
• 2015 (includes new questions about the privacy of personal information and the inclusions described in the July 2014 onwards questionnaire): http://www.ssc.govt.nz/sites/all/files/kiwis-count-questionnaire-2015.PDF and as shown below (as at February 2015).

Appendix 1: January 2015 questionnaire version

[The full January 2015 questionnaire is reproduced as page images at this point in the original report.]


Appendix 2: Service Groupings

Passports and citizenship
• A passport
• Registering a birth, death, marriage or civil union

Education and training
• A state or state integrated (public) school that your child attends or may attend in the future
• A university, polytechnic or wānanga about a course you are attending or may attend in the future
• Employment or retraining opportunities
• Applying for or receiving a student loan or student allowance
• A kindergarten, day-care, crèche, preschool, home-based service, playcentre, Kōhanga Reo, Aoga Amata, Puna Reo or playgroup etc that your child attends or may attend in the future
• ERO (Education Review Office) school or early childhood reports

Health
• Received outpatient services from a public hospital (includes A & E)
• Stayed in a public hospital
• Obtaining family services or counselling
• Used an 0800 number for health information

Local government
• Visited a public library
• Your local council about rubbish or recycling (excluding the actual collection of rubbish and recycling from your household each week)
• Your local council about property rates
• Your local council about road maintenance
• Your local council about a building permit

Environment and recreation
• Visited a National Park
• A hunting or fishing licence
• National environmental issues or the Resource Management Act

Social assistance and housing
• The Community Services card
• Accident compensation for injuries
• Sickness, domestic purposes or unemployment benefit
• A housing subsidy or accommodation supplement
• A childcare subsidy
• Living in a Housing New Zealand home
• A rental property bond lodgement, refund or transfer
• New Zealand Superannuation

Border services
• The arrival process after landing at a New Zealand international airport from Australia
• The arrival process after landing at a New Zealand international airport from anywhere except Australia
• Importing goods into New Zealand or customs duties

Justice and security
• The Police (for a non-emergency situation)
• Paying fines or getting information about fines
• Emergency services, i.e. 111
• A court, about a case you were involved with

Motor vehicles
• Obtained, renewed, changed or replaced a driver licence
• Licensed or registered a vehicle

Taxation and business
• Enquired about tax, receiving tax credits (such as Working for Families), student loan repayments or KiwiSaver
• Contact with Statistics New Zealand for information or about taking part in a survey
• Importing goods into New Zealand or customs duties
• Registering a new company or filing an annual return for a registered company
• Visited sorted.org.nz for information to help manage your personal finances or retirement income
• Registered a business entity for tax purposes or filed a tax return


Appendix 3: Questionnaire Change Log

Changes between the 2009 and 2012 surveys

Questions A4-A10 (identification of the most recent contact method, ratings of specific aspects of this most recent contact, and overall satisfaction with the most recent contact) replaced the 2009 section 'Your most recent experience' (Q5-Q8). This section now asks people about their most recent experience. Specific aspects of service are rated depending on the channel used (telephone, online, or face-to-face); the aspects rated for each channel are those known to be the key drivers of satisfaction with that channel. These changes were made to ensure all questions asked were relevant and meaningful to respondents.

The 'drivers of trust' statements asked in 2009 were reduced to three specific trust questions:(1)
1. Trust based on most recent experience
2. Trust based on overall perception of the public sector (A11)
3. Perceptions of trust in the private sector
The perceptions of trust in the private sector were added to enable benchmarking against public service results.

Question D7 (previously Q18) was reworded to 'Do you have a long-term disability (lasting 6 months or more) that stops you from doing everyday things other people can do?', to match the question recommended by Statistics NZ and used in the Population Census.

(1) Please note that monitoring levels of trust and integrity is primarily carried out through the SSC's Integrity and Conduct survey.

Changes from 2012 onwards

Each entry below gives the year and period in survey (or date of action), the question numbers in the hardcopy questionnaire (with online question numbers in brackets), the question text, the action taken, and the reason for the change.

2012, Feb: A4 and C9 (Q4, Q23) (in the Q3 financial year 2011/2012 questionnaire)
Questions: "You selected [selected service] as the service you had used or had contact with most recently. Thinking about all the types of contact you have had with this service, which one method of contact was used most recently?" and "If a public service wants to get in touch with you with some routine personal information how would you prefer they do this (assuming they have all your current contact details)?"
Action: Changed to multi response (but respondents were instructed to only select one).
Reason: To allow these questions to be mapped to historic data, which had these questions as multi response.

2012, Jan-Dec: C1-C13 (Q15-Q27)
Questions (channels questions):
• Which of these methods did you use when looking for information about public services in the last 12 months?
• Overall, how satisfied are you with looking for information about public services online?
• Thinking about all of the methods that you've used when looking for information about public services, overall how satisfied are you with looking for information about public services?
• Which method do you prefer to use when looking for information about public services?
• Which of these methods did you use when carrying out transactions or dealings with public services in the last 12 months?
• Overall, how satisfied are you with your transactions or dealings with public services online?
• Thinking about all of the methods that you've used to transact or deal with public services, overall how satisfied are you with your transactions or dealings with public services?
• Which method do you prefer to use when carrying out transactions or dealings with public services?
• If a public service wants to get in touch with you with some routine personal information how would you prefer they do this (assuming they have all your current contact details)?
• Thinking about the cell phone services listed below, which of these have you already done using a cell phone or would you do if the option was available to you?
Action: Removed for Jan 2013.

2013, Feb: D1, D2 (Q28, Q29)
Questions: "Have you had dealings or contact with public services over the Internet, in the last 12 months?" and "Which of the following best describes why you did not use any public services over the Internet in the last 12 months?"
Action: Removed.
Reason: D1 and D2 were removed from the survey (they existed for the purpose of analysing the old module) and were replaced by the digital questions below.

2013, Feb: C1, C2, C2a (Q37, Q38, Q39)
Questions: "What would encourage you to use public services over the Internet (or to use these more often)?", "Thinking now about your day-to-day internet usage, have you used any of the following services in the last 12 months?", "Why haven't you used the internet at all in the last 12 months?", "How easy is it for you to complete your transactions with government in a digital environment?", "Have you done any of the following in the past 12 months?" and "Thinking about each of the following, how easy was it for you?"
Action: Added.
Reason: New questions focusing on the ease of dealing with Government in a digital environment (Result 10 questions).

2014, Feb: C1 (Q37)
Question: "How easy is it for you to complete your transactions with government in a digital environment?"
Action: Amended wording.
Was: "A digital environment includes any transactions done over the internet, on a smart phone, at a kiosk or using a Customs 'SmartGate'"
Changed to: "A digital environment includes any transactions done online, over the internet, on a smart phone, at a kiosk or using a Customs 'SmartGate'"

2014, Feb: A1, A2 (Q1, Q2)
Question: "In the last 12 months have you personally used or had contact with a public service organisation about any of the following?"
Action: Amended wording.
Was: "A kindergarten that your child attends or may attend in the future"
Changed to: "A kindergarten, day-care, crèche, preschool, home-based service, playcentre, Kōhanga Reo, Aoga Amata, Puna Reo or playgroup etc that your child attends or may attend in the future"
Was: "Visited sorted.org.nz for information to help manage your personal finances or retirement income"
Changed to: "Visited sorted.org.nz for information to help manage your personal finances or planning for retirement"

2014

Feb

A4a, A4b (Q40, Q40b)

Was: You selected as the service you had used or had contact with most recently. Thinking about all the types of contact you have had with this service, which one method of contact was used most recently? • • • • •

Visited an office or location Received a visit Sent a letter Received a letter Sent a fax 45

Amended wording and additional follow-up question added

• • • • • • •

Received a fax Sent an email Received an email Called on the telephone Received a call on the telephone Visited a website looking for information about public services Visited a website for transactions or dealings with public services (including: ordering something, applying for something, booking or paying for something online)

Changed to: You selected as the service you had used or had contact with most recently. Please select all the contact methods you had in the last 12 months that related to . • • • • • • • • • • • •

Visited an office or location Received a visit Sent a letter Received a letter Sent a fax Received a fax Sent an email Received an email Called on the telephone Received a call on the telephone Visited a website looking for information about public services Visited a website for transactions or dealings with public services (including: ordering something, applying for something, booking or paying for something online)

Changed to: Which one method of contact was used most recently? 2014

2014, Feb-Jun: D1-D4 (Q41-Q44)
Questions:
"How many children do you have, or are a primary carer for, in each of the following groups?"
• Preschool-aged children who attend an ECE or early learning service (e.g. kindergarten, day care, crèche, preschool, home based service, playcentre, Kōhanga Reo, Aoga Amata, Puna Reo, or playgroup)
• Primary school-aged children (school year 0-6)
• Intermediate school-aged children (school year 7-8)
• Secondary school-aged children (school year 9-13)
"You mentioned you have a pre-school aged child who attends an ECE or early learning service. Please rate your satisfaction with your child's ECE or early learning service in each of the following areas:"
• The ECE or early learning service overall
• How well your child's culture is supported
• How well your child's language is supported
• How well your child is included in activities run by the service
• The information you receive about your child's wellbeing (e.g. physical, emotional)
• The information you receive about your child's learning
• The information you receive about how you can support your child's learning at home
• The opening hours of the service
"You mentioned you have a primary school-aged child or intermediate school-aged child (school years 0-8). Please rate your satisfaction with your child's school in each of the following areas:"
• The school overall
• How well your child's culture is supported
• How well your child's language is supported
• The information you receive about your child's wellbeing (e.g. physical, emotional)
• The information you receive about your child's learning
• The information you receive about how you can support your child's learning at home
• The information you receive about your child's progress
• How well your child is included in school activities
• How approachable your child's teacher is
• How welcome staff make you feel at school
"You mentioned you have a secondary school-aged child (school years 9-13). Please rate your satisfaction with your child's school in each of the following areas:"
• The school overall
• How well your child's culture is supported
• How well your child's language is supported
• The information you receive about your child's wellbeing (e.g. physical, emotional)
• The information you receive about your child's learning
• The information you receive about your child's progress towards achieving NCEA (if applicable)
• The information you receive about how you can support your child's learning at home
• The advice you receive about how subject choices link to your child's future study or employment options
• How well your child is included in school activities
• How well the school is preparing your child for tertiary education
• How well the school is preparing your child for work
• How well the school is preparing your child for life when they leave school
• How welcome staff make you feel at school
Action: Added.
Reason: Ministry of Education initiative, which also included boosting the sample.

2014, Feb-Jun: E1-E7 (previously D1-D7) (Q30-Q36)
Questions:
• Gender
• In which one of the following age groups do you belong?
• What is your personal income from all sources, before tax or anything else taken out?
• What is your highest completed educational qualification?
• Do you have a long-term disability (lasting 6 months or more) that stops you from doing everyday things other people can do?
• Which ethnic group do you belong to?
• From time to time, the State Services Commission undertakes other research projects. Would you be willing for us to contact you in the future to see if you are interested in taking part in such research for the State Services?
Action: Moved from section D to section E for Feb-Jun 2014 (moved back to section D from Jul 2014).
Reason: To allow for the Education questions.

2014, Jul onwards: A10 (Q45)
Question: "Thinking now about all the times you have personally used or had contact with a public service in New Zealand in the last 12 months, which of the following happened to you?"
• A situation where you have had to provide the same information to several different government agencies
• A government agency making you provide too much information to prove who you are
• A government agency appearing not to have kept information you have provided them with in the past
• A situation where you wanted to complete a whole transaction with a government agency online, but being unable to do so
• Finding out that you had missed out on getting something from government because you did not know it was available
• A government agency not seeming to understand the effect their decisions and requests would have on you
• A government agency that seemed to be looking for a reason to turn down your request, rather than truly considering your request on merit
• A government agency failing to treat you with proper respect and empathy
• Finding out that you could have avoided being penalised or fined if you had better information about what you had to do
• Different government agencies or staff in a single government agency providing you with conflicting information
• A situation where you had to approach several different government agencies before finding the one who could actually deal with your enquiry
• A problem with one government agency which could have been solved if they had communicated with another government agency
• None of these have happened to me
Action: Added.
Reason: To gather information on the "pain points" of dealing with public services (see Table 3.3.1).

2014, Jul onwards: A11, A12 (previously A10, A11) (Q10, Q11)
Questions: "Overall, thinking about all the different kinds of public services provided by government, circle one number for each statement to indicate your level of agreement or disagreement with the statement."
• Public services have a more difficult task than the private sector
• I find the quality of service provided by public services to be higher than the private sector
• I expect public services to provide a higher level of service quality than the private sector
"Again it is your overall impressions we are interested in, from what you know or have heard from family, friends or the media. Overall, to what extent do you trust the public service?"
Action: Moved.
Reason: To allow for the new A10 question as specified above.

2014, Jul onwards: C2, C3 (Q46, Q47)
Questions: "Thinking about the last time you transacted with government in a digital environment, please indicate your level of agreement or disagreement related to this occasion."
• The process was smooth and efficient
• I had to contact too many different places to complete the transaction (by different places we mean methods (such as online as well as telephoning or visiting an office to speak to someone, sending mail or email) or different agencies / departments)
"The next time you need to do this transaction, would you do it online (as opposed to visiting an office or over the phone etc)?"
• Yes, I would do it online again
• No, I would prefer to do it a different way
• I won't be needing to do this transaction again
• Don't know / unsure
Action: Added.

2014, Jul onwards: C4, C4a (previously C2, C2a) (Q38, Q39)
Questions: "For each of the services listed below please select the statement that best describes whether you have used it in the past 12 months."
• Been through New Zealand Customs (online refers to a SmartGate)
• Made a booking with the Department of Conservation (such as a DoC hut)
• Renewed a passport
• Applied for an IRD number, filed an individual tax return or paid tax
• Applied for financial assistance from Studylink, Work and Income or Senior Services
• Paid a Police fine
• Paid for a vehicle licence
Respondents then rate how easy the service was to use.
Action: Moved.
Reason: To allow for the new C2 and C3 questions as specified above.

2015, Feb: A1, A2 (Q1, Q2)
Question: "In the last 12 months have you personally used or had contact with a public service organisation about any of the following?"
Action: Amended wording.
Was: "Receiving a benefit such as the sickness, domestic purposes or unemployment benefit"
Changed to: "Receiving a benefit such as Jobseeker Support, Sole Parent Support or a Supported Living Payment"
Reason: Updated to reflect new benefit categories.

2015, Feb: A5, A6 (Q5, Q6)
Question: "Thinking about your most recent service contact, indicate your level of agreement or disagreement related to using this service on this occasion."
• Staff went the extra mile to help you get what you needed
Action: Added (additional statement).

2015, Feb: A13 (Q48)
Question: "Thinking now about the privacy of your personal information when dealing with public services, for each statement indicate your level of agreement or disagreement. There is no right or wrong answer, it's just your opinions that we're interested in."
• I am satisfied that any personal information I provide to government agencies is properly protected
• Government agencies are doing a better job at keeping personal information safe than they were 12 months ago
Action: Added.
Reason: New questions about the privacy of personal information.

2015, Feb: D3 (Q32)
Question: "What is your total annual personal income from all sources, before tax or anything else is taken out?"
Action: Amended wording.
Reason: Reference frame added (i.e. annual).

2015, Feb: D4 (Q49)
Question: "Which best describes your household's total annual income before tax?"
• $0/none
• $1 - $5,000
• $5,001 - $10,000
• $10,001 - $15,000
• $15,001 - $20,000
• $20,001 - $25,000
• $25,001 - $30,000
• $30,001 - $35,000
• $35,001 - $40,000
• $40,001 - $50,000
• $50,001 - $70,000
• $70,001 - $100,000
• $100,001 - $150,000
• $150,001 - $200,000
• More than $200,000
Action: Added.
Reason: New question for household income.


Appendix 4: Data Entry Protocols

The following data entry protocols from 2009 have been adopted again for consistency purposes. The 2009 protocols and their adaptation to the 2012 survey are shown in the table below.

TABLE A4.1: 2009 DATA ENTRY PROTOCOL ADOPTION FOR THE 2012 SURVEY

2009 protocol: Blank responses (which were left blank)
Adaptation for 2012: No adaptation required

2009 protocol: Selection of multiple items where only one should be selected (all codes entered in the comment box and questions subsequently recoded to multiple response)
Adaptation for 2012: No adaptation required

2009 protocol: Providing a service rating for a service when the service itself had not been identified in Question 2 (service was recoded as having been selected)
Adaptation for 2012: No adaptation required; also applied to B1

2009 protocol: The description in Q5b clearly does not match the description of the Q2 item that was entered in Q5a, and it was not possible to identify a single service from Q2 (dealt with by coding Q5a as 43, "incorrectly answered", with any answers up to Q9a noted as skipped/crossed out)
Adaptation for 2012: Adopted for question A3; coded as 97

2009 protocol: No service had been selected at Q5a and more than one service had been selected and rated in Q2 (coded as 43, "incorrectly answered", at Q5a)
Adaptation for 2012: Adopted for question A3; coded as 97

2009 protocol: The service selected in Q5a had not been selected and rated in Q2 (coded as 43, "incorrectly answered", at Q5a)
Adaptation for 2012: Adopted for question A3; coded as 97

New protocols adopted in 2012:

• Any online or hard copy response is counted as a completed survey so long as it has been completed as far as question A2.
• If a respondent has answered the wrong set of questions at A5-A8 (based on their response to A4), their responses are transferred to the correct question where they align. Where there is no alignment they are coded as not answered.
• If a hard copy respondent selects more than one service at A3, their response is coded as not answered.
• If a hard copy respondent writes "superannuation" as a response to D5, their response is coded as not answered.
• If a qualification is not easily aligned to the existing codes, or is not legible, and the respondent cannot be contacted for clarification, the response is coded as not answered.

To enable these edits to be made, the data enterer has different access to the survey tool from a normal respondent. For example, the data enterer has the ability to select 'no response' for any question where a hard copy respondent has not selected a response.


New protocols adopted in 2014:

• If a hard copy respondent rates a service at C2 when the service itself has either not been selected, or has been selected as not having been done, the service answers stand.
• At A4, if something is selected in column B that has not been selected in column A, the answer at B is transferred to A as well.
• If a respondent has answered the wrong set of questions at D2-D4 (based on their response to D1), their responses are transferred to the correct question where they align. Where there is no alignment they are coded as not answered.

New protocols adopted in 2015:

• If total annual personal income is greater than household total annual income, then 'not answered' is selected at household total annual income.
