The professional clinical practice of disciplinary teaching in journalism and mass communications rests on deep understanding and respect for the complexity of learning. The Graduate Teaching Academy encourages the future professorate to master teaching and learning scholarship in all its complexities as a basis for service as effective, principled scholars. Graduate student submissions to GTA are encouraged and are reviewed by editors of Journalism & Mass Communication Educator for their contribution to the evidence-based understanding and practice of teaching scholarship.

Student-Collected Survey Data: An Examination of Data Quality and the Value of Survey Research as a Learning Tool

MICHAEL P. BOYLE AND MIKE SCHMIERBACH

Instructors who use students enrolled in survey methodology courses to collect data must ensure that the process is not unreasonably difficult for some students and that the data gathered are representative of the population. Two sets of student-collected survey data are used to examine the fairness of the data-collection process for students and the quality of the data based on comparisons with census data. The findings demonstrate this process is a useful learning tool that efficiently yields quality data. The authors provide suggestions for improvement in student survey-data collection based on their experience as both students and instructors in the experiential-education-based data-collection process.

Introduction

Instructors in a research methods course have the difficult task of presenting complicated material to a diverse set of students and trying to help them learn and stay involved with the process. This difficulty is particularly great for instructors in mass communication, who must also overcome the rigid professional orientation of many students ("Why do I care about getting good data? I just want to make ads"). In addition, the self-professed math phobia of journalism students, even at the graduate level, can make material linked to quantitative research especially hard to present.

One solution to overcoming student difficulties and disinterest is to provide experiential learning. In research methods courses, this often takes the form of a research project built on a student-conducted survey. Such projects potentially offer students the chance to use a real-world experience to better understand course concepts. Such projects also can generate data for researchers looking to build on mass communication theory and possibly reshape other courses. However, this win-win scenario is not ensured. For students, the value of the project hinges on their ability to learn through experiential learning, the fairness of data-gathering as a course-assessment tool, and their engagement with the survey process. For researchers, important questions remain about the quality of student-gathered data, especially when students may look for shortcuts to ensure a high grade. In this paper, we address the quality of data from student-conducted surveys, the fairness of the process for students, and some ways in which instructors can ensure a successful survey project in a research methods course. Using data on student performance on two class projects, census data, and qualitative observations from the perspective of both students and instructors, we evaluate the potential strengths and weaknesses of survey research projects in the classroom.

Michael P. Boyle ([email protected]) and Mike Schmierbach ([email protected]) are doctoral students in the Department of Journalism and Mass Communication, University of Wisconsin-Madison.

JOURNALISM & MASS COMMUNICATION EDUCATOR, WINTER '04

Experiential Learning

The concept of experiential learning originates with Dewey and other education philosophers, who argued that exposure to real-world materials was an important part of the learning process.1 This exposure is expected to produce several pedagogical benefits. In particular, Kolb argues that learning occurs in four stages: concrete experience, reflective observation, abstract conceptualization, and active experimentation.2 A well-designed experiential educational activity can facilitate all forms of learning, particularly active experimentation. In addition, such a project has the potential to reach students with different learning styles: students who might not grasp material presented just in traditional lectures and exams.3

In one sense, all learning is experiential in nature.4 Whenever an individual interacts with the world, that experience potentially contributes to new knowledge. However, educators interested in experiential learning generally focus on deliberate efforts to promote learning through experience. Itin labels this experiential education.5 Although more focused than the broad concept of experiential or active learning, experiential education nevertheless takes many forms, including internships, lab work, simulations, student-led debates, and case studies.6 Because of the diversity of forms, instructors in numerous fields including biology, chemistry, political science, economics, and education all have used experiential education approaches. In some cases, educators have conducted empirical assessments of these new learning approaches. In general, these studies have found support for experiential learning as an effective tool for teaching.7 More generally, Salemi outlined several reasons why active learning approaches are better than "chalk and talk" for helping students learn, including increased student engagement and greater opportunities for reflection and discussion.8

On the other hand, successful experiential education requires more than just sending students out to perform "hands-on" activities. Shor argues that promoting experiential learning involves a "transactive" relationship between instructors and students; both parties must contribute their knowledge and be engaged in the activity for the process to be successful.9 Among other things, this implies the importance of designing a project that will overcome student reluctance.10 Such reluctance can, in some cases, result in nonlearning, which can occur when students fail to join in a new learning experience or opt out of an ongoing project.11 Reluctance to participate in an active-learning experience may result from perceptions of the workload required to complete a task; fear of outcomes of the project at hand; or general unwillingness to put in the effort, among other reasons. The risk of reluctance is particularly high when communicating complex topics such as research methodology, but experiential education techniques can also be a useful way to convey such material. Brock and Cameron argue that a number of successful teaching methods can involve experiential learning in political science, and many of their examples can be extended to research methods and other courses in mass communication.12 However, Itin cautions that integrating experiential education into a methods course requires going beyond well-rehearsed lectures to create more complex projects.13 Although these projects can take many forms, they all provide direct experience and promote reflection on that experience. For example, a traditional approach to teaching interviewing techniques would be to give a lecture listing basic rules for interviewing. An active approach would require providing students with some kind of experience, whether it is interviewing the instructor or a partner, or carrying out actual survey interviews. Examples of active-learning projects implemented and described by scholars include having students do fieldwork to understand how gender concerns can be integrated into research design and theory;14 gather data to be analyzed in statistics courses;15 carry out content analysis of political advertisements;16 and conduct polls on political issues.17 In addition, much of what takes place in skills-based journalism courses can be seen as experiential education, although the need to assess all mass-communication coursework against experiential learning goals remains.18

Evaluating Experiential Learning in Research Methods Courses

Evaluating the effectiveness of experiential learning in mass communication instruction requires assessment at several levels. One form of evaluation concerns the educational outcome students get from these projects. Few researchers have made such assessments of hands-on research projects in methods courses. For example, Hakeem found that students involved in a data-gathering project performed better on a subsequent statistics exam than those not involved in the project,19 and Jones and Meinhold considered whether participating in survey interviewing might lead students to feel more positive about civic engagement and community members but found no support for such a claim.20 Although such evaluations should continue, we need to consider more fundamental questions about using experiential learning in teaching methods courses. As the discussion of nonlearning suggests, there are risks associated with nontraditional approaches to education such as experiential learning. Some students may opt out of the process, and others may be left behind if the experiential procedure is somehow unfair.

Many studies on survey methodology have focused on the impact of interviewer characteristics on the data-collection process.21 Interviewer characteristics such as gender, race, age, and training have been shown to impact survey response rates22 and willingness to answer certain questions.23 If interviewer characteristics like these affect performance, then assessments based on performance on interviewing tasks may be unfair and promote student frustrations. A fundamental principle of education is that student assessment is based on effort and performance. If other characteristics such as race or nationality influence assessments, then this principle is violated. Therefore, a fair survey project is one that would be equally challenging for all students regardless of factors such as gender or nationality. Concerns about the data-collection process also raise questions about the quality of data gathered by students. If students receive poor training, hurry through the process, or resent the project, the quality of data gathered by them may suffer. Clearly, concerns about both fairness and data quality need to be addressed as part of any assessment of a data-gathering project.

Given the scarcity of literature on this subject, this paper does not test any formal hypotheses on questions of data quality, fairness, and student learning. Rather, we provide exploratory data to broadly assess these issues in terms of two research methods courses offered at a major Midwestern university. One project was part of a course offered for more than forty years that doubtlessly has influenced the way mass communication instructors throughout the United States design survey projects as part of a research methods curriculum; the second project took place in two closely related courses at the same institution.

Methods for Study 1

The data for Study 1 were collected by two separate research methods classes (N = 131 students; N = 638 surveys) during the winter of 2001. As part of a course requirement, students were instructed to complete five surveys (M = 5.31; s.d. = 1.29) over a two-week period in late October to early November (students had the option of doing two extra credit studies, explaining the high mean). The survey dealt with a number of topics, including emotional responses to the terrorist attacks, civic engagement, and media-use habits. The demographic variables gender, class standing, and nationality were used as interviewer characteristics for this analysis. A third of the students were male (32.3%), and a majority of the interviewers were undergraduates (79.5%). About 12.7% of the interviewers were international students (all international students were also graduate students).

To measure how efficient students were in carrying out the interviewing task, we evaluated the day of interview completion. For each interview, we recorded how many days after the project start date the interview was conducted. We then used these figures to compute the mean date of completion for each interviewer. This value, average day of completion (M = 10.29; s.d. = 2.73), represents how early or late in the interviewing process the interviewer completed their task. This value excludes extra credit interviews which, by their nature, were conducted late in the process. Inclusion of extra credit interviews would distort the meaning of this variable.

Dependent Variables. As part of the calling process, students logged all calls, including refusals, no answers, and invalid numbers. From this we are able to calculate a nonrefusal rate: the number of successes divided by the sum of the successes and refusals (M = .49; s.d. = .18). To assess how arduous the calling process was for students, we consider the average length of an interview for each student. Average length (M = 23.1; s.d. = 4.57) was calculated by finding the mean reported interview length, in minutes, for each interviewer (recorded at the end of each interview). Finally, as a measure of student performance, we assessed the number of completed interviews (M = 5.31; s.d. = 1.29).

Table 1
COMPARISON OF 2000 CENSUS DATA FOR AGE WITH SURVEY DATA USED IN STUDY 1

              18-24    25-44    45-64    65 & up
Census Data   21.7%    41.4%    26.3%    11.5%
Survey Data   14.8%    40.7%    34.2%    10.2%
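The interviewer-level measures described above are straightforward to compute from call logs. The sketch below illustrates the calculations for one hypothetical interviewer; the log format, field names, and disposition labels are our illustrative assumptions, since the paper does not specify how the logs were structured.

```python
from statistics import mean

# Hypothetical call log for one interviewer. Each entry records a call
# disposition; completions also record the day offset from the project
# start date and the reported interview length in minutes.
calls = [
    {"disposition": "complete", "day": 9, "minutes": 22},
    {"disposition": "refusal"},
    {"disposition": "no_answer"},
    {"disposition": "complete", "day": 11, "minutes": 25},
    {"disposition": "invalid_number"},
    {"disposition": "refusal"},
    {"disposition": "complete", "day": 12, "minutes": 21},
]

completes = [c for c in calls if c["disposition"] == "complete"]
refusals = [c for c in calls if c["disposition"] == "refusal"]

# Nonrefusal rate: completions / (completions + refusals).
# No-answers and invalid numbers are excluded from the denominator.
nonrefusal_rate = len(completes) / (len(completes) + len(refusals))

# Average day of completion and average interview length for this interviewer.
avg_day = mean(c["day"] for c in completes)
avg_length = mean(c["minutes"] for c in completes)

print(round(nonrefusal_rate, 2), round(avg_day, 2), round(avg_length, 2))
```

In a full analysis, these per-interviewer values would then be aggregated across all 131 students to produce the means and standard deviations reported above.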

Results for Study 1

One issue with any survey is how well it represents the population being studied. This concern is amplified with student interviewers. To test the representativeness of our sample, we compared demographics from our sample with census data from 2000. These figures match well. In the survey, 50.5% of the respondents were male, compared with 49.2% in the population. The median income in the study falls in the $30,000-50,000 range, and the census estimates the median household income to be $45,500.24 Given that the census data indicate more than 90% of the population has at least a high school education, we evaluate the percentage of respondents who have at least college degrees. The census indicates that 45.9% of the community has graduated from college, while 47.5% of our sample has achieved that educational level. Finally, we compare the age distribution for the community and the study (see Table 1). In general these figures match. A notable difference, however, is in the 18-24-year-old category. The census data indicate that 21.7% of the population is between 18 and 24, but only 14.8% of survey respondents fell into this age range. One reason may be the sampling strategy for the study, which uses the
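The age comparison in Table 1 can also be checked with a chi-square goodness-of-fit test against the census proportions. The paper reports percentages only, so the sketch below converts them to approximate counts by assuming all 638 completed interviews reported an age; that assumption, and the test itself, are ours rather than the authors'.

```python
# Age categories from Table 1: 18-24, 25-44, 45-64, 65 and up.
census = [0.217, 0.414, 0.263, 0.115]  # 2000 census proportions
survey = [0.148, 0.407, 0.342, 0.102]  # Study 1 sample proportions

# Approximate observed and expected counts, assuming all 638 completed
# interviews reported an age (an assumption; the paper gives only percentages).
n = 638
observed = [p * n for p in survey]
expected = [p * n for p in census]

# Chi-square goodness-of-fit statistic with df = 3.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(round(chi2, 1))  # exceeds 7.81, the p = .05 critical value at df = 3,
                       # driven largely by the 18-24 and 45-64 categories
```

Under these assumptions the statistic is significant, which is consistent with the authors' own observation that the match is good overall except for the underrepresented 18-24 group (and the correspondingly overrepresented 45-64 group).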


Table 2
REGRESSION ANALYSIS PREDICTING NONREFUSAL RATE FOR STUDY 1

                         Nonrefusal Rate for Study 1
Gender a                  .051  (.06)
Class Standing b         -.107  (.075)
Nationality a            -.001  (.035)
Average Day Completed    -.02** (.006)

R2 total                  4.5%

* = p