National Study of Living-Learning Programs


National Study of Living-Learning Programs Sponsored by the Association of College & University Housing Officers International

University of Maryland, College Park Customized Report June 2004

Project Collaborators:
Karen Kurotsuchi Inkelas, Principal Investigator, University of Maryland
Aaron M. Brower, University of Wisconsin
Scott Crawford, MSIResearch
Mary Hummel, University of Michigan
Duston Pope, MSIResearch
William J. Zeller, University of California, Irvine

2003-2004 Graduate Assistants at the University of Maryland:
Zaneeta E. Daver, Dawn R. Johnson, Zakiya S. Lee, Susan D. Longerbeam, Kristen E. Vogt


Customized Report Contents

I. Introduction
   Introduction
   Research Context
   Conceptual Framework
   Study Methods
   Format of the Report
   Description of the Thematic Living-Learning Program Typology
   Uses of the Data
   References
   How to Reach Us

II. Tips for Interpreting the Tables
   Tips for Tables with Percentages
   Tips for Tables with Means

III. Institutional Comparison Tables
   List of Participating Schools by Carnegie Classification
   Results for Student Inputs
   Results for Student Environments
   Results for Student Outcomes

IV. Living-Learning Comparison Tables
   List of Participating Living-Learning Programs by Type
   Legend for Participating School Living-Learning Programs
   Results for Student Inputs
   Results for Student Environments
   Results for Student Outcomes

V. Institutional Custom Questions Tables

VI. Institutional Open-Ended Responses

Appendices
   Appendix A: NSLLP 2004 Composite Measures
   Appendix B: NSLLP 2004 Thematic Typology of Living-Learning Programs

Section I
Introduction

This report summarizes the findings from your institution for the National Study of Living-Learning Programs (NSLLP), the first multi-institutional study of living-learning programs, conducted during the winter of 2004 at 34 colleges and universities across the United States. The NSLLP was developed by a collaborative team of researchers led by Karen Kurotsuchi Inkelas of the University of Maryland. The primary purpose of the research was to study the impact of living-learning programs on various student outcomes. The original collaborative team included Aaron M. Brower (University of Wisconsin), William J. Zeller (University of California, Irvine), Mary Hummel (University of Michigan), and Merrily Dunn (University of Georgia).

This study was funded by a four-year grant from the Association of College and University Housing Officers International (ACUHO-I). The first two years of the grant (2002-2003) were spent writing an extensive literature review, developing and refining the survey instrument used for this study, and conducting a pilot test of the instrument at four campuses. The national data collection occurred in year three (2004), and the final year of the grant (2005) will be used for manuscript preparation and report dissemination. Dr. Inkelas contracted with MSIResearch to conduct the data collection for the NSLLP; the MSIResearch data collection office was led by Scott Crawford and Duston Pope.

The definition of a living-learning program used to determine whether a school was eligible for the study was: programs in which undergraduate students live together in a discrete portion of a residence hall (or the entire hall) and participate in academic and/or extra-curricular programming designed especially for them. The breadth of this definition allowed for great variety in the types of programs and campuses included in the study.
For the national data collection, colleges and universities with living-learning programs on campus were eligible to participate in the NSLLP. Interested schools could participate for a fee covering data collection costs, and were provided with a final analytic dataset and a report of results. Thirty-four schools enrolled, and all successfully completed the data collection. For a complete list of participating schools, see Table I-A.

Table I-A
Participating Institutions in the 2004 National Study of Living-Learning Programs
(Number of L/L programs categories: 1-5, 6-10, 11+)

INSTITUTION NAME                               CARNEGIE TYPE
Arizona State University                       Research Extensive
Bowling Green State University                 Research Intensive
Central Arkansas University                    Masters College & Univ.
Central Washington University                  Masters College & Univ.
Clemson University                             Research Extensive
Colorado State University                      Research Extensive
Florida State University                       Research Extensive
George Washington University                   Research Extensive
Indiana University                             Research Extensive
Louisiana State University                     Research Extensive
North Carolina State University                Research Extensive
Northeastern University                        Research Extensive
Northern Illinois University                   Research Extensive
Pennsylvania State University                  Research Extensive
Purdue University                              Research Extensive
San Jose State University                      Masters College & Univ.
Southern Illinois University                   Research Extensive
Syracuse University                            Research Extensive
University of California, Irvine               Research Extensive
University of Florida                          Research Extensive
University of Illinois, Urbana-Champaign       Research Extensive
University of Maryland, Baltimore County       Research Extensive
University of Maryland, College Park           Research Extensive
University of Michigan                         Research Extensive
University of Missouri                         Research Extensive
University of North Carolina, Chapel Hill      Research Extensive
University of North Carolina, Wilmington       Masters College & Univ.
University of Northern Iowa                    Masters College & Univ.
University of Richmond                         Masters College & Univ.
University of South Carolina                   Research Extensive
University of Tennessee, Knoxville             Research Extensive
University of Vermont                          Research Extensive
University of Wisconsin                        Research Extensive
Western Kentucky University                    Masters College & Univ.

Research Context

In the last decade and a half, there has been a resurgence of interest in undergraduate education at large research universities (Boyer Commission, 1998, 2002; National Science Foundation, 1996; Ad Hoc Committee, 1987). "Shrinking" the megaversity to a manageable size for undergraduates, especially first-year students, requires administrative commitment and collaboration between student- and academic-affairs practitioners. Living-learning programs represent a significant response to, and product of, the broader movement to improve undergraduate teaching and learning.

Shapiro and Levine (1999) identify four major types of learning communities: 1) paired or clustered courses; 2) cohorts in large courses or first-year interest groups (FIGs); 3) team-taught courses; and 4) residential learning communities. The first three types are more curriculum-focused and have been examined by several national studies, including the National Learning Communities Project, the National Center on Postsecondary Teaching, Learning, and Assessment, and the Learning Community Effectiveness Project. There have been far fewer focused studies of the fourth type, the residential learning community (also known as the living-learning program), and until this study there had been no multi-institutional or national studies of this category of learning community.

At the same time, public outcry for greater accountability in higher education has prompted widespread assessment efforts in almost every corner of academe. Responding to the assessment call, individual living-learning programs have endeavored to show how their activities and services enhance various student outcomes, from retention to academic performance to intellectual and social development. While informative in discrete ways, these assessments have produced a patchwork of empirical literature on living-learning programs. Because most studies of living-learning impact were conducted by individual programs with idiosyncratic research questions and varied empirical methods, their findings are largely disconnected and limited in representativeness.

Campus leaders still need research that identifies common (not idiosyncratic) positive student outcomes across different types of living-learning programs and multiple institutional contexts, and that articulates the conditions fostering these positive outcomes so they can be built into an institution's policies, planning, and programming. This study builds on and complements previous research by introducing an innovative typology, using a standard method of inquiry across the different types of living-learning programs, and investigating a range of outcomes related to student learning and development.


Conceptual Framework

The conceptual framework for the National Study of Living-Learning Programs is based on Astin's (1993) "inputs-environments-outcomes" (I-E-O) college impact model, in which outcomes (student characteristics after exposure to college) are thought to be influenced by both inputs (pre-college characteristics) and environments (the various programs, policies, relationships with faculty and peers, and other educational experiences that impact students). Astin argues that research examining how the college environment influences student change or development will always be biased unless it controls for as many student inputs as possible. Living-learning participants come to college with diverse pre-college perceptions and experiences, and they respond differently to the variety of campus environments that mediate the impact of college and influence student outcomes. By identifying and accounting for these differences, this study provides a robust assessment of the effects of living-learning programs on student learning and development.

The environments of primary importance for the NSLLP are types of living-learning participation, faculty-student and peer interactions that occur in relation to living-learning participation, and students' perceptions of academic and social support in residence halls. The NSLLP also examines other forms of campus experience, such as enrollment patterns, quality of effort in various activities, and extra-curricular involvement. Finally, it incorporates several input measures, including demographic characteristics, high school achievement, and pre-college motivations for college attendance. This last type of measure attempts to account for students' intrinsic and extrinsic motivations that may shape their initial engagement with the college experience.

Outcomes in the NSLLP include students' perceptions of their academic and social transition to college, intellectual abilities and growth, self-confidence, diversity appreciation, civic engagement, and satisfaction, as well as reports of their alcohol use and behaviors, academic achievement, and plans for persistence. Table I-B outlines the major constructs examined through the NSLLP survey instrument.


Table I-B
Major Constructs of the NSLLP Survey Instrument
(Based on Astin's (1993) Inputs-Environments-Outcomes Model)

Inputs
• Demographics
• High school achievement
• Pre-college assessment of importance of college involvement and perceptions of self-confidence

Environments
• Peer interactions
• Faculty interactions
• Co-curricular involvement
• Alcohol-related experiences
• Use of residence hall resources
• Perceptions of residence hall climate
• Diverse interactions
• Campus racial climate perceptions
• Time spent on leisure activities

Outcomes
• Estimations of academic and social transition to college
• Perceptions of intellectual abilities and growth
• Perceptions of self-confidence
• Appreciation of diversity
• Sense of civic engagement
• Alcohol use and behaviors
• Plans to return to institution
• Self-reports of cumulative college grade point average
• Overall satisfaction

Study Methods

The NSLLP data collection was conducted using an Internet survey, with respondents contacted primarily via email. All data collected, and most of the emails sent, were handled by internal systems at MSIResearch. Each participating school provided sample lists containing students and their contact information. The sample contained two types of students: those participating in living-learning programs, and a comparable control sample of students not participating in a living-learning program. The University of Maryland's sample included 1,915 living-learning students and 1,912 control sample students.

Two sample groups were pulled in order to compare students who participated in living-learning programs with those who did not. Each group was to be drawn as a random sample (or as a census if the full population was used). Some schools instead selected a control group that matched the characteristics of the living-learning group. Demographic characteristics of the two samples were kept as comparable as possible and, when possible, the samples were drawn from the same residence halls.
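The sampling rule described above (a simple random sample, or a census when the full population is used) can be illustrated with a short sketch. This is only an illustration: the NSLLP did not publish sampling code, and the function name and roster below are hypothetical.

```python
import random

def draw_control_sample(roster, size, seed=2004):
    """Draw a simple random control sample from a roster of
    non-living-learning students. If the requested size meets or
    exceeds the roster, take a census (everyone) instead."""
    if size >= len(roster):
        return list(roster)  # census case
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    return rng.sample(roster, size)

# Hypothetical roster of 5,000 non-L/L residents; draw a control
# sample the size of Maryland's actual control group (1,912 students).
roster = [f"student_{i}" for i in range(5000)]
control = draw_control_sample(roster, 1912)
print(len(control))  # 1912
```

A matched control group, which some schools used instead, would require stratifying this draw on demographic characteristics rather than sampling uniformly.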

Instrumentation

The NSLLP questionnaire contained two main sections: the base questionnaire and the custom question section. The base questionnaire was created by the NSLLP staff through two years of review and pilot testing. The original questionnaire was pilot tested at four universities in the spring of 2003; based upon those survey results, several analyses were conducted to assess the reliability and validity of the pilot questionnaire.

Reliability was tested primarily through the internal consistency of scales designed to measure several of the constructs discussed in Table I-B. Composite measures representing the major constructs were developed using exploratory factor analysis and Cronbach alpha reliability testing. Additionally, the consistency of the scales across campuses was tested using data from each individual institution in the pilot study. Cronbach alpha reliabilities of the scales for the 2003 pilot test ranged from .623 to .898. Reliability of the scales was re-tested with the 2004 NSLLP data, and Cronbach alpha scores ranged from .624 to .918. For more information about the NSLLP composite scales, see Appendix A.

Two kinds of validity were evaluated for the NSLLP: content validity and construct validity. To establish content validity, approximately 15 living-learning program administrators reviewed the items on the instrument prior to the 2003 pilot test administration. In addition, as mentioned previously, the survey was pilot tested at four campuses in the spring of 2003, and a previous version of the survey was administered on one campus in the spring of 2002. After each administration, the content of the questions was revised for clarity. Construct validity was evaluated by investigating expected similarities within themes and dissimilarities across themes, and by studying group differences: the differences between living-learning and control students, and among demographic groups, matched higher education theory and the results of prior research.
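The internal-consistency statistic referenced above, Cronbach's alpha, is computed from the item variances and the variance of the scale totals: alpha = (k/(k-1)) * (1 - sum of item variances / variance of totals). A minimal pure-Python sketch follows; the item responses are invented for illustration and are not NSLLP data.

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.
    `items` is a list of item columns: items[j][i] is respondent i's
    answer to item j. Sample variances are used throughout."""
    k = len(items)       # number of items in the scale
    n = len(items[0])    # number of respondents
    totals = [sum(items[j][i] for j in range(k)) for i in range(n)]
    item_var_sum = sum(variance(col) for col in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Toy scale: three items answered by five respondents on a 1-5 scale
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 1],
]
print(round(cronbach_alpha(items), 3))  # 0.922
```

Values above roughly .70 are conventionally read as acceptable internal consistency, which is why the reported ranges of .623-.898 and .624-.918 support the composite scales.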
The custom question section contained some questions that were required but had custom response choices (school/college of enrollment [e.g., College of Arts & Sciences], residence hall, living-learning program). The remaining questions were truly custom and were written and provided to the NSLLP staff by each school. Custom questions were only asked of the students enrolled in the school that provided the questions.


Schools were asked to provide a logo that was shown in the upper left hand corner of every page in the survey. This helped to provide legitimacy to the survey as something in which the school was invested. If a logo was not provided, the MSIResearch logo was used in its place.

Data Collection

A data collection schedule for each participating school was created to allow for at least five weeks of data collection before spring break. Additionally, data collection did not start until at least two weeks had passed since the start of the semester. This resulted in many different schedules, with the first official day of data collection occurring on January 26, 2004. All data collection was closed on March 19, 2004. Each campus received Institutional Review Board (IRB) approval or provided an exemption letter before data collection could begin.

Email invitations were sent to each respondent, each containing a URL and a unique survey ID number used to access the survey. The unique survey ID allowed respondents to return to the spot at which they left an incomplete survey, and allowed respondents to be tracked for reminder emails. Up to three reminders were sent to those who did not complete the survey. Some schools also chose to make extra contacts in an effort to boost response rates. All schools were instructed to obtain IRB approval for any change to the original data collection protocol, including additional efforts outside the standard NSLLP protocol.

The NSLLP also encouraged participating schools to offer an incentive for students to participate; the incentive was mentioned in all email communications. The incentive used at the University of Maryland, College Park is listed below:

Sweepstakes for: $50 gift certificates from Target or Best Buy, choice of one of the following: MP3 player, DVD player, Palm Pilot, or gift certificate to Best Buy (total value not to exceed $150).

Responses

The overall national response rate for the NSLLP was 33.33%, and the total number of respondents was 23,910. This response rate is comparable to those of other studies of college student populations using a similar design, similar incentives, and similar data collection procedures. The overall responses for the NSLLP are shown in Table I-C. The final count of responses for your institution can be found in Table I-D.

Table I-C
Overall Responses for the National Study of Living-Learning Programs

Sample                    Sample Size*   Total Responses*   Response Rate*
Living-Learning Sample    33,562         12,241             36.47%
Control Sample            38,166         11,669             30.57%
Total                     71,728         23,910             33.33%

*See Table I-E for definition of terms.

Table I-D
Responses for University of Maryland, College Park

Sample                    Sample Size*   Total Responses*   Response Rate*
Living-Learning Sample    1,915          871                45.5%
Control Sample            1,912          614                32.1%
Total                     3,827          1,486              38.8%

*See Table I-E for definition of terms.

Table I-E
Definitions of Terms

Sample Size (N): The count of students who were eligible to take the survey. In most cases this is the number of sample lines provided by the school to the NSLLP staff. In some cases students were removed from the sample during or after data collection if they were deemed ineligible for the study (e.g., they were no longer a student, or they were not 18 years of age).

Total Responses: The sum of completed and partial surveys (C + P).

Response Rate: The number of completed surveys plus the number of partially completed surveys, divided by the total sample size. This is a standard way of reporting response rates accepted by the American Association for Public Opinion Research (AAPOR, 2000).
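The AAPOR-style rate defined above is simple arithmetic: completed plus partial surveys, divided by sample size. The sketch below reproduces the national figures reported in Table I-C; since the completed/partial split is not reported separately, total responses are passed as a single number.

```python
def response_rate(total_responses, sample_size):
    """AAPOR-style response rate: (completed + partial) / sample size."""
    return total_responses / sample_size

# National figures from Table I-C
ll = response_rate(12_241, 33_562)        # living-learning sample
control = response_rate(11_669, 38_166)   # control sample
overall = response_rate(23_910, 71_728)   # combined
print(f"{ll:.2%} {control:.2%} {overall:.2%}")  # 36.47% 30.57% 33.33%
```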


Aside from never logging in to the web survey to attempt the questionnaire, there were several reasons why a selected subject may not have been included in the study. First, approximately three percent of the emails were not delivered to their destinations ("bounced" emails). If a respondent's email invitation bounced, reminders were still sent to that respondent. Emails used a subject line that identified the study and appeared to come from the principal investigator. Second, less than one percent of the students selected for the survey did not give their consent to participate. The first page of the NSLLP survey was a consent screen that explained the research to respondents and asked whether they consented to take the survey; those who did not consent were not allowed to proceed and were thanked for their interest. Finally, a few students (again, less than one percent) informed NSLLP staff or local school contacts that they did not wish to participate in the study. These requests were received via email and phone calls. Respondents' right not to participate was respected: such requests were noted in the sample database and no further contacts were made with the refusing respondents.

Data Delivery

Each school was delivered an SPSS data file containing all data from its respondents. This file contained all data collected in the base questionnaire in addition to the data collected in the school's custom question section. Any sample information provided to MSIResearch was also merged into the data files. Variables included in the sample file beyond those requested (class, ethnicity, residence hall, and gender) were merged into variables named ADD1 - ADD18. All open-ended text responses were also included in this data file. Due to SPSS string-length limits, any response exceeding 255 characters was truncated; NSLLP staff can provide the full open-ended responses in a text file format if desired.

Data Analyses

Most of the survey questions were combined to form composite scales based upon the factor analysis and reliability testing described in the Instrumentation portion of this chapter. Composite scales were used instead of individual survey items because they provide stronger reliability and validity than single items, and because the individual items were often designed to be developed into composite measures. For a complete list of the composite measures and the constructs they represent, see Appendix A. Composite scales were analyzed using one-way ANOVAs, and single-item measures were analyzed using chi-square tests.

Format of the Report

The customized data are presented in Sections III and IV of this report. Section II provides tips on how to read and interpret these tables.

Section III: Institutional Comparison Tables

Section III reports the findings for your institution's entire living-learning (L/L) and non-living-learning (control) samples, as well as the statistical significance of the differences between your institution's L/L and control samples. Section III also includes the results by L/L and control samples for four types of institutions represented in the study:

1. Research extensive universities with 1-5 L/L programs
2. Research extensive universities with 6-10 L/L programs
3. Research extensive universities with 11 or more L/L programs
4. Masters universities with 1-6 L/L programs

The primary groupings for these categories were based on institutions' Carnegie classifications. The Carnegie Foundation classifies all institutions of higher education into distinct groups; the institutions participating in the 2004 NSLLP represented three groups in the Carnegie classification system:

• Research Extensive universities are generally large research institutions that award 50 or more doctoral degrees per year across at least 15 disciplines. Twenty-six of the 34 2004 NSLLP participating schools were classified as Research Extensive.

• Research Intensive universities differ from Research Extensive universities in that they offer fewer doctorates, generally at least 10 doctoral degrees per year across three or more disciplines, or at least 20 doctoral degrees per year overall. There was one Research Intensive university in the 2004 NSLLP; it was combined into the Research Extensive category for comparison purposes.

• Masters universities offer graduate education through the master's degree, awarding 40 or more master's degrees per year across three or more disciplines. There were seven Masters universities in the 2004 NSLLP.

Finally, Section III also includes the results by L/L and control samples for the entire sample of 34 institutions.

Section IV: Living-Learning Comparison Tables

Section IV reports the findings for the individual living-learning programs at your institution, as well as data from similarly themed living-learning programs represented in the study. Because of the relatively small numbers of students in each living-learning program, student input data (items such as gender, race/ethnicity, etc.) were not analyzed, in order to preserve the confidentiality of respondents. In cases where there were fewer than 10 respondents from an individual living-learning program, the respondents were either combined with other similarly themed programs or excluded from the school's living-learning comparison.

Description of the Thematic Living-Learning Program Typology

Just as there are several types of colleges and universities that differ in mission and scope, there are also multitudinous kinds of living-learning programs. One of the aims of this project was to identify as many types of living-learning programs as possible. Knowing the various kinds of living-learning programs that exist at a variety of college campuses can help us better understand not only how these programs differ in goals, objectives, and outcomes, but also what these diverse programs may share in common with their counterparts.

The thematic living-learning program typology was created by analyzing brief descriptions of each living-learning program provided by the participating schools. Of the more than 275 different living-learning programs represented in the study, 247 could be categorized by theme. The programs were sorted into 14 primary types, and several of these types were further divided into sub-categories, for a total of 26 types. The descriptions below briefly depict each of the primary categories. For the complete list of living-learning programs by type, see Appendix B.

• Civic/Social Leadership Programs (21 programs): these programs are divided into three sub-categories. The first sub-category, Civic Engagement Programs, comprises programs that focus on active participation in the political process or public service. The second, Leadership Programs, focuses on public leadership through community service or service learning. The third, Service Learning/Social Justice Programs, emphasizes community service or service learning initiatives while striving to facilitate greater social responsibility among participants.

• Cultural Programs (32): these programs are also divided into three sub-categories. International/Global Programs provide students with an internationally oriented environment in which many countries and nationalities are studied and celebrated. Language Programs afford their participants the opportunity to learn more about a specific language and culture (e.g., German, Japanese, etc.). Multicultural/Diversity Programs focus on domestic diversity issues, which can include topics related to race/ethnicity, one specific race or ethnicity, sexual orientation, or the hearing impaired.

• Disciplinary Programs (67): the programs in this eclectic group each center on a single curricular or disciplinary focus. There are seven sub-categories of Disciplinary Programs: 1) Business; 2) Education; 3) Engineering & Computer Science; 4) Health Science; 5) Humanities; 6) General Science; and 7) Social Science.

• Fine & Creative Arts Programs (22): these programs celebrate different forms of the fine or creative arts, including visual arts, music, architecture, film, prose, cuisine, and photography.

• General Academic Programs (7): these programs emphasize academic excellence and provide general support for academics in the residence hall, but have no particular curricular or co-curricular focus or theme.

• Honors Programs (22): these programs provide a rigorous and enriched curricular environment for the university's most academically talented students. Generally, these students are identified initially by their academic records (e.g., high school grades and/or SAT scores) and are invited to join the program.

• Multi-Disciplinary Programs (4): these programs are often broader umbrella organizations that house several smaller communities clustered around a specific theme. An example would be a "living-learning center" that hosts 10 or more smaller programs of varying disciplinary themes.

• Outdoor Recreation Programs (2): these programs emphasize sporting and outdoor skills. This is the only strictly non-academic theme in the living-learning typology.

• Research Programs (2): these programs offer students the opportunity to conduct academic research, either in conjunction with a faculty member's research project or in teams with fellow student participants under faculty supervision.

• Residential Colleges (7): these programs generally span several years of the participants' college experiences, offering a broad range of courses in artistic, social, and cultural pursuits. They often emphasize exploration and creativity, and most closely mirror the classic liberal arts tradition of colleges in the 18th and early 19th centuries.

• Transition Programs (30): these programs generally focus on the transition from high school to college, and/or offer students an overall introduction to the university and to college life. Two sub-categories are included in this theme: 1) New Student Transition Programs, which closely adhere to this theme's definition; and 2) Career/Major Exploration Programs, which also introduce students to different academic disciplines or vocations in order to assist them in major or career choices.

• Upper-Division Programs (4): these programs cater directly to junior and senior undergraduates with the intention of providing out-of-class experiences that complement their academic interests. These co-curricular involvements may include service learning projects, independent research studies, entrepreneurial business pursuits, and internships.

• Wellness/Healthy Living Programs (9): these programs foster healthy lifestyles through emphases such as substance-free residence environments, fitness programs, and/or health education.

• Women's Programs (18): these programs cater specifically to female students. Women in Leadership Programs focus on facilitating leadership development through opportunities such as Greek life, service learning, and cooperative living. Women in Math, Science, and Engineering Programs provide resources for women interested in pursuing majors and careers in mathematics, the sciences, and engineering, fields that have traditionally been male-dominated.

The living-learning typology can be useful for practitioners who wish to compare their results with those of the typology grouping most similar to their own program. This is similar to "benchmarking": because a typology grouping consists of thematically similar programs at several different institutions, the typology score may serve as a "standard" for programs comparable to one's own.

Finally, Section V of this report summarizes the custom questions asked only of the participants at your institution, and Section VI presents the responses that your students left for the open-ended question at the end of the survey.

Uses of the Data

Strategic use of institutional living-learning program data can give campus practitioners the ability to communicate to policymakers how living-learning programs contribute effectively to the institution’s core mission and goals:

• justification of living-learning programs as legitimate uses of limited resources;

• evidence of student learning outcomes to contribute to programmatic and institutional accreditation reviews; and

• support for the effectiveness of academic and student affairs partnerships on student outcomes.

As both public and private post-secondary institutions face challenging economic times, they may be forced to streamline or even eliminate programs that are not clearly central to advancing their missions. While living-learning programs may be at the forefront of reform efforts in undergraduate education, they are not exempt from scrutiny in difficult financial times. Living-learning program administrators and advocates cannot rest on the assumptions from which their programs were created; like all co-curricular programs, living-learning initiatives must link their programming directly to improved student learning and development.

External stakeholders such as accreditors and professional associations also require evidence of student learning and development in all aspects of college life – not only in the classroom. Several accrediting agencies have highlighted student skills and abilities of which institutions must provide evidence, including:

• analytical and information skills;

• knowledge and cognitive abilities;

• maturation in student attitudes, life skills, and involvement in co-curricular activities; and

• effective attention to student needs, experiences, and levels of satisfaction.

As a direct result of your institution’s participation in the National Study of Living-Learning Programs, you may be able to present concrete evidence of enhanced student learning outcomes when an accreditation review is scheduled. Detailed analysis of your institution’s data may benefit both your living-learning program and your campus, and a favorable review could earn the administration’s gratitude toward you and your program.



