The resources highlighted below are a curated sample of SoTL research assessment tools: instruments for quantifying elements of the student experience. We do not present these as an exhaustive list; rather, we hope that faculty will use this page as a starting point and explore the literature to identify the tools most relevant to their research questions.
Adapted from Yen et al. (2010), this survey addresses attitudes about a novel technology across three dimensions: (1) impact, (2) perceived usefulness, and (3) perceived ease of use. Students rate each statement on a Likert scale from 1 (Strongly Disagree) to 5 (Strongly Agree). The survey is useful for comparing a new technology against some traditional (but still technology-based) alternative; for example, one might use it to compare virtual reality-based class sessions with Zoom-based class sessions. This template version refers to [classroom context] (for example, recitation sections) and [discipline] (for example, biochemistry); fill in these placeholders as appropriate for the particular implementation. The technology is intentionally referred to generically as “technology” so that the same survey version can be used by all respondents. The final section, “Additional Items”, contains optional items that were not part of the original instrument: a check that students did not perceive the instructor to be more invested in teaching with one technology (i.e., in one modality) than the other (item 1), plus opportunities for free response (items 2-3). A brief scoring sketch follows the item list below.
Impact
- The technology used during my [classroom context] would be a positive addition for [discipline] students.
- The technology used during my [classroom context] is an important part of meeting my needs related to learning [discipline].
Perceived Usefulness
- The technology used during my [classroom context] makes it easier to learn [discipline].
- The technology used during my [classroom context] enables me to manage my learning more quickly.
- The technology used during my [classroom context] makes it more likely that I can learn [discipline].
- The technology used during my [classroom context] is useful for learning [discipline].
- I think the technology used during my [classroom context] presents a more equitable process for helping with learning [discipline].
- I am satisfied with the technology used during my [classroom context] for learning [discipline].
- I am able to learn [discipline] in a timely manner because of the technology used during my [classroom context].
- The technology used during my [classroom context] increases my ability to learn [discipline].
- I am better able to learn [discipline] with the technology used during my [classroom context].
Perceived Ease of Use
- I am comfortable with my ability to access the technology used during my [classroom context].
- Learning to operate the technology used during my [classroom context] is easy for me.
- It is easy for me to become skillful at operating the technology used during my [classroom context].
- I find the technology used during my [classroom context] easy to operate.
- I can always remember how to operate the technology used during my [classroom context].
Additional Items
- My professor made an effort to effectively implement the technology used during my [classroom context]. (Rated as above)
- What aspects of your [classroom context] were most useful or valuable? (Free response)
- How would you improve your [classroom context]? (Free response)
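Responses to an instrument like this are usually summarized per dimension. The sketch below is not part of the instrument; it shows one way to compute subscale means from the 1-5 Likert ratings, with illustrative item keys and an assumed item-to-dimension mapping that should be replaced with your own.

```python
# Minimal scoring sketch (not part of the instrument): summarizing 1-5
# Likert responses by dimension. Item keys and the mapping are illustrative.
from statistics import mean

SUBSCALES = {
    "impact": ["imp1", "imp2"],
    "perceived_usefulness": ["pu1", "pu2", "pu3", "pu4", "pu5",
                             "pu6", "pu7", "pu8", "pu9"],
    "perceived_ease_of_use": ["peou1", "peou2", "peou3", "peou4", "peou5"],
}

def subscale_means(response: dict[str, int]) -> dict[str, float]:
    """Return the mean 1-5 rating on each dimension for one respondent."""
    return {
        name: round(mean(response[item] for item in items), 2)
        for name, items in SUBSCALES.items()
    }

# Example: one student's ratings (hypothetical values).
student = {"imp1": 4, "imp2": 5,
           "pu1": 4, "pu2": 3, "pu3": 4, "pu4": 5, "pu5": 3,
           "pu6": 4, "pu7": 4, "pu8": 5, "pu9": 4,
           "peou1": 5, "peou2": 5, "peou3": 4, "peou4": 4, "peou5": 5}
print(subscale_means(student))
# {'impact': 4.5, 'perceived_usefulness': 4.0, 'perceived_ease_of_use': 4.6}
```

Comparing two technologies then amounts to comparing these subscale means across the two groups of respondents.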
The MSLQ was developed by Pintrich & De Groot (1990) to measure college students' academic motivation and use of learning strategies. It consists of 44 items, and students rate each statement "based on your behavior in this class" on a scale from 1 ("not at all true of me") to 7 ("very true of me"). The questionnaire features the following subscale structure:
- Motivational beliefs
- Intrinsic value
- Self-efficacy
- Test anxiety
- Self-regulated learning strategies
- Cognitive strategy use
- Self-regulation
The first link above presents the student-facing version, and the second link (the paper itself) includes an appendix that sorts the items by subscale and denotes the reverse-scored items.
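When computing subscale scores, reverse-scored items on a 1-7 scale are typically recoded as 8 minus the raw rating before items are averaged. The sketch below illustrates that recoding; the item numbers shown as reverse-scored and the subscale membership are placeholders, so consult the appendix in the paper for the actual mapping.

```python
# Sketch of MSLQ-style recoding: reverse-scored items on a 1-7 scale are
# recoded as 8 - raw before subscale means are taken. The item numbers
# below are placeholders, not the actual MSLQ scoring key.
REVERSE_SCORED = {26, 27}                  # placeholder item numbers
SELF_REGULATION_ITEMS = [25, 26, 27, 28]   # placeholder subscale membership

def recode(item: int, raw: int) -> int:
    """Return the scored value for one 1-7 rating."""
    return 8 - raw if item in REVERSE_SCORED else raw

def subscale_mean(responses: dict[int, int], items: list[int]) -> float:
    """Average the (recoded) ratings for the items in one subscale."""
    scored = [recode(i, responses[i]) for i in items]
    return sum(scored) / len(scored)

responses = {25: 6, 26: 2, 27: 3, 28: 5}   # hypothetical ratings
print(subscale_mean(responses, SELF_REGULATION_ITEMS))  # (6 + 6 + 5 + 5) / 4 = 5.5
```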
VALUE rubrics, published by AAC&U, are open educational resources that enable educators to assess students’ original work. AAC&U also offers a methodology for applying the VALUE rubrics to evaluate student performance reliably and verifiably across sixteen broad, cross-cutting learning outcomes. The rubrics address the following domains:
- Civic Engagement - Local and Global
- Creative Thinking
- Critical Thinking
- Ethical Reasoning
- Foundations and Skills for Lifelong Learning
- Global Learning
- Information Literacy
- Inquiry and Analysis
- Integrative and Applied Learning
- Intercultural Knowledge and Competence
- Oral Communication
- Problem Solving
- Quantitative Literacy
- Reading
- Teamwork
- Written Communication
The CCI (Fraser & Treagust, 1986) gives instructors insight into the extent to which a classroom is constructive and positive and, therefore, conducive to learning. Furthermore, the CCI assesses student preferences about classroom climate, which can provide a roadmap for needed changes. The questionnaire is designed for use in small classes or seminars rather than lectures or laboratory classes.
COPUS – a product of Carl Wieman's Science Education Initiative – was created to facilitate the collection of information on the range and frequency of teaching practices in STEM fields at department-wide and institution-wide scales (Smith et al., 2013). After a short 1.5-hour training period, observers can use COPUS to reliably characterize how faculty and students spend their time in the classroom.
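COPUS observations record which behavior codes occurred during each 2-minute interval of a class session (Smith et al., 2013). A common first summary is the fraction of intervals in which each code appears. The sketch below is not part of the protocol itself; the code abbreviations and the session data are illustrative.

```python
# Sketch (not part of COPUS itself): summarizing an observation as the
# fraction of 2-minute intervals in which each behavior code was marked.
# The codes and the session data below are illustrative.
from collections import Counter

def code_frequencies(intervals: list[set[str]]) -> dict[str, float]:
    """Fraction of observed intervals in which each code appears."""
    counts = Counter(code for interval in intervals for code in interval)
    n = len(intervals)
    return {code: round(count / n, 2) for code, count in counts.items()}

# Four example 2-minute intervals from one (hypothetical) class session.
session = [
    {"Lec", "L"},    # instructor lecturing, students listening
    {"Lec", "L"},
    {"PQ", "AnQ"},   # instructor poses a question, a student answers
    {"Lec", "L"},
]
print(code_frequencies(session))
# {'Lec': 0.75, 'L': 0.75, 'PQ': 0.25, 'AnQ': 0.25}
```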
SALG (see Scholl & Olson, 2014) is a flexible tool that measures the degree to which a course helps students achieve its learning objectives. SALG invites students to reflect on their own learning and how it relates to specific elements of the course.
The instrument includes five overall questions, each completed with sub-items chosen by the instructor (a brief sketch for assembling a customized version follows the list):
- How much did the following aspects of the course help you in your learning? [insert aspects of interest]
- As a result of your work in this class, what gains did you make in your understanding of each of the following? [insert concepts of interest]
- As a result of your work in this class, what gains did you make in the following skills? [insert skills of interest]
- As a result of your work in this class, what gains did you make in the following? [insert attitudinal issues of interest]
- As a result of your work in this class, what gains did you make in integrating the following? [insert sub-items that address how students integrate information]
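Because each question is completed with instructor-chosen sub-items, it can help to keep the customization in one place. The sketch below assembles a customized item list from the five question stems; the sub-items supplied in the example are hypothetical.

```python
# Sketch: assembling a customized SALG item list from the five question
# stems. The sub-items supplied below are hypothetical examples.
STEMS = {
    "aspects": "How much did the following aspects of the course help you in your learning?",
    "understanding": "As a result of your work in this class, what gains did you make in your understanding of each of the following?",
    "skills": "As a result of your work in this class, what gains did you make in the following skills?",
    "attitudes": "As a result of your work in this class, what gains did you make in the following?",
    "integration": "As a result of your work in this class, what gains did you make in integrating the following?",
}

def build_items(sub_items: dict[str, list[str]]) -> list[tuple[str, str]]:
    """Pair each stem with its instructor-chosen sub-items."""
    return [(STEMS[key], sub) for key, subs in sub_items.items() for sub in subs]

# Hypothetical customization for a biochemistry recitation.
custom = {
    "aspects": ["The weekly problem sets", "The in-class group work"],
    "skills": ["Interpreting enzyme kinetics data"],
}
for stem, sub in build_items(custom):
    print(f"{stem}\n  - {sub}")
```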
Read more about the development, validity, and reliability of SALG here.