Using creative artifacts to teach scientific communication to psychology students

KatieAnn Skogsberg

Beth Ann Rice

Prompt 7.2. Submitted July 11, 2021; accepted March 3, 2023; published July 15, 2023. For the PDF version of this essay and any supplementary material accompanying it, visit https://doi.org/10.31719/pjaw.v7i2.125.

Abstract: The pandemic of 2020 forced many instructors to reevaluate their teaching and assessment practices. Assignments and assessments designed for face-to-face classes were quickly adapted to go online. Faculty-to-student relationships built through classroom interactions were transformed by the mediation of online platforms. At the time, the co-authors of this article were teaching different psychology courses at different institutions. However, we had similar concerns about the validity of our assessments in an unmonitored online environment and about maintaining personal connections with our students. We used the summer of 2020 to reimagine how our courses could be adapted to this new environment while satisfying specific learning goals, including demonstrating the ability to apply content knowledge and communicating scientific information through writing. To meet these challenges, we implemented a variation on authentic assessments. We replaced our exams with an assignment where students created artifacts of various forms to demonstrate what they had learned and how it connected to their future careers, personal interests, or real-world problems. They also had to include a written description for a nonexpert audience to demonstrate their ability to explain their artifacts. This article presents our rationale, requirements, assignments, grading rubrics, student feedback, and reflections on our experiences.


The pandemic of 2020 created a variety of disruptions in our usual ways of teaching and assessing students. For those accustomed to teaching face-to-face, the sudden shift to online learning disrupted how we connected with our students, their engagement with the coursework, and their sense of belonging to a learning community (Marler et al. 2021; Reid et al. 2022; Shin and Hickey 2021; Tulaskar and Turunen 2022). The authors of this article also recognized that the assignments and exams we had carefully designed for controlled, monitored classrooms were poorly suited for an unmonitored online learning environment. Therefore, we worked together to find a new way to assess our students, one that might not only be better suited to the online environment but could also be used in a traditional setting and help address a variety of concerns that we already had about placing so much weight on exams.

Background

At the start of the pandemic (spring 2020), we were acquaintances in an online community of behavioral neuroscience and psychology instructors. The pandemic had disrupted our usual methods of assessing students—primarily through exams and formal papers—so much that it felt like an opportunity to be bold and try something new. That summer, we worked together to revise and revitalize our courses, including exploring new ways to conduct assessments.

Even though we were excited to explore alternative assessment methods, we still had guidelines to follow. Specifically, we both needed to ensure that our course objectives aligned with the American Psychological Association (APA) guidelines for the undergraduate psychology major (2.0) (American Psychological Association 2013). These guidelines include various topics, such as content knowledge, critical thinking, ethics, communication, and professional development. Additionally, the shift to online and remote teaching had weakened the personal connections we were accustomed to forming with and among our students. Therefore, we wanted assessments that addressed the course requirements (and APA standards) and fostered feelings of inclusion and engagement.

Conrad and Openo (2018) argue that incorporating students’ personal and career interests into their assignments and assessments could help us accomplish our goals. They recommend encouraging students to incorporate their authentic selves into their work and sharing it with their classmates to improve engagement and feelings of connection and community. Similarly, Wang (2021) demonstrated that giving students the freedom to take a creative role in developing their assessments deepens their learning and improves engagement and enjoyment.

The rapid shift to remote learning also appeared to be increasing inequities among the students in our classrooms, especially when it came to taking exams. During emergency remote learning, some students had limited access to the internet, additional caregiver and family responsibilities, or could not find a quiet place to do their work (Shin and Hickey 2021). Yet many instructors feel that traditional assessment methods can create inequities even under the best conditions (Darling-Hammond and Snyder 2000; Feldman 2018; Tobin and Behling 2018; Wiggins 1990). Students taking an exam in a second language or with undocumented or untreated learning disabilities may struggle to read the questions (Feldman 2018; Tobin and Behling 2018). Others may suffer from test anxiety and perform poorly on traditional exams due to increased cognitive load (Cassady and Johnson 2002; Cohen and Khalaila 2014; Tobin and Behling 2018).

Additionally, Dr. Skogsberg had started to suspect that her exams measured students’ test-taking skills more than their actual comprehension and application of the course content. Her anecdotal evidence for this was that the students who typically asked the most interesting and challenging questions in class often performed near the mean on the assessments, suggesting they understood the material in ways that the exams did not capture. Therefore, we wanted a way to assess our students that did not rely on in-person or time-limited exams, which led us to explore the principles of Universal Design for Learning for alternative approaches (Tobin and Behling 2018; Burgstahler 2020).

Although we are focusing primarily on writing in this article, we wanted to avoid limiting our students to a specific format for the overall project. Therefore, we used the term “Artifacts” to describe the assessments. While this term may be familiar to those in writing studies, it is rarely used in STEM fields; in fact, in science, it typically refers to something unintentionally introduced into a study that biases its outcome. However, we applied the term in its more traditional sense, using the Merriam-Webster definition of “a simple object showing human workmanship…” (Merriam-Webster n.d.). Because the term is uncommon in our field, we felt it encouraged students to attempt things beyond their typical assignments in a science course. It also allowed us, as instructors, to be more open-minded about the materials we would accept. For example, in an introduction to psychology course, a student majoring in economics may write a paper using material from the chapter on emotions to examine economic principles. An artist could create a model of the brain using their preferred medium, or a computer scientist could write code demonstrating a simple neural network (see the sketch below).
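To give a concrete sense of what such a code artifact might look like, the following is a minimal sketch of our own, not a student submission or a required format: a tiny two-layer neural network, written in Python with numpy, that learns the XOR problem. The architecture, data, and hyperparameters are illustrative choices; a student submitting something similar would still need to pair it with the required written explanation connecting the code to course concepts (e.g., how simple units and adjustable connection weights can model learning).

```python
# Minimal illustrative sketch: a 2-4-1 neural network trained on XOR with numpy.
# This is our own example of the kind of code artifact a computer science
# student might submit; nothing here is prescribed by the assignment.
import numpy as np

rng = np.random.default_rng(seed=42)

# XOR inputs and target outputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialize the weights of a 2-input, 4-hidden-unit, 1-output network
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 0.5
for epoch in range(10000):
    # Forward pass: compute hidden-layer and output activations
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: gradients of the squared error through the sigmoids
    error = output - y
    d_output = error * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

    # Gradient-descent weight updates
    W2 -= learning_rate * hidden.T @ d_output
    b2 -= learning_rate * d_output.sum(axis=0, keepdims=True)
    W1 -= learning_rate * X.T @ d_hidden
    b1 -= learning_rate * d_hidden.sum(axis=0, keepdims=True)

# After training, the outputs should approach [[0], [1], [1], [0]]
print(np.round(output, 2))
```

In a course like ours, the code itself would be only half of the artifact; the accompanying written explanation for a non-expert reader, with the relevant course terms correctly defined and applied, is what the rubric would actually assess.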

Rationale

To create our new assessments, we applied the principles of backward design (Reynolds and Kearns 2017; Wiggins and McTighe 1998). As noted earlier, we needed to ensure that our assessments addressed the APA guidelines, which include five learning goals (LG): 1) Knowledge Base in Psychology, 2) Scientific Inquiry and Critical Thinking, 3) Ethical and Social Responsibility in a Diverse World, 4) Communication, and 5) Professional Development (American Psychological Association 2013). The activities we describe here can address all these learning goals, but the examples we will share focus primarily on demonstrating knowledge of psychology (LG 1), scientific inquiry and critical thinking (LG 2), communication (LG 4), and professional development (LG 5). Additionally, the communication goal includes the objective “demonstrate effective writing for different purposes” (LG 4.1), and the professional development goal includes “apply psychological content and skills to career goals” (LG 5.1).

The term “authentic assessments” is typically applied to assignments that align with the student’s interests and prepare them for work they will do in their professional lives or address a real-world problem (Conrad and Openo 2018; Mueller 2005; Wiggins 1990; Zilvinskis 2015). After graduation, there are few situations where our students will be taking monitored exams, writing traditional research papers, or completing activities where the correct answer is already known. Therefore, we wanted assessments that reflected their interests or career trajectories. For example, since Dr. Skogsberg’s Introduction to Psychology course satisfies a general education requirement, most of her students were majoring in other fields, such as biochemistry, computer science, economics, history, international studies, mathematics, and theater. Their career goals included becoming business leaders, doctors, programmers, lawyers, politicians, artists, and educators. Therefore, the students were encouraged to create artifacts that reflected their interests and potential career goals. Their artifacts included creative writing, visual media, models, computer programs, music, and in some cases, traditional research papers or research proposals.

Since we are not poets, computer scientists, or artists, we assessed our students on two primary factors. The first was the ability to accurately interpret and apply terms and concepts from the course (APA LG 1, 2, and 5). Psychology is a field many people think they understand until they are evaluated on their ability to apply its concepts. Because of this, many non-psychologists use terms that have precise meanings in psychology in ways that misrepresent them. Perpetuating such misconceptions in everyday conversations, books, and films can harm those with psychological disorders and those who care for them. Therefore, correctly explaining and providing examples of psychology terms and concepts is an essential skill for psychology students.

The second primary assessment factor was effectively explaining the artifact to a non-expert audience in a written format (APA LG 4 and 5). Science students often use scientific terms and jargon, assuming that their audience will understand. They also seem to believe that using the field’s jargon makes them sound more knowledgeable. However, research shows that scientific jargon confuses and alienates non-expert readers and can make them less likely to believe what they are reading (Oreskes 2021; Woolston 2020). Additionally, Boyd and colleagues (2020) point out that writing for a non-expert audience is an important skill many science students lack. These writing issues also apply to students pursuing non-science fields: artists, business leaders, economists, doctors, lawyers, programmers, and politicians all need to be able to communicate with others who do not share their expertise. Therefore, this writing assignment requires perspective-taking while also revealing a great deal about what the student does and does not accurately understand about the content.

Audience

We implemented these alternative assignments at two distinctly different institutions. Dr. Skogsberg taught an Introduction to Psychology course at a small liberal arts college. The course is required for psychology and behavioral neuroscience majors and satisfies a general education requirement for non-majors. Before the pandemic, students were assessed primarily on exams and lab activities, with one writing assignment (an analysis of two empirical papers) spread across the semester. In the fall of 2020, the labs remained, the analysis paper was dropped, open-book online quizzes replaced the exams, and the artifacts became the primary grade-determining component.

Dr. Rice was at a mid-sized public university, where she taught an upper-level course on Learning. Specifically, this course focused on the fundamental mechanisms of learning new behaviors. The Learning course is an elective for psychology students and a requirement for neuroscience students. Before the pandemic, the students were assessed primarily on exams, lab activities, and a literature review. In the fall of 2020, the lab experiences and lab reports remained; however, the artifacts comprised 60% of the final grade. Students taking Learning are typically juniors who have previously taken both Introduction to Psychology and Psychological Science, the latter covering basic research methods and scientific writing using APA format. The Learning course also meets the requirements for being a high-impact practice (HIP) course at Dr. Rice’s institution. High-impact practice courses go through a review to earn this designation and require that students learn not only content but also a skill. The HIP designation for this course was undergraduate research.

Assignment

Even though we were teaching different courses to students from different populations, the assessments were administered similarly. Instead of taking an exam or writing a traditional term paper over a specific content unit, our students created artifacts with a written explanation to demonstrate their proficiency in understanding and applying the material. Each explanation needed to include at least five key concepts from the unit.

In alignment with the APA guidelines, the learning goals for both courses included demonstrating knowledge of key concepts in psychology (APA LG 1), connecting concepts to real-world problems or applications (APA LG 2, 4, and 5), and writing effectively for different purposes (APA LG 4 and 5), which includes correctly applying APA formatting (APA LG 4.1d). These learning goals were outlined in our rubrics, which listed the requirements to ensure that all submissions had similar levels of rigor. These requirements included correctly defining and applying a specific number of terms per chapter, page, or time length (depending on the medium), correctly using in-text citations, and providing a complete reference list in APA format. Example rubrics and instructions can be found in the online supplementary materials to this article.

The number of terms per chapter was based on the number of questions we typically ask on an exam; for Dr. Skogsberg’s class, this was five terms per chapter. Since exam questions typically incorporate multiple concepts, we required our students to use the terms in an example or explain them in context. We recognize that this approach may allow some students to explore certain topics in depth while ignoring others, but we were willing to make this trade-off.

To determine page and time length requirements, we talked with colleagues specializing in fields such as poetry, film, photography, painting, pottery, theatre, and computer science. We asked for their input on what would be equivalent to a three- to five-page essay. Admittedly, this was not scientifically derived but simply their best guess.

Students were also required to use correct in-text citations, provide a complete reference list, and properly credit their sources within the artifacts themselves. Even though not all of our students will enter fields that use APA format, we felt that learning to give credit where credit is due was important. We chose APA format over other citation styles because adapting to and recognizing different citation formats is something students will likely need to do in other courses and future careers (e.g., as doctors, lawyers, and politicians). Recognizing the subtle differences between citation formats and adapting to new requirements also teaches students to pay attention to details and follow instructions, skills that transfer to a broad range of life and career goals.

The students developed their artifact ideas through an iterative process that started with a proposal submitted for the approval of the instructor, then refined through peer feedback sessions. The proposal consisted of a single paragraph explaining what topic they wanted to cover and what format they planned to use. They were given free rein to decide what they wanted to do and encouraged to tap into their interests or career goals. Some students had ideas immediately, whereas others struggled to develop ideas independently. For those students who struggled, we asked probing questions about their career and life goals and what mediums they felt comfortable using. Some students opted for what they perceived as the safe route of writing a research paper. However, recognizing that they could do something they enjoyed was often a revelation. For example, one student in Dr. Skogsberg’s class wrote and recorded a video of himself playing the guitar and singing a country-western song about how to deal with stress and anxiety. The key was helping the student recognize how they could apply the material in a way that made it personal, engaging, or enjoyable.

Once the students had a topic and an idea, they met with a small group of classmates to discuss their ideas. The purpose of these small group meetings was to help them flesh out and talk through their ideas. Each student was required to document the questions they asked about their peers’ work and how they responded to the questions asked of them. Students then reflected on these meetings, writing a short paragraph explaining whether and how the meetings had helped them refine and improve their ideas. They were also required to submit a proposed timeline outlining the steps they would need to take to complete their artifact before the deadline. Beyond this, we did not provide additional instruction or support for developing the artifacts. We wanted our students to take ownership of their artifacts by allowing them to explore a new approach to demonstrating their knowledge and giving them the freedom to showcase skills or abilities they might not otherwise have the chance to use in our courses.

When the final artifact was completed, students submitted their materials to the learning management system or a virtual drop-box. Since the artifacts consisted of various formats, students were allowed to submit photos, videos, or audio recordings of their projects as long as they were in a universally readable format (e.g., JPG, PDF, or MP4). A synchronous class period was then used to hold an “Artifact Showcase” session where students were encouraged to share their artifacts with their classmates. Students were not required to show their artifacts, but participation was encouraged by having them vote for the top three artifacts, all of which received a small prize.

To ensure that all students could explain their artifacts to non-experts, each submission had to include a separate written explanation of the artifact in one to three pages. This written submission included an explanation of what the artifact did or was about, definitions of the terms used, and an explanation of how the terms were used in the artifact. Because reflecting on one’s own learning has been shown to improve overall learning outcomes (Boyd, Basgier, and Wilson 2020; Norman et al. 2019), metacognition was also encouraged. In psychology, metacognition refers to consciously reflecting on one’s thinking process in an attempt to regulate, control, or learn from it (American Psychological Association n.d.; Norman et al. 2019). Specifically, in addition to explaining the artifact and how it worked, students connected it to their interests or a real-world problem and reflected on their development process. This latter part required them to examine what they learned about the topic and about themselves while creating the artifact.

In the Learning course, students had additional learning outcomes focused on writing about their research. Consequently, students had to support their artifacts with evidence from empirical research (APA LG 2). Artifacts that were not traditional papers (e.g., infographics) included an annotated bibliography in APA format, explicitly highlighting how the literature informed and supported the artifact. Regardless of artifact format, students were required to follow APA formatting guidelines (APA LG 4 and 5). For example, if students chose a voice-over PowerPoint, they needed to follow APA citation and formatting conventions when presenting their artifact and the annotated bibliography. This integration of research with the artifacts allowed students to actively review the literature while demonstrating content knowledge (APA LG 1 and 2) and the ability to apply learned knowledge to a new topic (e.g., solving a real-world problem). Lastly, the assignment allowed students to practice skills previously learned in other courses (e.g., research methods) and refine their written and oral communication skills in the psychological sciences (APA LG 2 and 4).

In Dr. Skogsberg’s Introduction to Psychology course, the students’ writing abilities ranged from having little or no experience with APA formatting to demonstrating strong scientific writing skills. To address these differences, she provided the students with the grading rubric (see Supplementary Materials) and examples from previous courses that she had modified to suit the current assignment. They were also given specific links to resources from the Purdue Owl website (“APA Style Introduction” n.d.) to review. In the Introduction to Psychology course, artifacts were graded on a “meets expectations” or “does not meet expectations, revise and resubmit” basis. Students were allowed a limited number of “tokens” to exchange for the opportunity to revise and resubmit an assignment based on feedback.

One of the prerequisites for Dr. Rice’s Learning course is a research methods course (Psychological Science), where students learn the basics of scientific writing in APA format. In her Learning course, Dr. Rice emphasized refining and improving these skills. To ensure the students had multiple opportunities to practice their writing, they submitted drafts to their groups a week before they were due and conducted a peer review of their group members’ artifacts using the rubrics provided. Following the peer review, students submitted their revised drafts for feedback from the professor before submitting the final artifact. The peer review and the draft were low-stakes assignments, allowing for some accountability while also providing ample opportunities for the students to clarify any confusion before submitting the final artifact. Rubrics were developed to be flexible enough to be used for various artifact types yet rigorous enough to cover multiple learning objectives.

Student Works

This section provides specific examples of student works, shared with their permission, to illustrate our assignment. In Dr. Skogsberg’s Introduction to Psychology course, a math major initially struggled to connect topics from psychology with her academic interests. To her, the two fields seemed nearly mutually exclusive. However, an in-class discussion of how stereotype threat impacts women in math caught her interest. She started exploring the literature and found several empirical papers showing that women experience more math anxiety than men. Reflecting on her own experiences, she combined topics from the chapters on social psychology and research methods to propose an experiment using jigsaw classrooms to reduce ingroup/outgroup bias and stereotyping and to improve feelings of belonging and empathy. This example and several others can be found in the online supplementary materials to this article.

By completing this self-designed artifact, the student demonstrated deep content knowledge (APA LG 1) and scientific inquiry and critical thinking (APA LG 2) in a personally meaningful way that is unlikely to be captured by an exam. As a STEM major, she was able to write in a different format and with a different purpose than her usual work (APA LG 4) and to apply psychology topics to her interests and career goals (APA LG 5).

A student in the Learning course wrote an original work of fiction accompanied by an annotated bibliography and a vocabulary list (see the student example, Strange Visitor, in the linked file). The story follows two children who discover an alien visitor. In it, the student accurately demonstrated the learning concepts discussed in that unit (APA LG 1) and applied them to a new situation (APA LG 2 and 3); specifically, the children use learning concepts such as fatigue and habituation to understand the behavior of the alien creature. Notably, the student researched beyond the textbook to support the development of the methods described in the story and provided this information in the annotated bibliography submitted along with the story (APA LG 4 and 5).

Student Feedback

End-of-term course evaluations administered by each institution provided anonymous feedback about the courses and, in some cases, specifically about the artifacts. Students in these courses were given open-ended questions as part of their course evaluations. These questions captured the students’ perceptions of how the artifacts helped improve their research and writing skills. Overall, the students responded favorably.

Examples of student qualitative responses

“...I now feel more confident about reading/comprehending Scientific Literature.” (APA LG 1 and 2)

“From all of the papers that we wrote, I became a better writer and I learned how to better cite sources. A big portion of our grade had to do with creatively writing, so I became better at applying content to real-world scenarios” (APA LG 2 and 4)

“I believe I became better at writing through applying the material we learned to the real world/our goals. I also got better at pitching my ideas to peers and that helped to push me to be more creative.” (APA LG 2 and 4)

“The implementation of the artifacts helped us to make connections to all of these (e.g. social issues) and more (APA LG 1 and 3). I chose to make connections between the material and my career choice but I got to listen to other students who were making connections to their personal interests along with social issues.” (APA LG 5)

“The artifacts were very intimating and still are as we approach our last one, but I think they test our knowledge on a topic of interest just enough.” (APA LG 1)

While most of the comments about the artifacts were positive, the last comment indicates that a few students struggled with the open-ended format of the assignment. Generating ideas and following through on them were greater challenges for the students than meeting the more prescriptive writing requirements. They reported feeling anxious about not having explicit instructions on what to do. We reassured them by pointing to the rubric and reminding them that as long as we could identify those specific elements, the format did not matter. We also reminded them that we were not grading them on their artistic or creative abilities. It did not matter if they used stick figures, watercolor paintings, sonnets, or rap lyrics. If the content was accurate and they could explain it, they would earn a passing grade.

Even with specific rubrics to follow, several students in both courses needed to revise their assignments to meet the writing requirements. The revision process allowed us to correct misunderstandings and helped students recognize their mistakes. To help students correct their errors, we used the rubric to note which elements they had not completed satisfactorily. Sometimes, providing specific feedback or meeting with students was necessary to correct conceptual errors or subtle APA formatting issues. While time-consuming, this revision process proved useful in helping students correct their understanding and interpretation of the material.

Student quantitative responses

In the Learning course, students provided feedback on several course-related statements on a scale from one (strongly dissatisfied) to four (strongly satisfied). Average scores ranged from 3.5 to 3.63 (SD range 0.52-0.53) across the following statements: the objectives of the class were clear; feedback allowed for understanding and improvement in the course; the class provided opportunities to develop critical thinking skills; the course provided opportunities to seek more knowledge about the course subject; and they learned much that was valuable to them. Because the majority of points in this course came from the artifacts (60%), these results suggest that the artifacts played a significant role in these scores. Additionally, students in the Learning course had an opportunity to give feedback in open-ended questions: before the artifacts were introduced, 15% of students reported that the workload was too much, a figure that dropped to 10% once the artifacts were implemented.

Instructor Responses

What started as a solution to an abrupt move to an online environment amid a pandemic ended with a product that addressed our course learning goals in flexible, inclusive, and engaging ways that we plan to continue using. While the students overwhelmingly reported enjoying creating the artifacts, we were also able to conduct assessments that aligned with the APA learning goals and tapped into their existing skills and interests, allowing them to engage with the material in memorable and meaningful ways.

As instructors, we initially found it intimidating to open the assignments to various formats. It can be difficult to interpret whether a student understands a topic like schizophrenia from their interpretive painting or stick figures, but written definitions and explanations of how the terms are applied are much easier to assess. We leaned on the rubrics, which allowed us to assess all submissions on concrete, specific writing requirements without having to be experts in the vast array of formats the students explored. Additionally, we found the assignments much more enjoyable to grade than typical exams or papers; we both preferred reading creative stories about aliens and watching funny videos about flossing to grading another term paper or exam. We also felt that we got to know our students better by learning about their other talents and interests.

When we compared previous course evaluations and the assessments themselves, it was evident that our students were more engaged and generally more enthusiastic about the course concepts. For example, in the Learning course, there were notable positive changes in the student evaluation responses. Before the pandemic, multiple assessments were used to measure content knowledge (e.g., exams and quizzes; APA LG 1) and critical thinking and scientific writing skills (e.g., essays and papers; APA LG 2 and 4). During the pandemic, artifacts replaced these types of assignments and, in turn, appeared to reduce the students’ workload. There were also fewer emails and requests for extensions compared to traditional assessments. The drafts and feedback exchanged during the development of the artifacts allowed the instructors to intervene early, helping to accommodate the variability in our students’ prior content knowledge and writing skills. Additionally, the flexibility of the assignments allowed students to play to their strengths while demonstrating their understanding of course material in ways that exams and traditional assignments often miss.

Conclusion

While the pressures of the pandemic pushed us to experiment with creative artifacts as assessments, our experiences have convinced us to keep them even after we have returned to in-person instruction. The quality and creativity demonstrated by our students in these assignments exceeded our expectations, as illustrated by the examples of the creative story about aliens, the math anxiety experiment, and the videos about flossing.

In retrospect, both instructors feel that asking students to demonstrate depth of understanding and to connect the material with their personal and real-world interests was more engaging and effective than asking them to memorize and replicate content, as traditional assessments typically do. Additionally, the students informally reported that they appreciated the flexibility, felt more engaged, and enjoyed the opportunity to demonstrate their creativity on these assignments. Importantly, as evidenced by the student evaluations, the artifacts effectively increased student engagement and provided the instructors with practical ways to assess the students’ ability to meet the course learning goals. Lastly, with the help of a good rubric, grading these assignments was more enjoyable and less monotonous than grading traditional assessments, and we learned more about our students’ interests, talents, and lives outside of our classrooms than we would have otherwise. Further, our experiences show that artifacts, as authentic assessments, can be used in various courses, regardless of level or content. Therefore, as our campuses return to in-person instruction, we will continue using artifacts as authentic assessments in courses where appropriate.


ASSIGNMENT: Artifact instructions

[Editor note: The grading rubric that appears with this assignment is included as a supplement to this article (see Supplementary Materials).]


References

American Psychological Association. n.d. “APA Dictionary of Psychology.” Accessed December 29, 2022. https://dictionary.apa.org/.

———. 2013. “APA Guidelines for the Undergraduate Psychology Major: Version 2.0.” http://www.apa.org/ed/precollege/undergrad/index.aspx.

“APA Style Introduction.” n.d. Purdue Online Writing Lab. Purdue Writing Lab. Accessed June 22, 2022. https://owl.purdue.edu/owl/research_and_citation/apa_style/apa_style_introduction.html.

Boyd, R., C. Basgier, and C. Wilson. 2020. “Repurposing Scientific Writing in Conservation Biology.” Prompt: A Journal of Academic Writing Assignments 4 (1): 3–17. https://doi.org/10.31719/pjaw.v4i1.53.

Burgstahler, Sheryl. 2020. “Universal Design of Instruction (UDI): Definition, Principles, Guidelines, and Examples.” DO-IT University of Washington. https://www.washington.edu/doit/universal-design-instruction-udi-definition-principles-guidelines-and-examples.

Cassady, J., and R. Johnson. 2002. “Cognitive Test Anxiety and Academic Performance.” Contemporary Educational Psychology 27: 270–95. https://doi.org/10.1006/ceps.2001.1094.

Cohen, M., and R. Khalaila. 2014. “Saliva pH as a Biomarker of Exam Stress and a Predictor of Exam Performance.” Journal of Psychosomatic Research 77 (5): 420–25. https://doi.org/10.1016/j.jpsychores.2014.07.003.

Conrad, D., and J. Openo. 2018. Assessment Strategies for Online Learning: Engagement and Authenticity. Athabasca University Press. https://doi.org/10.15215/aupress/9781771992329.01.

Darling-Hammond, L., and J. Snyder. 2000. “Authentic Assessment of Teaching in Context.” Teaching and Teacher Education 16 (5): 523–45. https://doi.org/10.1016/S0742-051X(00)00015-9.

Feldman, J. 2018. Grading for Equity: What It Is, Why It Matters, and How It Can Transform Schools and Classrooms. SAGE Publications.

Marler, E. K., M. J. Bruce, A. Abaoud, C. Henrichsen, W. Suksatan, S. Homvisetvongsa, and H. Matsuo. 2021. “The Impact of COVID-19 on University Students’ Academic Motivation, Social Connection, and Psychological Well-Being.” Scholarship of Teaching and Learning in Psychology. https://doi.org/10.1037/stl0000294.

Merriam-Webster. n.d. “Artifact.” Accessed December 28, 2022. https://www.merriam-webster.com/dictionary/artifact.

Mueller, J. 2005. “The Authentic Assessment Toolbox: Enhancing Student Learning Through Online Faculty Development.” Journal of Online Learning and Teaching 1 (1): 7.

Norman, E., G. Pfuhl, R. G. Sæle, F. Svartdal, T. Låg, and T. I. Dahl. 2019. “Metacognition in Psychology.” Review of General Psychology 23 (4): 403–24. https://doi.org/10.1177/1089268019883821.

Oreskes, N. 2021. “Scientists: When Talking to the Public, Please Speak Plainly.” Scientific American, October. https://www.scientificamerican.com/article/scientists-when-talking-to-the-public-please-speak-plainly/.

Reid, M. P., S. M. Ghose, A. R. MacPherson, S. M. Sabet, C. M. Williams, and N. D. Dautovich. 2022. “Learning in the Time of COVID: Undergraduate Experiences of a Mid-Semester Transition to Virtual Learning Due to the COVID-19 Pandemic.” Teaching of Psychology 00986283221082987. https://doi.org/10.1177/00986283221082987.

Reynolds, H. L., and K. D. Kearns. 2017. “A Planning Tool for Incorporating Backward Design, Active Learning, and Authentic Assessment in the College Classroom.” College Teaching 65 (1): 17–27. https://doi.org/10.1080/87567555.2016.1222575.

Shin, M., and K. Hickey. 2021. “Needs a Little TLC: Examining College Students’ Emergency Remote Teaching and Learning Experiences During COVID-19.” Journal of Further and Higher Education 45 (7): 973–86. https://doi.org/10.1080/0309877X.2020.1847261.

Tobin, T. J., and K. T. Behling. 2018. Reach Everyone, Teach Everyone: Universal Design for Learning in Higher Education. 1st ed. West Virginia University Press. http://wvupressonline.com/node/757.

Tulaskar, R., and M. Turunen. 2022. “What Students Want? Experiences, Challenges, and Engagement During Emergency Remote Learning Amidst COVID-19 Crisis.” Education and Information Technologies 27 (1): 551–87. https://doi.org/10.1007/s10639-021-10747-1.

Wang, B. 2021. “If They Build It: Student-designed Assignments in a Molecular Biology Laboratory.” Prompt: A Journal of Academic Writing Assignments 5 (2). https://doi.org/10.31719/pjaw.v5i2.85.

Wiggins, G. 1990. “The Case for Authentic Assessment.” Practical Assessment, Research & Evaluation 2 (2). https://doi.org/10.7275/FFB1-MM19.

Wiggins, G., and J. McTighe. 1998. What Is Backward Design? 1st ed. Merrill Prentice Hall. https://web.archive.org/web/20160721163755/http://www.fitnyc.edu/files/pdfs/Backward_design.pdf.

Woolston, C. 2020. “Words Matter: Jargon Shuts Readers Out.” Nature 579: 309. https://doi.org/10.1038/d41586-020-00580-w.

Zilvinskis, J. 2015. “Using Authentic Assessment to Reinforce Student Learning in High-Impact Practices.” Assessment Update 27 (6): 7–13. https://doi.org/10.1002/au.30040.