Student Attitudes Towards Generative AI Use in Teaching & Learning 

This spring, the Generative AI Professional Development Committee (AI-PDC) focused its efforts on academic integrity and students’ use of AI for coursework. As it completed its work for the semester, Paul Keys, the primary AI-PDC facilitator, reached out to the Office of Research, Planning, and Assessment (RPA) for advice on conducting focus group interviews with students. The goal was to gauge students’ use of generative AI for learning, their understanding of its capabilities, their attitudes towards faculty’s use of AI tools for teaching and assessment, and the perceived importance and relevance of AI knowledge to their career plans. RPA worked with Paul Keys and the AI-PDC to develop approximately 15 questions covering the Committee’s topics of interest. 

With help from other campus areas, six students who were well acquainted with one another and could be described as heavy users of technology, including AI tools, were recruited for the interview. All undergraduate class years were represented, though a majority of the group were upperclassmen. Several of the students were double majors; as a result, the group included four STEM majors, two social science majors, and two humanities majors. The interview was conducted in person and took a little over an hour. Audrey Taylor, an Industrial-Organizational Psychology graduate program intern with RPA, conducted the interview with assistance from RPA and AI-PDC staff.

Here are some of the key findings from the focus group interview:

  • All but one student used AI for coursework.
  • The sole non-user, a humanities major, was concerned about protecting her “creativity.”
  • AI was used most frequently when students had a deadline they needed to meet or were pressed for time in some other way.
  • ChatGPT, Gemini, and Grammarly were the most frequently used generative AI tools.
  • Students relied mostly on their own knowledge to fact-check AI-generated information; occasionally they Googled something they were not familiar with.
  • The tasks they relied on AI to complete included: article summarization, explanation of technical information, coding, and generation of study material.
  • Socio-cultural and environmental issues around AI use were not a primary concern.
  • Overwhelmingly, students would prefer that faculty educate them on ethical AI use (plagiarism, etc.) rather than simply prohibit it.
  • Students felt that better communication was needed about whether, and why, certain uses of AI might harm learning. 
  • AI was seen as particularly useful for homework tasks perceived as “busy work.” 
  • Students said they needed to be responsible about balancing AI use with actual learning. 
  • Some suspected faculty were already using AI for creating quizzes and course materials.

A more detailed summary report of the focus group results is available from RPA upon request. If you have any questions about this focus group or any other assessment-related topics, please contact Dianne Raubenheimer (raubenhe@meredith.edu) or Dilnavaz Mirza Sharma (sharmadi@meredith.edu) in the Office of Research, Planning and Assessment.

Melyssa Allen

News Director
316 Johnson Hall
(919) 760-8087
Fax: (919) 760-8330

allenme@meredith.edu