UIC’s Voice on Technology & Innovation: Thoughts on the Integration of AI in Education


Last semester, we introduced UIC’s Voice on Technology & Innovation – a CIO-moderated talk series through which the UIC community of students, faculty and staff are invited to share their diverse experiences and perspectives on popular technology-focused topics. In this series, CIO Matt Riley reviewed input from various UIC community members, shared in this feature.

UIC's Voice on Technology & Innovation Spring 2024 Prompt

How has the integration of AI in education influenced your perspective on teaching and learning, and what specific benefits or challenges have you observed in harnessing AI tools within the educational environment?

Thoughts from the UIC Community


“It’s been one full school year since ChatGPT became a global phenomenon, and overall it turned out more or less as we expected in UIC’s writing courses. Some students — not as many as we feared, but maybe one or two in each first-year writing class — have obviously used ChatGPT to cheat by submitting entire essays, or portions of essays, written by the AI.

Writing instructors knew to expect this, but maybe we ended up seeing fewer of these cases than we feared…or maybe more? I think most of us thought these cases would be harder to detect, and if there’s any silver lining, it seems an attentive reader can fairly easily spot the odd AI-generated paper. A paper that’s largely AI-generated tends to stand out among two dozen other drafts for its aberrant writing style and, in many cases, the inaccuracies and imprecisions it presents.

Whatever generative AI’s benefits and affordances may be in producing outlines for essays and summing up the internet discourse on any given topic, students need to realize that it’s never a good idea to submit AI-written text as-is, without carefully vetting its factual content. Critical thinking is required in any college course, and students’ own thought-labor in presenting ideas and connecting facts needs to guide their use of AI, not vice versa.”

Mark Bennett
Director, First-Year Writing Program



“The advent of AI technology in higher-learning environments presents an opportunity for all of us who spend time in classrooms because it challenges how we consume and disseminate information. As a graduate student, I find that the many features offered by AI-inclusive technology, such as the AI Assistant in Adobe products, mean I can more easily navigate PDFs to find the information I need. Access to other technologies like ChatGPT or Copilot means I can get more specific results when searching for information online.

My professional training allows me to discern the quality of the information I find and gives me a moral compass for addressing misuse concerns. The use of AI technology puts me front and center at the beginning of this new technological age. The last time I was this excited was in the 1990s, when the internet was coming online. As a teaching assistant, however, I find the technology presents different circumstances that center on the issue of AI’s ethical use. In the last year, I noticed increased use of AI among undergraduate students to complete homework assignments and more significant assessments.

The lack of institutional support in the way of technology tools to address this issue, i.e., a SafeAssign-like application, has perpetuated misuse of the technology and left classrooms as no-man’s-lands where anything and everything is now suspect. I honestly believe this does not have to be the case. As educators, we have an incredible opportunity to guide our students through this new age and grow our skill sets simultaneously. This is the perfect time to update syllabi, assignments, and assessments to reflect this new reality.

I would argue that any time is the right time to do this, but the advent of AI created the perfect time to perform these upgrades en masse. As British propagandists communicated to their populace in times of war, we must keep calm and carry on.”

Paul Ribera
Graduate Student, Teaching Assistant, Instructor on Record



“Since AI entered the mainstream, I have felt pressure to integrate it into my research. I am a third-year graduate student and would like to graduate within the next three years, but now I worry that without an AI component my résumé will not be as competitive when applying for jobs.”

Soraida Garcia
Graduate Student

“The ease of ChatGPT and other generative AI sites has made trust nearly impossible between instructors and students. As soon as instructors suspect that AI-written work has been submitted for assignments, trust between them and their students erodes, even when a student did not use AI to complete an assignment. As both an advisor and an instructor for first-year seminar courses, I have seen how poorly prepared young people are to enter the academic environment.

This technology has disrupted connections in classrooms and has impeded students’ ability to challenge themselves academically. In the same way most people use calculators to do simple math, these AI tools are reducing our capacity to do our own work, manage time effectively, and problem-solve. I find it increasingly frustrating to hear that my co-workers are using AI to complete basic tasks like writing a personal email to a student.”

Laura Kaczmarczyk
Honors Academic Advisor and Program Specialist

CIO Matt Riley Responds


This spring, we asked the UIC community how AI has impacted teaching and learning throughout the 2023–2024 academic year. A mix of faculty and students answered the question, and some common themes emerged that I want to address.

  1. The advent of GenAI has created trust issues in the academic space. Respondents mentioned very obviously AI-written papers and worried that students may not even realize the ramifications of turning in that type of work. Also, from a trust standpoint, there is concern that some percentage of students are using AI and getting away with it, leading to mistrust among students.
  2. Some note that students may be less prepared than ever, relying disruptively on these tools and changing classroom dynamics.
  3. On the plus side, students believe that GenAI can help them gather research and answer the questions they have, increasing their ability to think and perform at a high level in their classes.
  4. Finally, all seem to agree that more guidance on appropriate uses of AI at UIC, defined AI tools, and general help on the topic would make a difference across the academic landscape.

For those feeling anxious about the information on AI available to the academic community, I would point to three perhaps underused resources at UIC:

  1. A variety of articles and opportunities with AI for both faculty and students are available on the Learning Technology Solutions website, learning.uic.edu.
  2. Faculty can home in on the AI subject in the CATE newsletter archive at teaching.uic.edu.
  3. General information on AI tools available at UIC can be found at it.uic.edu/ai.

Colleges may also have their own guidance on AI. Please look for more information on GenAI in faculty and student newsletters when they arrive in Fall ’24.

UIC's Voice on Technology & Innovation Fall 2024 Prompt

Ensuring the security of our digital environment is a shared responsibility. How can we enhance awareness and education about cybersecurity across the university community? What measures do you believe would contribute to a more secure online environment for both personal and professional use?

Join the Conversation!