
EducAItion in action - two US academic institutions offer lessons on AI and education beyond plagiarism paranoia

Stuart Lauchlan, May 14, 2024
Summary:
AI is impacting education and it's not going away. So how should academic bodies react? Texas Tech and the University of Nevada, Las Vegas have some homework to give to their peers.


Teaching is hard work, right? Data from a 2022 Gallup Panel Workforce Study found that K-12 education employees are the most burned-out, with 44% of respondents saying they always or often feel burned out at work, followed by college and university workers at 35%.

That general sense of exhaustion is going to have a knock-on impact on students and, longer term, on the future workforce. Another 2022 study, this time Salesforce’s Connected Student Report, found that only 11% of college students polled felt ready to join the ranks of the employed in their chosen fields.

Something’s got to give - and the answer may lie in AI. Of course the answer may lie in AI - this is 2024 and the most silver of all silver bullets is generative and waiting to be fired, for better or worse, across every sector. Education is no exception.

But there’s an immediate problem here - one of generative AI’s most notorious party tricks to date has been its ability to plagiarise. Why bother to sweat over your term paper when ChatGPT can write it for you? Of course, getting someone else to do your coursework is as old as time among students, but this time it’s free, it’s fast and it’s at your fingertips.

Image problem

So, AI has an image problem. Is it a fair one? It’s certainly a real one, admits Bala Subramanian, VP & GM of Education at Salesforce, which today launches a series of generative AI enhancements to its Education Cloud offering:

This is ultimately different from previous plagiarism, in some sense, because it's hard to catch it because gen AI is actually generating new content. It's not just copying older content. And so institutions are having to really re-work their pedagogical approach for this. For example, shifting from final paper-based exams to have more of a written assignment [model] that is in classroom, things like that.

But the good news is now we have seen a pendulum swing, where education institutions that were initially for banning AI, really now focus on teaching students how to safely and ethically use it, while still focusing on critical thinking skills. Because really, we all know that these students are going to be in a world of work, where AI is going to be in the flow of work.

Mitzi Lauderdale, Vice Provost for Academic Innovation and Student Success at Texas Tech, argues that it’s important to get to grips with AI: 

You need to lean into it instead of away from it. It's here, it's here to stay, and it's going to be evolving. The piece of advice that I would give is to facilitate the conversation and kick it off and start. And when you think about the different vantage points across campus, you need to challenge everyone to think about AI through their lens. How does it impact them in their own workplace? How does it impact systems? How does it impact governance? How does it impact the students? And how does it impact the faculty?

Sitting it out until there’s a clearer picture isn’t an option, agrees Kivanc Oner, CIO at the University of Nevada, Las Vegas (UNLV):

The place to start is really, starting and not waiting. Starting now and starting with low stakes, high impact, figuring out what those are…We need to really start somewhere and create the environment where this is not something we are fearing, but we are embracing.  What is it that is out there? How can we provide those learning opportunities for internal [use] and our community, creating work groups, creating community information and knowledge sharing? So instead of just fearing this, and getting into our old ways of, ‘How can we hide from this?’, we need to be more embracing and get it out there.

Both Texas Tech and UNLV have been proactive in this respect. At Texas Tech, Lauderdale says:

We put together a group just to talk about AI in the classroom. That's one small place to start. Let's figure out and look at the discipline of teaching and pedagogy in the classroom, and how we can craft different assessments to challenge a student to think critically. That was an easy place to start. We study pedagogy on a regular basis, so we brought faculty together, we brought students together, and it just starts the conversation. How can we use AI? How can we use data in our everyday work to maximize our ability to be both efficient and effective, and spend time where we need to spend time with the human connection? So it's really a campus conversation that we should start.

Part of that conversation should be about ‘demystifying’ AI, she advises:

I think there’s a misunderstanding of what AI is. It sounds so big and so grand. AI encompasses many different aspects. [We are] helping people understand how it impacts them directly and indirectly, and really coming up with use cases. How is it that they can maximize what they are doing? Also what are the guardrails? What are the policies and procedures around it? Because if there are policies and procedures, that does help instil some trust that the research has been done. We also have a group taskforce that has been looking at this, specifically for the classroom, and they've received positive feedback from the suggestions and the continued conversation around it.

At UNLV, it’s been a similar situation, says Oner:

Working with our faculty senate, we created an AI Taskforce sub-committee. This is not only a technology issue, this is at every level. We foresee that AI will be embedded in everything. I think that's inevitable. It will be part of everything. I think we have been in this whole hype cycle - first fear and then there's ‘Let's block everything!’.

The real questions that should be asked are different, he adds:

How do we get ready to embed this into our learning outcomes? How do we deliver this content for our students so that they are successful in their lives? What's our value here? What are we trying to do? We are trying to get our students ready for their next steps.

Don't forget the students

That last point is critical - just as the customer must be at the heart of business, the student needs to be at the center of education. AI should be seen as an enabler to provide better, more personalized connections with that target audience, says Oner:

Our mission is providing access to education for our community, for our students. And as their public higher education institution, we accept that we have to have all the resources, so we need to find and create efficiencies. We need to be able to scale our services, because our audience, our students, are expecting us to deliver services as they are used to from industry. They want the 'Amazon experience', they want the 'Apple experience', they want the 'Salesforce experience', and all of that. We have 40,000 constituents and 115 buildings. AI is going to help us provide responsive, personalized content, be more timely and effective, and deliver services in a way that is not just going to be ‘OK’.

He adds:

I believe AI is helping with delivering those things because there's a lot of interaction between many of our frontline staff, especially with our students, that requires a lot of guiding and advising. We are now using a lot of integrated systems and data to build something that can be personalized for students and we are really excited to use generative AI functionalities to be able to provide actual personalized content as the call to action.

That’s an idea that also resonates at Texas Tech as Lauderdale notes how much things have changed since her institution was founded a century ago:

When we reflect on 100 years ago, imagine the students that showed up to Texas Tech University, that was actually Texas Technological College at that point in time, and compare them to who they are today. It's obvious that there's a difference. But I would also challenge you to think about the students that showed up five years ago, and the students that will show up five years from now - very, very different.

The 'always on' consumer experience has bled into the education sector, she argues:

Students expect things at their fingertips. They expect us to be available 24/7, which is not very easy. They expect personalized learning experiences, and we now have students that show up wanting personalized degrees. They want help to choose their own pathway and their own curriculum. They expect us to know them.

This is where AI’s potential kicks in, she adds: 

AI is being utilized on the Admissions side to look at some predictive and prescriptive modelling. That is now flowing into the student experience side. Same thing with predictive and prescriptive modelling that we're working on right now. That allows us to intervene for students when we anticipate their needs before they even know that they need it.

My take

Whether we like it or not, AI is already happening around us. It's in the products we use, it's in the services we receive. So what does that mean for education?

While the comments and advice here come out of two US educational establishments, what’s being said is clearly applicable to pedagogic policy around the world. Getting past the plagiarism paranoia and the resultant negativity is going to be critical. Academic circles need to learn - or be taught - the wider potential benefits of AI in education. At best, the long-pursued ideal of the Connected Student might come much closer to reality. At the very least, they might find themselves less permanently knackered...
