
Do your students know the consequences of AI use on an internship programme?

Students are using GenAI tools in their university work, but what about in the workplace? Are they unknowingly putting their employers at risk? Here’s how to prevent a potential data breach


University of Wollongong, Deakin University
12 Sep 2025
It’s safe to say that many university students are using generative artificial intelligence (GenAI) tools for learning, university assessments and personal use. For the most part, that’s fine. These tools are powerful, accessible and increasingly integrated into how we learn and work.

But what happens when those same tools follow students into real workplaces, where they’re handling sensitive client data, confidential reports or proprietary business information?

While the higher education sector writes frameworks and policies around the ethical use of AI tools, particular complexity lies in the work-integrated learning space – where students temporarily work and learn within an organisational setting. Although research has been slow to yield data on students’ use of GenAI in work-integrated learning, anecdotal stories and informal feedback suggest that it is happening right now.

It’s not a stretch to imagine that students might use the tools they are comfortable with for their placements. In the organisational setting, “shadow AI” – where employees use GenAI tools without workplace permission or knowledge – is pervasive and presents a significant risk to workplace data security. If an intern engages in this behaviour, often unknowingly, they can expose sensitive client information to third-party platforms, creating vulnerabilities that neither the university nor the host organisation might be prepared to manage.

Students aren’t acting maliciously. In most cases, they simply haven’t been taught where the ethical lines are drawn in professional settings, nor do they realise that the host organisation is likely to have its own GenAI policy frameworks, distinct from the university’s. As a result, they could inadvertently upload identifiable or sensitive information into GenAI tools, unaware that doing so could breach privacy legislation, contravene workplace policies or compromise client confidentiality.

Also, consider that the majority of GenAI tools used by students are accessed through personal accounts, which typically lack enterprise-level security and data governance. These platforms often have opaque data handling practices, increasing the risk of unintended data exposure and reputational harm for both the student and the host organisation.

Can we understand why students might want to use GenAI during their placements? 

Absolutely. The reality is, being an intern is tough.

Students entering a new workplace often face a steep learning curve. They’re navigating unfamiliar systems, trying to meet expectations and working under time pressure, all while still learning the ropes. In that context, turning to a familiar GenAI tool such as ChatGPT can feel like a smart, efficient move. It’s fast, it’s accessible and it helps them feel more confident in delivering work that meets professional standards.

Many students genuinely want to impress their supervisors. They know they’re only there for a short time and they want to make an impact. So when a task feels overwhelming or unclear, using GenAI to clarify, summarise or draft content can seem like a harmless shortcut. But here’s the catch: what feels expedient to the student might be problematic from the university’s or employer’s perspective.

Universities may have clear policies on GenAI use in academic settings, but those don’t always translate to the workplace. And host organisations may not have considered how student interns fit into their own GenAI governance frameworks. That gap in understanding creates risk.

So, what can we do about it?

The good news is that most work-integrated learning programmes are already well positioned to address this issue.

Many programmes already include essential documentation such as intellectual property agreements, legal contracts, role descriptions and codes of conduct. These documents could easily be updated to include clear guidance on the ethical and legally compliant use of GenAI tools in the workplace. It’s not about reinventing the wheel; it’s about evolving existing structures to meet new challenges.

Likewise, most universities offer some form of pre-placement preparation, whether through asynchronous online modules, workshops or in-class sessions. These are ideal spaces to introduce students to ethical principles, discuss the risks and responsibilities of using GenAI in professional settings, and give them a chance to ask questions and explore real scenarios.

And let’s not overlook the placement orientation itself. This is a key moment when students and supervisors meet, expectations are set and the nature of the work is discussed. It’s the perfect opportunity to talk openly about GenAI: what’s allowed, what’s not, what support is available and how to navigate grey areas. A conversation here could prevent a major issue later.

To support this work, a suite of practical resources has been developed by work-integrated learning practitioners and researchers at the Centre for Research in Assessment and Digital Learning at Deakin University. These resources are designed to help university educators, students and industry partners navigate the complexities of GenAI use across all stages of work-integrated learning: before, during and after placement. They offer structured ways to reflect on ethical, legal and practical considerations, and to build shared understanding across all stakeholders.

Because ultimately, preparing students for the future of work means preparing them to use GenAI tools – ethically, responsibly and with purpose.

Bonnie Dean is associate professor and head of academic development and recognition, learning, teaching and curriculum at the University of Wollongong. Joanna Tai is associate professor and senior research fellow; Kelli Nicola-Richmond is associate head of the School of Health and Social Development; and Jack Walton is research fellow, all at Deakin University, Australia.

