OPINION

Enter any university library today, and you see a quiet revolution.

A student sits staring at a blank document, types a few prompts into a chatbot, and out comes a research paper better than anything they could have drafted alone.

The student makes a few minor revisions and submits.

They receive a grade that will travel with them into the labour market as proof of their individual intellectual ability.

Everyone involved understands at some level that something is different here.

However, the institution acts as if nothing fundamental has changed.

Universities operate on the premise that knowledge exists within a person – a quality that can be measured academically and then packaged as a credential.

Generative AI has undermined this premise: thinking is no longer confined to a single head; it now occurs across a network.

And higher education has quietly developed a way to ignore that fact.

If universities acknowledged that the intelligence displayed in such a paper results from a symbiotic relationship between a human brain and a probabilistic algorithm, they would have no idea how to award a degree.

Economics vs education

What we are seeing is an economic problem disguised as a pedagogical one.

The way we conceptualise thought and its distribution has changed.

Universities must move from certifying that a student ‘knows something’ to endorsing how well the student ‘manages knowledge’.

When a student uses an AI tool to generate an idea, the resulting insight occurs at the intersection of the user’s direction and the algorithm’s processing.

In many professional roles, cognition is already distributed across people, tools, and networks of systems.

Yet graduation rituals still celebrate individual brilliance.

Why? The explanation lies in the political economy of the credential.

Higher education operates within the framework of Human Capital Theory, which asserts that schooling increases an individual’s productivity and thereby justifies higher wages.

In the digital economy, value is often generated by the connectivity between people and machines.

Scholars call this phenomenon ‘machinic surplus value’.

When a student writes a program with AI and analyses data with AI, the output is far greater than anything the student could produce alone.

However, because the labour market cannot assign a monetary value to a ‘cyborg’, employers continue to hire and pay individual humans.

Universities are therefore tasked with systematically suppressing the technological partner involved in creating the student’s work.

Market value

Protecting the market value of the human subject, however, comes at a very high cost for the university.

That cost is the encouragement of epistemic substitution.

Epistemic substitution occurs when a rational actor realises that they are being evaluated on the end product of their labour, and that they can achieve the highest grades by letting the AI do the ‘thinking’, ‘synthesising’ and ‘judging’.

If the university does not evaluate the process of collaboration between humans and AI, it provides students with an incentive to surrender their agency to the machine.

In addition, the university rewards with an A+ precisely those students who suppress the evidence of how the work was completed.

We are left with a certificate that lies.

The certificate asserts that the student holds in their own head knowledge that actually resides on a server in California.

We are creating an epistemological disconnect between what students learn at university and what they can do when they enter the workforce.

The degree certifies graduates for things they cannot accomplish without their AI companion.

It proves just one thing: how well they can hide the machine.

It does not even equip graduates to evaluate those same tools, because their universities pretended the tools weren’t being used.

The future

Moving forward, we will need to be honest.

Today, human agency no longer means knowing how to arrive at an answer; it means being able to manage the network of agents that generate answers.

AI has become both essential and invisible in academia. It is used by everybody, everywhere, but officially acknowledged nowhere.

Australia will feel this quickly: many of its universities sell “job-ready” graduates, while employers quietly expect staff to use AI tools because everyone else already does.

So pretending that students work alone doesn’t protect standards – it corrodes them further.

Standards are protected when we can see how a student reasoned, what they checked, where they doubted the model, and how they corrected it.

Yes, assessing the process will complicate grading. But it will be authentic.

If we keep grading the lone thinker to keep the market calm, the university will keep certifying a version of the human that no longer exists.