The AI Powered Campus 2: Considerations for the Future

Our recent brief, Artificial Intelligence in Higher Education, offers an optimistic vision for the future of AI on our campuses, which we also discussed briefly in this blog. A look at AI in any space should bring you to the conclusion that this is a quickly developing sector with possibilities for all markets, not just our own.

But with great potential comes a measure of risk. For all the benefits AI can have in our space, there are drawbacks to its implementation as well – including negative impacts on student privacy and the workforce. Of particular concern is the job market: many of the jobs we believe will be affected by the emergence of AI are entry-level, where new graduates begin a career path. As rote tasks like data entry and troubleshooting become automated, entry-level opportunities for new graduates may be few and far between.

That said, these are all human concerns, and human solutions can be engineered so that AI’s pending integration into our lives is a net positive and creates jobs. As educators, we owe it to ourselves and our students to advocate for changes that will create a safe place for AI to grow and our students to succeed. We recommend a few places where advocacy could be focused:

Protecting Student Privacy: AI is most powerful when it is unbound and free to roam the databases of an organization to mine for insights. However, how much freedom should AI have when performing certain tasks without human supervision—say, enrollment or marketing functions—to access the data we collect? What should it be allowed to do with it? This especially holds true for student data, which is protected by the Family Educational Rights and Privacy Act (FERPA). FERPA was last updated in 2001 as part of the Patriot Act. If we’re going to create a transparent space for AI to work on our campuses, student privacy measures should be updated to accommodate the nearly two decades of technological change that has since occurred.

Empowering Innovation and Transparency in Credentialing: As AI begins to disrupt the employment market on a greater scale, we predict that learners will need to be able to reskill and upskill more frequently and more quickly to remain competitive in the job market. This likely means that non-traditional education pathways (bootcamps, certificates, etc.) will become more ubiquitous. We believe this is a good thing, but it needs to occur transparently. Learners have a right to understand the value of a credential and how it relates to other types of credentials. We also believe that as transparency becomes more robust, certain non-traditional credentials might even prove themselves eligible for federal financial aid or VA benefits. To make this happen, we need a framework for establishing how all of these pieces fit together. A good starting point is the Credential Engine, which is seeking to establish such a framework.

Creating the 21st Century Liberal Arts: A cornerstone of American higher education is liberal arts education and the concept that our graduates should be capable of applying what they learn across a wide range of functions. In a way, this aligns well with the best practices that govern how people learn – by taking new knowledge and forming connections between it and existing knowledge. The broader the knowledge base, the quicker and more agile the learner becomes. While the focus has shifted more recently to discrete skills and structured knowledge—even within degree programs—many prestigious tech employers (such as Google and Facebook) are doing an about-face and hiring liberal arts majors again. We believe there is still a vital place for disciplines like literature, ethics and history in the modern world. We also believe there is an imperative for the canon to be updated with new disciplines and concepts. A 21st century liberal arts curriculum should be adopted to reiterate the importance of traditional disciplines, incorporate new ones such as coding, data science and information systems, and call attention to their interconnectedness and relevance in a modern world. Some programs, like The Minerva Schools at KGI, are already making progress in this area, and we believe more should join them going forward.

There is a lot more to do to create a space for AI to thrive in higher education and our learners to grow in an AI-driven labor market, and we outline several of those in our report. However, the takeaway remains the same — the challenges we face as AI proliferates require human solutions. Collective intelligence, the synergistic combination of AI power and the human mind, requires a deliberate framework to accommodate it. As educators, there’s plenty we can do to make sure that happens.

About Nathan Ackerly

Nathan Ackerly is the curriculum program strategist at Learning House. He advises partner institutions on many aspects of their academic initiatives, from new approaches to online programs and academic governance to regulatory and accreditation concerns and innovation in teaching. Prior to this role, Ackerly was an instructional designer for both Learning House and Bisk Education.