From ChatGPT to DeepSeek, artificial intelligence (AI) has become a common part of everyday life for some, and it is used in many ways.
Among AI’s advantages is its ability to enhance an organization’s capabilities. According to Ted Brown, an alumni-elected trustee, AI can be used as a tool to benefit Penn State.
“A couple of months ago, I did a presentation about the value of AI in the world of crisis management, risk management and so forth,” Brown said. “I see tremendous potential for AI to help us do a better job in that space, simply because you can do so much data mining with AI that you can’t do manually.”
Brown said he believes AI can be used to help increase enrollment at Penn State’s Commonwealth campuses. He emailed members of the Board of Trustees (BoT) conceptual evidence of this before presenting his proposal during a meeting on May 22.
He continued by saying AI could create new avenues of interaction between applicants and Penn State beyond traditional channels, including letting applicants reach out to club executives through AI-generated emails that carry the applicant’s contact information.
Brown said that, while his proposal was not able to prevent the closure of seven Commonwealth campuses, he believes the use of AI could help improve those that remain open.
Flowers bloom on campus on Saturday, April 19, 2025 in University Park, Pa. (Photo: Kayla McCullough)

“Even though that vote is done, even though those seven are destined to close, there are five [other] commonwealth campuses that were originally considered to be closed that are not going to be but they have similar challenges,” Brown said. “We have empty seats and every campus could benefit from what I’m talking about.”
Jason Hoss, Brown’s partner, said he believes AI can give students more choices in how they pursue their college careers.
“Ted has that passion component in connection to Penn State … and we just saw this as a potential opportunity to align our passions for a greater good,” Hoss, the CEO and founder of Whirlybird Labs, said. “The idea of this is it is not about technology telling anyone anything; it's more or less using technology to gain insights on how people might fit elsewhere when that primary choice isn’t an option.”
Hoss explained that data obtained from college applications, as well as from other sources, could be used to build profiles that AI analyzes in order to offer applicants tailored options to choose from.
“Our tastes, desires and what we’re looking for changes so the end goal is to allow that profile to be dynamically updated, not only on the student’s side but also on the school side, too,” Hoss said. “So, in thinking about things like that, as we build those college profiles as well, we can start matching what the student is looking for as well as what the school can support and enable in the student.”
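Hoss did not describe a specific implementation, but a minimal sketch of the profile-matching idea he outlines might look like the following, in which hypothetical student and campus profiles are scored against each other. Every field name, weight and sample value here is an illustrative assumption, not a description of Whirlybird Labs’ or Penn State’s actual systems.

```python
# Illustrative sketch only: a toy version of the profile-matching idea
# Hoss describes. All field names, weights and sample values are
# assumptions for illustration, not any real Penn State or Whirlybird
# Labs system.
from dataclasses import dataclass


@dataclass
class StudentProfile:
    name: str
    interests: set[str]           # e.g. {"robotics", "student media"}
    preferred_programs: set[str]  # majors the student is considering


@dataclass
class CampusProfile:
    campus: str
    programs: set[str]  # majors the campus supports
    clubs: set[str]     # student organizations, keyed by interest area
    open_seats: int     # the "empty seats" a campus wants to fill


def match_score(student: StudentProfile, campus: CampusProfile) -> float:
    """Score how well a campus fits a student's dynamically updated profile."""
    if campus.open_seats <= 0:
        return 0.0
    program_overlap = len(student.preferred_programs & campus.programs)
    interest_overlap = len(student.interests & campus.clubs)
    # Assumed weighting: academic fit counts more than extracurricular fit.
    return 2.0 * program_overlap + 1.0 * interest_overlap


def rank_campuses(student, campuses):
    """Return (campus, score) pairs sorted from best to worst fit."""
    scored = [(c.campus, match_score(student, c)) for c in campuses]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)


# Example: a student whose first choice is unavailable is matched
# against two hypothetical campuses that still have open seats.
student = StudentProfile("A. Applicant", {"robotics"}, {"Engineering"})
campuses = [
    CampusProfile("Campus A", {"Engineering", "Business"}, {"robotics"}, 40),
    CampusProfile("Campus B", {"Business"}, {"debate"}, 25),
]
print(rank_campuses(student, campuses))  # Campus A scores highest
```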
Students walk along Pollock Road near Old Main together during the first day of classes on Monday, Jan. 13, 2025 in University Park, Pa. (Photo: Chris Eutsler)

Hoss said that since AI would be using personally identifiable information (PII), there would need to be an extensive effort to protect that data and to ensure its use complies with laws such as FERPA.
David Failor, a part-time educational program specialist at Penn State Beaver, said AI’s compliance with laws such as FERPA depends on the training it receives and on the intentions of its trainers.
“If someone wants to harvest private data for nefarious use, or if the production team is not aware of security concepts, then no data is safe,” Failor said. “You could say the same thing about human teams, but we understand how the human teams operate and they either follow the rules or get fired and/or incarcerated for violations. Not so with AI.”
According to Failor, AI can also develop biases and hallucinations based on the data it receives, and it can end up relying on outdated data in its calculations.
He also stressed the importance of keeping humans involved in AI’s processes to minimize the chance of violating someone’s privacy by exposing PII.
“Just like an AI-powered ‘self driving’ car is unable to make ethical or moral decisions between the life of the driver and the life of a pedestrian in whose direction it is heading, you must have final step human oversight of AI access to sensitive data,” Failor said. “In other words, AI is very useful for locating, sorting, and finding hidden patterns in data, but it should not be used with sensitive data for the final interface between the user and the system as that still needs rigid and enforceable programming rules in place to protect PII and other information.”
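Failor’s description of a rule-based final step can be illustrated with a small sketch: here, a hypothetical AI output passes through deterministic redaction rules and is withheld until a human reviewer signs off. The regular expressions and function names are assumptions made for illustration, not part of any system Failor described.

```python
# Illustrative sketch of the "final step" Failor describes: rigid,
# enforceable rules (not the AI itself) decide what sensitive data
# reaches the user, and a human reviewer approves anything that was
# redacted. Patterns and names are assumptions for illustration.
import re

# Simple regexes for common PII patterns (assumed examples).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}


def redact(text: str) -> tuple[str, bool]:
    """Replace PII with placeholders; report whether anything was found."""
    found = False
    for label, pattern in PII_PATTERNS.items():
        if pattern.search(text):
            text = pattern.sub(f"[{label.upper()} REDACTED]", text)
            found = True
    return text, found


def deliver_ai_output(ai_text: str, human_approved: bool) -> str:
    """Final interface between the AI and the user: redaction rules run
    first, and output that contained PII still needs human sign-off."""
    safe_text, had_pii = redact(ai_text)
    if had_pii and not human_approved:
        return "Output withheld pending human review."
    return safe_text


# Example: the AI surfaces useful information, but the raw contact
# detail is held back until a person approves the redacted version.
print(deliver_ai_output("Applicant reachable at jdoe@example.com", human_approved=False))
print(deliver_ai_output("Applicant reachable at jdoe@example.com", human_approved=True))
```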