Next Gen AI: NotebookLM and Gemini – Friend or Foe?

Session Description

New directions for open AI use in academic, personal, and business contexts are virtually unlimited, but they warrant critical thought when deciding how to use AI ethically. New AI tools such as NotebookLM and Gemini are being refined and implemented on an experimental basis in online classrooms and in Google Labs. Because technology is constantly changing, assessing the value of new open AI tools depends on the user exercising critical thought. NotebookLM is an experimental product designed from the ground up around the power and promise of language models. It is also a very different kind of notebook: one in which users can draw on their existing documents and notes to learn, create, and brainstorm.

A significant component of this presentation is a case study conducted in collaboration with Purdue University Global and Google on a new AI tool, NotebookLM. This unique partnership makes Purdue Global the first institution to work alongside Google in trialing, refining, and enhancing NotebookLM through real-world application in a graduate class setting.

This presentation considers future research and the ethical implications of emerging generative AI technologies, specifically NotebookLM and Gemini. Generative AI tools hold both potential and challenges for researchers and learners. Tools such as Google’s NotebookLM and Gemini pair the power and promise of language models with existing content so users can quickly gain critical insights. As these experimental open AI tools are refined, they hold potential for use by learners, faculty, and individuals. NotebookLM can house resources and notes and can save students time by drawing on specific resources curated by the instructor. As with any new technology, the user must make ethical decisions about using any generative AI tool.

Presenter(s)

Carolyn Stevenson
Purdue University Global
Chicago, IL, USA