ChatGPT is an artificial intelligence chatbot that's been getting a lot of news lately! You may have heard about the bot's ability to carry on a conversation and answer questions, but did you know that ChatGPT makes things up all the time? Chatbots are designed to respond to human questions in ways that make sense, but they are not programmed to check whether the information they provide is factual! This has led to something called "hallucinations," where the chatbot gives completely made-up (yet very believable-sounding) answers.
This has been a particular issue when it comes to research and scholarly citations. Over the past few months, several students have come to the library trying to find articles ChatGPT recommended for their topics - only to discover the citations were fake! Not to fear: our librarians were able to help the students find real sources on their topics, but we wanted to get the word out!
For more information on ChatGPT hallucinations and academic research, check out this great post on the topic by Duke University Libraries.