Hallucinations
I use ChatGPT, Bing & Bard during the working week. Do you?
I use them to create catchy titles for blog posts, to write summaries of articles and sometimes to brainstorm ideas. They can also be used for more complicated internet searches.
Back in February, I started planning a trip to Louisiana (my birthplace) for later this year. I wanted to take some friends with me and needed a plan for a short tour. Now, I know the area quite well. Maybe not all the latest restaurants and nightclubs, but the places tourists want to visit don't change that often. Well, in Louisiana they sometimes change after a hurricane.
So I opened up ChatGPT (remember, this was when it was only 2 months old) and asked it for a 3-day tour with 3 stops along the way. I gave it only 2 places that I really wanted to visit, and the result was really good. To be honest, it will be about 90% of the tour we will do in October.
I later asked for an itinerary, driving plans, and hotel and restaurant ideas and saved it all in a plan to refine. What a cool use of ChatGPT.
The "success" was due to my knowledge of the area and the places we would be visiting. I could have created the same itinerary as ChatGPT, but it would have taken me a day or two. So from a time-saving perspective, it worked really well.
I am sure many of you have tried one of these new-fangled chat tools, with varying results. You may have heard about how these tools can hallucinate, or make up answers. Oh, they can make things up.
Before the summer holidays there were many articles about teachers receiving work from students and immediately knowing that it was not the student's own work but came from one of these machines. This is hallucination in action: the student either does not know much about the subject and is just passing the output along unchecked, or is lazy and does not proofread it. Either way, a road to failure.
Uh oh
Recently I needed a list of postcodes for the whole of Switzerland. (If you're interested in why, read my other articles on email marketing, or ask me.) Switzerland is a small country where 4 official languages are spoken. Interesting fact: Switzerland was the third country in the world to introduce postal codes, after Germany and the USA.
Well, I was preparing an email marketing blitz to businesses across Switzerland and had materials prepared in German and English (my French is poor and my Romansh is, well, non-existent). I needed to tag all the businesses in my database for either the English or German campaign. I figured that postcodes would be close enough to get the right campaign to the right company. So why not ask ChatGPT?
To be honest, the following example was done using Google's Bard, but the results will probably be similar. My first query was for a table of Swiss postcodes and the common language. Within seconds, I could export a table to a spreadsheet with a single keystroke. How cool was that!
The ranges it gave me were a bit too broad; as with my Louisiana experience, I knew a bit about Swiss postal codes. So I asked for a more refined list. Voilà, I got a more detailed one.
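Once you have a trustworthy table of postcode ranges, the tagging itself is trivial. Here is a minimal sketch of what I was after; the ranges and business names below are illustrative assumptions on my part (and, as this story shows, exactly the kind of detail a chatbot can get wrong), not an authoritative mapping of Swiss language regions.

```python
# Tag each business for the German- or English-language campaign based on
# its Swiss postcode. The ranges are rough illustrative assumptions:
# roughly 1000-2999 for the French-speaking west and roughly 6500-6999
# for Italian-speaking Ticino; everything else is treated as German.
NON_GERMAN_RANGES = [
    (1000, 2999),  # assumed: French-speaking Romandy
    (6500, 6999),  # assumed: Italian-speaking Ticino
]

def campaign_for(postcode: int) -> str:
    """Return 'EN' for non-German-speaking regions, 'DE' otherwise."""
    for low, high in NON_GERMAN_RANGES:
        if low <= postcode <= high:
            return "EN"
    return "DE"

# Hypothetical database rows: (company name, postcode)
businesses = [("Zurich AG", 8001), ("Genève SA", 1201), ("Lugano Srl", 6900)]
tagged = [(name, campaign_for(plz)) for name, plz in businesses]
print(tagged)  # → [('Zurich AG', 'DE'), ('Genève SA', 'EN'), ('Lugano Srl', 'EN')]
```

The lookup is simple; the hard part, as I was about to learn, is getting the ranges right.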
You'll notice that Zurich is listed in almost every area where German is spoken. I was not sure why, so I probed a bit more, even asking, "Isn't Zurich in the 8000 area?"
Its response really made me sit back and think for a moment.
"You're right. The 8000 postcode area is also part of the 3981-6493 area".
What? Really? How could it come up with that answer? It even added: "I apologize for the confusion. I will try to be more careful in the future to provide accurate information".
Well, at least it was humble and admitted it needed to improve.
I recently heard a good metaphor for these hallucinations: ChatGPT is like the neighborhood know-it-all. It really does know a lot, but it makes sh%t up when it doesn't.
Good luck with your experiences. Looking forward to my October tour!