Yeah I had a coworker say she likes to use ChatGPT to find answers and explanations to questions she has that you would normally Google.
This is a terrible idea. While it may contain legitimate info, ChatGPT was not designed to give factual answers. It comes up with convincing answers based on text it has read. You’re going to end up with some bad information and it’s a bit dangerous to hear that people are starting to use it that way.
The slightly dangerous part is that ChatGPT makes up convincing text that may be wrong due to misunderstanding or bias.
Fair, but that’s not much different than Google or Siri’s summaries, which are shaped by biased sites manipulating SEO.
In the end, you have to do your own research and validation to decide what to trust.