Birdwatch Archive

Birdwatch Note

2023-04-14 19:54:41 UTC - MISINFORMED_OR_POTENTIALLY_MISLEADING

ChatGPT and other general-purpose language models are not designed to store statistical data. The response does not pull from real data; instead, it generates its own data in a convincing way, in what is known as a hallucination. https://bernardmarr.com/chatgpt-what-are-hallucinations-and-why-are-they-a-problem-for-ai-systems/

Written by 32F1392261555917D3AE20BDC96CB81A94BE590DCDEFA9A4B7F79ABE209E5257
Participant Details

Original Tweet

Tweet embedding is no longer reliably available due to the platform's instability, both technical and policy-related. If the Tweet still exists, you can view it here: https://twitter.com/foo_bar/status/1646640890092351488

Please note, though, that you may need your own Twitter account to access that page. I am currently exploring options for archiving Tweet data in a post-API context.

All Information

  • ID - 1646965218541719552
  • noteId - 1646965218541719552
  • participantId -
  • noteAuthorParticipantId - 32F1392261555917D3AE20BDC96CB81A94BE590DCDEFA9A4B7F79ABE209E5257 Participant Details
  • createdAtMillis - 1681502081891
  • tweetId - 1646640890092351488
  • classification - MISINFORMED_OR_POTENTIALLY_MISLEADING
  • believable -
  • harmful -
  • validationDifficulty -
  • misleadingOther - 1
  • misleadingFactualError - 1
  • misleadingManipulatedMedia - 0
  • misleadingOutdatedInformation - 0
  • misleadingMissingImportantContext - 0
  • misleadingUnverifiedClaimAsFact - 1
  • misleadingSatire - 0
  • notMisleadingOther - 0
  • notMisleadingFactuallyCorrect - 0
  • notMisleadingOutdatedButNotWhenWritten - 0
  • notMisleadingClearlySatire - 0
  • notMisleadingPersonalOpinion - 0
  • trustworthySources - 1
  • summary
    • ChatGPT and other general-purpose language models are not designed to store statistical data. The response does not pull from real data; instead, it generates its own data in a convincing way, in what is known as a hallucination. https://bernardmarr.com/chatgpt-what-are-hallucinations-and-why-are-they-a-problem-for-ai-systems/
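
The fields above mirror the note schema in the public Community Notes (Birdwatch) data export. As a minimal, illustrative sketch (the file name notes-00000.tsv and the pandas-based loading are assumptions, not part of this archive), the note could be looked up by its noteId and its createdAtMillis converted back to the UTC timestamp shown in the header:

    # Illustrative sketch: locate this note in a local copy of the public
    # Community Notes data export. The file name "notes-00000.tsv" is an
    # assumption; column names follow the fields listed above.
    from datetime import datetime, timezone

    import pandas as pd

    NOTE_ID = 1646965218541719552

    notes = pd.read_csv("notes-00000.tsv", sep="\t")
    note = notes.loc[notes["noteId"] == NOTE_ID].squeeze()

    # createdAtMillis is a Unix epoch in milliseconds:
    # 1681502081891 // 1000 -> 1681502081 -> 2023-04-14 19:54:41 UTC
    created_at = datetime.fromtimestamp(
        int(note["createdAtMillis"]) // 1000, tz=timezone.utc
    )

    print(note["classification"])  # MISINFORMED_OR_POTENTIALLY_MISLEADING
    print(note["tweetId"])         # 1646640890092351488
    print(created_at.isoformat())  # 2023-04-14T19:54:41+00:00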

Note Ratings

rated at | rated by
2023-04-14 21:19:08 -0500 Rating Details
2023-04-14 20:33:01 -0500 Rating Details
2023-04-14 15:36:45 -0500 Rating Details
2023-04-14 19:10:18 -0500 Rating Details
2023-04-15 07:37:44 -0500 Rating Details
2023-04-14 20:54:10 -0500 Rating Details
2023-04-15 00:57:28 -0500 Rating Details
2023-04-14 16:12:56 -0500 Rating Details
2023-04-14 21:42:05 -0500 Rating Details
2023-04-14 16:17:00 -0500 Rating Details
2023-04-14 16:24:02 -0500 Rating Details
2023-04-14 16:51:43 -0500 Rating Details
2023-04-14 15:02:03 -0500 Rating Details
2023-04-14 15:02:23 -0500 Rating Details
2023-04-14 19:45:54 -0500 Rating Details
2023-04-14 17:18:42 -0500 Rating Details
2023-04-14 16:14:20 -0500 Rating Details
2023-04-14 18:54:29 -0500 Rating Details
2023-04-14 17:27:27 -0500 Rating Details
2023-04-14 15:47:59 -0500 Rating Details
2023-04-14 17:15:58 -0500 Rating Details
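
The ratings listed above are likewise published in the ratings files of the same data export. As a minimal sketch (the file name ratings-00000.tsv and the helpfulnessLevel column are assumptions about that export, not contents of this archive), the ratings on this note could be tallied like so:

    # Illustrative sketch: count the ratings this note received, using a
    # local copy of the Community Notes ratings export. File and column
    # names are assumptions; adjust them to the export you download.
    import pandas as pd

    NOTE_ID = 1646965218541719552

    ratings = pd.read_csv("ratings-00000.tsv", sep="\t")
    note_ratings = ratings.loc[ratings["noteId"] == NOTE_ID]

    print(len(note_ratings))                                 # total ratings on this note
    print(note_ratings["helpfulnessLevel"].value_counts())   # e.g. HELPFUL / NOT_HELPFUL tallies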