Birdwatch Archive

Birdwatch Note Rating

2024-01-19 02:56:05 UTC - NOT_HELPFUL

Rated by Participant: 005BA1AB3AB5DD562D851FDF2EB6C89C3F5EF0693295898B1F07A3DDC0ACF388

Original Note:

The tweet misstates that Meta used 600,000 H100 GPUs to train Llama 3. In reality, Meta will have the equivalent of 600,000 H100 GPUs in computing power by year-end, with no indication all are for Llama 3. https://www.theverge.com/2024/1/18/24042354/mark-zuckerberg-meta-agi-reorg-interview


All Information

  • noteId - 1748100277331271705
  • participantId -
  • raterParticipantId - 005BA1AB3AB5DD562D851FDF2EB6C89C3F5EF0693295898B1F07A3DDC0ACF388
  • createdAtMillis - 1705632965021
  • version - 2
  • agree - 0
  • disagree - 0
  • helpful - 0
  • notHelpful - 0
  • helpfulnessLevel - NOT_HELPFUL
  • helpfulOther - 0
  • helpfulInformative - 0
  • helpfulClear - 0
  • helpfulEmpathetic - 0
  • helpfulGoodSources - 0
  • helpfulUniqueContext - 0
  • helpfulAddressesClaim - 0
  • helpfulImportantContext - 0
  • helpfulUnbiasedLanguage - 0
  • notHelpfulOther - 0
  • notHelpfulIncorrect - 0
  • notHelpfulSourcesMissingOrUnreliable - 0
  • notHelpfulOpinionSpeculationOrBias - 0
  • notHelpfulMissingKeyPoints - 0
  • notHelpfulOutdated - 0
  • notHelpfulHardToUnderstand - 1
  • notHelpfulArgumentativeOrBiased - 0
  • notHelpfulOffTopic - 0
  • notHelpfulSpamHarassmentOrAbuse - 0
  • notHelpfulIrrelevantSources - 0
  • notHelpfulOpinionSpeculation - 0
  • notHelpfulNoteNotNeeded - 1
  • ratingsId - 1748100277331271705005BA1AB3AB5DD562D851FDF2EB6C89C3F5EF0693295898B1F07A3DDC0ACF388
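The fields above are internally consistent: the createdAtMillis value is the epoch-millisecond form of the UTC timestamp shown at the top of the record, and the ratingsId appears to be the noteId concatenated with the raterParticipantId. A minimal sketch verifying both relationships from this record's values:

```python
from datetime import datetime, timezone

# Values copied from the record above.
note_id = "1748100277331271705"
rater_participant_id = "005BA1AB3AB5DD562D851FDF2EB6C89C3F5EF0693295898B1F07A3DDC0ACF388"
created_at_millis = 1705632965021
ratings_id = "1748100277331271705005BA1AB3AB5DD562D851FDF2EB6C89C3F5EF0693295898B1F07A3DDC0ACF388"

# Convert the epoch-millisecond timestamp to the displayed UTC time.
created_at = datetime.fromtimestamp(created_at_millis / 1000, tz=timezone.utc)
print(created_at.strftime("%Y-%m-%d %H:%M:%S UTC"))  # 2024-01-19 02:56:05 UTC

# ratingsId looks like noteId + raterParticipantId (an observation from
# this record, not documented structure).
print(ratings_id == note_id + rater_participant_id)  # True
```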