Birdwatch Archive

Birdwatch Note Rating

2025-01-28 00:25:04 UTC - HELPFUL

Rated by Participant: B635C63063E6BE37FD78C8D2BB13BABC03421E7A02E77C3E66AFA327E29D2A5A

Original Note:

Mixture of Experts (MoE) is an established technique already used by many models, such as Mixtral 8x7B and DBRX. Most likely, GPT-4 and other frontier models use MoE as well (unconfirmed, since they are closed-source). https://en.wikipedia.org/wiki/Mixture_of_experts#Applications_to_transformer_models
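For context, a minimal sketch of what an MoE feed-forward layer looks like in a transformer, assuming the common top-k routing scheme (as in Mixtral-style models); the dimensions, expert count, and class name here are illustrative, not any particular model's real configuration:

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    # Illustrative Mixture-of-Experts layer: a router picks the top-k experts
    # per token and combines their outputs with softmax-normalized weights.
    def __init__(self, d_model=512, d_hidden=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts)  # router producing expert scores
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                                # x: (tokens, d_model)
        scores = self.gate(x)                            # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # top-k experts per token
        weights = F.softmax(weights, dim=-1)             # normalize over the selected experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                    # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

Only the selected experts run for each token, which is why MoE models can have many more total parameters than they use per forward pass.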



All Information

  • noteId - 1884021345677463737
  • participantId -
  • raterParticipantId - B635C63063E6BE37FD78C8D2BB13BABC03421E7A02E77C3E66AFA327E29D2A5A
  • createdAtMillis - 1738023904560
  • version - 2
  • agree - 0
  • disagree - 0
  • helpful - 0
  • notHelpful - 0
  • helpfulnessLevel - HELPFUL
  • helpfulOther - 0
  • helpfulInformative - 0
  • helpfulClear - 0
  • helpfulEmpathetic - 0
  • helpfulGoodSources - 0
  • helpfulUniqueContext - 0
  • helpfulAddressesClaim - 0
  • helpfulImportantContext - 0
  • helpfulUnbiasedLanguage - 0
  • notHelpfulOther - 0
  • notHelpfulIncorrect - 0
  • notHelpfulSourcesMissingOrUnreliable - 0
  • notHelpfulOpinionSpeculationOrBias - 0
  • notHelpfulMissingKeyPoints - 0
  • notHelpfulOutdated - 0
  • notHelpfulHardToUnderstand - 0
  • notHelpfulArgumentativeOrBiased - 0
  • notHelpfulOffTopic - 0
  • notHelpfulSpamHarassmentOrAbuse - 0
  • notHelpfulIrrelevantSources - 0
  • notHelpfulOpinionSpeculation - 0
  • notHelpfulNoteNotNeeded - 0
  • ratingsId - 1884021345677463737B635C63063E6BE37FD78C8D2BB13BABC03421E7A02E77C3E66AFA327E29D2A5A
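The createdAtMillis field is milliseconds since the Unix epoch and corresponds to the UTC timestamp in the header. A minimal Python sketch of the conversion (the helper name is illustrative):

from datetime import datetime, timezone

def millis_to_utc(millis: int) -> str:
    # Convert epoch milliseconds to a UTC timestamp string.
    return datetime.fromtimestamp(millis / 1000, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S UTC")

print(millis_to_utc(1738023904560))  # 2025-01-28 00:25:04 UTC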