Birdwatch Archive

Birdwatch Note Rating

2025-01-28 04:41:19 UTC - HELPFUL

Rated by Participant: FFB1DAF2C016CA5F1E4BDAF4F2129BCF67606D8BF677F4E7CD97A41B3196E93B

Original Note:

Mixture of Experts is an existing technique used by many models, such as Mixtral 8x7B and DBRX. Most likely, GPT-4 and other frontier models use MoE as well (unconfirmed, since they are closed-source). https://en.wikipedia.org/wiki/Mixture_of_experts#Applications_to_transformer_models
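
For readers unfamiliar with the technique the note refers to, below is a minimal toy sketch of sparse expert routing. All names, dimensions, and the top-2 routing choice are illustrative assumptions, not details of any specific model:

  import numpy as np

  # Toy illustration of the MoE idea: a router picks the top-k "expert"
  # feed-forward blocks per token and mixes their outputs, so only a
  # fraction of the parameters is active for any given token.
  rng = np.random.default_rng(0)
  d_model, n_experts, top_k = 16, 8, 2   # 8 experts / top-2 is only an illustrative choice

  experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
  router_w = rng.standard_normal((d_model, n_experts)) * 0.02

  def moe_layer(x):
      """x: (d_model,) token representation -> (d_model,) output."""
      logits = x @ router_w                      # one router score per expert
      top = np.argsort(logits)[-top_k:]          # indices of the top-k experts
      weights = np.exp(logits[top])
      weights /= weights.sum()                   # softmax over the selected experts only
      return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

  out = moe_layer(rng.standard_normal(d_model))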

All Information

  • noteId - 1884021345677463737
  • participantId -
  • raterParticipantId - FFB1DAF2C016CA5F1E4BDAF4F2129BCF67606D8BF677F4E7CD97A41B3196E93B
  • createdAtMillis - 1738039279049
  • version - 2
  • agree - 0
  • disagree - 0
  • helpful - 0
  • notHelpful - 0
  • helpfulnessLevel - HELPFUL
  • helpfulOther - 0
  • helpfulInformative - 0
  • helpfulClear - 1
  • helpfulEmpathetic - 0
  • helpfulGoodSources - 0
  • helpfulUniqueContext - 0
  • helpfulAddressesClaim - 1
  • helpfulImportantContext - 0
  • helpfulUnbiasedLanguage - 0
  • notHelpfulOther - 0
  • notHelpfulIncorrect - 0
  • notHelpfulSourcesMissingOrUnreliable - 0
  • notHelpfulOpinionSpeculationOrBias - 0
  • notHelpfulMissingKeyPoints - 0
  • notHelpfulOutdated - 0
  • notHelpfulHardToUnderstand - 0
  • notHelpfulArgumentativeOrBiased - 0
  • notHelpfulOffTopic - 0
  • notHelpfulSpamHarassmentOrAbuse - 0
  • notHelpfulIrrelevantSources - 0
  • notHelpfulOpinionSpeculation - 0
  • notHelpfulNoteNotNeeded - 0
  • ratingsId - 1884021345677463737FFB1DAF2C016CA5F1E4BDAF4F2129BCF67606D8BF677F4E7CD97A41B3196E93B
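
A minimal sketch of how the raw fields above map to the rendered values, assuming (as the record itself suggests) that createdAtMillis is a Unix epoch timestamp in milliseconds and that ratingsId is simply the noteId concatenated with the raterParticipantId:

  from datetime import datetime, timezone

  # Field values copied from the record above
  note_id = "1884021345677463737"
  rater_participant_id = "FFB1DAF2C016CA5F1E4BDAF4F2129BCF67606D8BF677F4E7CD97A41B3196E93B"
  created_at_millis = 1738039279049

  # Milliseconds since the Unix epoch -> the timestamp shown in the header
  created_at = datetime.fromtimestamp(created_at_millis / 1000, tz=timezone.utc)
  print(created_at.strftime("%Y-%m-%d %H:%M:%S UTC"))   # 2025-01-28 04:41:19 UTC

  # ratingsId appears to be noteId + raterParticipantId
  ratings_id = note_id + rater_participant_id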