Birdwatch Archive

Birdwatch Note Rating

2025-01-28 16:13:30 UTC - NOT_HELPFUL

Rated by Participant: 0E3957C974EB4FEC4E7CB7C3B16A31AD6EC8ECDA54BDBA747D1A3FB742F8F78B

Original Note:

Mixture of Experts is an existing technique used by many models, such as Mixtral 8x7B and DBRX. Most likely, GPT-4 and other frontier models use MoE as well (unconfirmed, since they are closed-source). https://en.wikipedia.org/wiki/Mixture_of_experts#Applications_to_transformer_models
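
As background on the technique the note references, here is a minimal sketch of a Mixture-of-Experts feed-forward layer with top-1 softmax routing. All dimensions, weights, and the routing rule are illustrative assumptions; they are not taken from Mixtral 8x7B, DBRX, or GPT-4.

    # Minimal Mixture-of-Experts (MoE) feed-forward layer sketch with top-1 routing.
    # All sizes and the gating rule are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    d_model, d_hidden, n_experts = 8, 16, 4

    # Each "expert" is an independent two-layer feed-forward network.
    experts = [
        (rng.standard_normal((d_model, d_hidden)) * 0.1,
         rng.standard_normal((d_hidden, d_model)) * 0.1)
        for _ in range(n_experts)
    ]

    # The router scores each token against every expert.
    router = rng.standard_normal((d_model, n_experts)) * 0.1

    def softmax(x):
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def moe_layer(tokens):
        """Route each token to its single highest-scoring expert (top-1 gating)."""
        gate_probs = softmax(tokens @ router)      # (n_tokens, n_experts)
        chosen = gate_probs.argmax(axis=-1)        # expert index per token
        out = np.zeros_like(tokens)
        for i, tok in enumerate(tokens):
            w1, w2 = experts[chosen[i]]
            hidden = np.maximum(tok @ w1, 0.0)     # ReLU feed-forward
            # Scale the expert output by its gate probability.
            out[i] = (hidden @ w2) * gate_probs[i, chosen[i]]
        return out

    tokens = rng.standard_normal((5, d_model))
    print(moe_layer(tokens).shape)                 # (5, 8)

The key idea illustrated is that only the selected expert's feed-forward weights are applied to each token, which is what lets MoE models grow total parameter count without a matching growth in per-token compute.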

All Information

  • noteId - 1884021345677463737
  • participantId -
  • raterParticipantId - 0E3957C974EB4FEC4E7CB7C3B16A31AD6EC8ECDA54BDBA747D1A3FB742F8F78B
  • createdAtMillis - 1738080810880
  • version - 2
  • agree - 0
  • disagree - 0
  • helpful - 0
  • notHelpful - 0
  • helpfulnessLevel - NOT_HELPFUL
  • helpfulOther - 0
  • helpfulInformative - 0
  • helpfulClear - 0
  • helpfulEmpathetic - 0
  • helpfulGoodSources - 0
  • helpfulUniqueContext - 0
  • helpfulAddressesClaim - 0
  • helpfulImportantContext - 0
  • helpfulUnbiasedLanguage - 0
  • notHelpfulOther - 0
  • notHelpfulIncorrect - 0
  • notHelpfulSourcesMissingOrUnreliable - 0
  • notHelpfulOpinionSpeculationOrBias - 0
  • notHelpfulMissingKeyPoints - 0
  • notHelpfulOutdated - 0
  • notHelpfulHardToUnderstand - 0
  • notHelpfulArgumentativeOrBiased - 0
  • notHelpfulOffTopic - 0
  • notHelpfulSpamHarassmentOrAbuse - 0
  • notHelpfulIrrelevantSources - 0
  • notHelpfulOpinionSpeculation - 0
  • notHelpfulNoteNotNeeded - 1
  • ratingsId - 18840213456774637370E3957C974EB4FEC4E7CB7C3B16A31AD6EC8ECDA54BDBA747D1A3FB742F8F78B
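
For readers working with the fields above, here is a minimal sketch of how such a rating record might be held in code. The class and property names are assumptions, and the construction of ratingsId as noteId followed by raterParticipantId is inferred from this record's values, not from any published schema.

    # Sketch of a container for the rating fields listed above. Names and the
    # ratings_id construction are assumptions inferred from this record only.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class NoteRating:
        note_id: str
        rater_participant_id: str
        created_at_millis: int
        helpfulness_level: str  # e.g. "HELPFUL", "SOMEWHAT_HELPFUL", "NOT_HELPFUL"

        @property
        def ratings_id(self) -> str:
            # In this record, ratingsId appears to be noteId concatenated with
            # the rater's participant id; treated here as an assumption.
            return f"{self.note_id}{self.rater_participant_id}"

        @property
        def created_at(self) -> datetime:
            # createdAtMillis is a Unix timestamp in milliseconds.
            return datetime.fromtimestamp(self.created_at_millis / 1000, tz=timezone.utc)

    rating = NoteRating(
        note_id="1884021345677463737",
        rater_participant_id="0E3957C974EB4FEC4E7CB7C3B16A31AD6EC8ECDA54BDBA747D1A3FB742F8F78B",
        created_at_millis=1738080810880,
        helpfulness_level="NOT_HELPFUL",
    )
    print(rating.created_at)   # 2025-01-28 16:13:30.880000+00:00
    print(rating.ratings_id)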