Birdwatch Note
2025-01-27 23:31:06 UTC - MISINFORMED_OR_POTENTIALLY_MISLEADING
Mixture of Experts is an existing technique used by many models, such as Mixtral 8x7B and DBRX. Most likely, GPT-4 and other frontier models use MoE as well (unconfirmed, since they are closed-source). https://en.wikipedia.org/wiki/Mixture_of_experts#Applications_to_transformer_models
Written by BAEF1BB423A2D2A700E5996AE6F922E77E993CC15ABFE71D702C70C840462A0D
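For background on the technique the note cites: a minimal, illustrative sketch of MoE routing in a transformer-style feed-forward layer (a learned router selects the top-k experts per token and mixes their outputs). All names, shapes, and values here are invented for the example, not taken from any actual model.

```python
# Illustrative Mixture-of-Experts (MoE) forward pass with top-k gating.
# Hypothetical dimensions; real models like Mixtral 8x7B use far larger ones.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is an independent feed-forward weight matrix.
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router                        # (tokens, n_experts) router scores
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        top = np.argsort(logits[t])[-top_k:]   # indices of the top-k experts
        weights = np.exp(logits[t][top])
        weights /= weights.sum()               # softmax over the chosen experts only
        for w, e in zip(weights, top):
            out[t] += w * (x[t] @ experts[e])  # weighted mix of expert outputs
    return out

tokens = rng.standard_normal((4, d_model))
y = moe_forward(tokens)
print(y.shape)
```

Only top_k of the n_experts run per token, which is why MoE models can have many more total parameters than they activate per forward pass.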
Original Tweet
Tweet embedding is no longer reliably available due to the platform's instability, in terms of both technology and policy. If the Tweet still exists, you can view it here: https://twitter.com/foo_bar/status/1883686173295673821
Note, though, that you may need your own Twitter account to access that page. I am currently exploring options for archiving Tweet data in a post-API context.
All Information
- ID - 1884021345677463737
- noteId - 1884021345677463737
- participantId -
- noteAuthorParticipantId - BAEF1BB423A2D2A700E5996AE6F922E77E993CC15ABFE71D702C70C840462A0D
- createdAtMillis - 1738020666891
- tweetId - 1883686173295673821
- classification - MISINFORMED_OR_POTENTIALLY_MISLEADING
- believable -
- harmful -
- validationDifficulty -
- misleadingOther - 0
- misleadingFactualError - 1
- misleadingManipulatedMedia - 0
- misleadingOutdatedInformation - 0
- misleadingMissingImportantContext - 0
- misleadingUnverifiedClaimAsFact - 0
- misleadingSatire - 0
- notMisleadingOther - 0
- notMisleadingFactuallyCorrect - 0
- notMisleadingOutdatedButNotWhenWritten - 0
- notMisleadingClearlySatire - 0
- notMisleadingPersonalOpinion - 0
- trustworthySources - 1
- summary
- Mixture of Experts is an existing technique used by many models, such as Mixtral 8x7B and DBRX. Most likely, GPT-4 and other frontier models use MoE as well (unconfirmed, since they are closed-source). https://en.wikipedia.org/wiki/Mixture_of_experts#Applications_to_transformer_models
Note Ratings
rated at
2025-01-28 16:48:47 -0600
2025-01-28 16:27:30 -0600
2025-01-28 10:25:05 -0600
2025-01-28 10:13:30 -0600
2025-01-28 09:36:35 -0600
2025-01-28 03:05:11 -0600
2025-01-28 01:38:23 -0600
2025-01-27 22:41:19 -0600
2025-01-27 21:57:47 -0600
2025-01-27 21:30:30 -0600
2025-01-27 21:28:50 -0600
2025-01-27 20:16:28 -0600
2025-01-27 19:32:51 -0600
2025-01-27 19:32:36 -0600
2025-01-27 19:22:34 -0600
2025-01-27 19:15:37 -0600
2025-01-27 18:40:17 -0600
2025-01-27 18:28:27 -0600
2025-01-27 18:25:04 -0600
2025-01-27 18:17:21 -0600
2025-01-27 18:11:48 -0600
2025-01-27 17:51:37 -0600
2025-01-28 22:19:31 -0600
2025-01-28 15:35:12 -0600
2025-01-28 08:08:02 -0600
2025-01-28 06:54:24 -0600
2025-01-27 20:13:55 -0600
2025-01-28 15:46:47 -0600
2025-01-28 10:24:00 -0600
2025-01-28 00:55:05 -0600
2025-01-28 00:03:55 -0600
2025-01-27 23:06:52 -0600
2025-01-27 22:04:26 -0600
2025-01-27 21:51:23 -0600
2025-01-27 21:25:09 -0600
2025-01-27 19:17:51 -0600
2025-01-27 19:08:31 -0600
2025-01-27 18:57:20 -0600
2025-01-27 18:35:08 -0600
2025-01-27 18:34:15 -0600
2025-01-27 17:47:05 -0600