Birdwatch Note
2024-11-28 19:18:14 UTC - MISINFORMED_OR_POTENTIALLY_MISLEADING
Fake. Generated with artificial intelligence. The author does not mention this in the post. AI voice cloning is an accessible but dangerous technique that must be used responsibly: https://www.xataka.com/basics/my-vocal-ai-como-clonar-tu-voz-inteligencia-artificial-usarla-para-decir-que-quieras The same technique is used in scams, as well as to create fake news: https://www.newtral.es/estafa-voz-familiar/20240702/
Written by D3F800C108074CD2E1BB505A6AD4F81F6A18AA091A66A0264C216CFB45F82F94
Original Tweet
Tweet embedding is no longer reliably available, due to the platform's instability in both technology and policy. If the Tweet still exists, you can view it here: https://twitter.com/foo_bar/status/1862077757267509492
Please note, though, that you may need to have your own Twitter account to access that page. I am currently exploring options for archiving Tweet data in a post-API context.
All Information
- ID - 1862214434040504814
- noteId - 1862214434040504814
- participantId -
- noteAuthorParticipantId - D3F800C108074CD2E1BB505A6AD4F81F6A18AA091A66A0264C216CFB45F82F94
- createdAtMillis - 1732821494004
- tweetId - 1862077757267509492
- classification - MISINFORMED_OR_POTENTIALLY_MISLEADING
- believable -
- harmful -
- validationDifficulty -
- misleadingOther - 0
- misleadingFactualError - 0
- misleadingManipulatedMedia - 1
- misleadingOutdatedInformation - 0
- misleadingMissingImportantContext - 0
- misleadingUnverifiedClaimAsFact - 0
- misleadingSatire - 1
- notMisleadingOther - 0
- notMisleadingFactuallyCorrect - 0
- notMisleadingOutdatedButNotWhenWritten - 0
- notMisleadingClearlySatire - 0
- notMisleadingPersonalOpinion - 0
- trustworthySources - 1
- summary
- Fake. Generated with artificial intelligence. The author does not mention this in the post. AI voice cloning is an accessible but dangerous technique that must be used responsibly: https://www.xataka.com/basics/my-vocal-ai-como-clonar-tu-voz-inteligencia-artificial-usarla-para-decir-que-quieras The same technique is used in scams, as well as to create fake news: https://www.newtral.es/estafa-voz-familiar/20240702/
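For reference, the createdAtMillis field above is a Unix epoch timestamp in milliseconds, and it corresponds to the UTC creation time shown in the note header. The following is a minimal sketch of that conversion, in Python as one reasonable choice; the field name and value come from the record above, and everything else is illustrative.

    from datetime import datetime, timezone

    # createdAtMillis from the record above: Unix epoch time in milliseconds.
    created_at_millis = 1732821494004

    # Convert milliseconds to seconds, then build a timezone-aware UTC datetime.
    created_at = datetime.fromtimestamp(created_at_millis / 1000, tz=timezone.utc)

    print(created_at.strftime("%Y-%m-%d %H:%M:%S UTC"))
    # 2024-11-28 19:18:14 UTC, matching the note's creation time shown above.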
Note Ratings
rated at
2024-11-29 21:59:25 -0600
2024-11-29 06:58:49 -0600
2024-11-29 03:26:51 -0600
2024-11-29 03:09:35 -0600
2024-11-28 23:41:59 -0600
2024-11-28 22:01:03 -0600
2024-11-28 18:51:48 -0600
2024-11-28 18:38:36 -0600
2024-11-28 18:03:11 -0600
2024-11-28 16:31:32 -0600
2024-11-28 15:45:22 -0600
2024-11-28 14:43:32 -0600
2024-11-28 14:16:51 -0600
2024-11-28 13:53:20 -0600
2024-11-28 13:50:37 -0600
2024-12-01 07:34:13 -0600
2024-11-29 02:00:05 -0600
2024-11-28 23:55:50 -0600
2024-11-28 17:56:15 -0600
2024-11-28 17:01:41 -0600
2024-11-28 15:01:24 -0600
2024-11-28 13:39:37 -0600
2024-11-28 13:20:11 -0600
2024-11-29 02:06:59 -0600
2024-11-28 18:38:39 -0600
2024-11-28 18:32:17 -0600
2024-11-28 13:46:24 -0600
2024-11-29 04:32:56 -0600
2024-11-29 03:10:42 -0600
2024-11-29 02:52:08 -0600
2024-11-29 02:47:39 -0600
2024-11-28 16:24:19 -0600
2024-11-28 15:56:00 -0600
2024-11-28 15:10:38 -0600
2024-11-28 14:28:30 -0600