Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytr_UgwBkc46x…` — "@group555_ you are making a fair point, although I think it is still considered …"
- `ytc_UgzMRaMRr…` — "Man y’all don’t wanna see my AI chats on Talkie, Chai, And Crushon ai 🫣…"
- `ytc_UgwerhOa1…` — "Plans of how to destroy the AI / Weakness point: wifi / informations / Reminder: …"
- `ytc_UgggkMex7…` — "7:30 why would it be impossible to machines to have an 'actual understanding'? J…"
- `ytc_UgzReNRTI…` — "I find you guys comical because 99% of you LIVE on your CELL PHONES and SOCIAL M…"
- `ytc_UgwHwBBMS…` — "As an artist (this is my alt account, but hi, Maya Lee Walker here.) I find it c…"
- `ytc_UgyVj9eEl…` — "It took him long enough to understand the game theoretics. Yeah, we ain't stopp…"
- `ytr_UgwToCUn2…` — "Thanks for sharing your experience! There's no market incentive for the creators…"
Comment
11:25 if there was an asteroid heading towards earth that scientists concluded had a 10% chance of causing human extinction, every government would come together and spend billions to find a way to stop it. With AI, from the experts you mentioned, 10% is the lowest estimate for human extinction if we continue expanding AI's capabilities, and yet we're racing towards further AI implementation. Mindblowing.
youtube · AI Harm Incident · 2025-10-11T14:2… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgycFw_oAxw08zNr_At4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzYwINnI0ifyRWky3x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyROVrCZ-ErtdNYKDN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyHUT41mN1LJ9CFpsp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwPCO3zGy3qHfVTNAF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwy9uuXIiUrnInDFeV4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxEGdjP86i09fEHxP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwzzVpUAC_-Xbqxyy14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyz-s2V97wQ2F9PkdR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwB2LcUjb_Adqbch-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
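The raw response above is a JSON array of per-comment codings along four dimensions. A minimal sketch of parsing and validating such a response might look like the following; the allowed value sets are inferred from the samples shown here, not from the tool's authoritative codebook, and the `parse_codings` helper is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (an assumption, not the tool's official codebook).
ALLOWED = {
    "responsibility": {"developer", "government", "company", "distributed",
                       "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        # Comment IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not cid.startswith(("ytc_", "ytr_")):
            continue
        # Keep a row only if every dimension has a recognized value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_codings(raw)["ytc_example"]["policy"])  # regulate
```

Rows with unrecognized dimension values are silently dropped here; in practice one would likely log them for re-prompting rather than discard them.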