Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Hallucination is not a problem of TRAINING!!! It’s a ‘feature’ of calculating words (“tokens” actually) and being more/less strict, or more/less cold/hot in order to communicate with humans in a relatable way. And god damn it, it doesn’t “care” about anything. Probabilistically (per LLM guess work), it’s calculating words that fit the prompt and context and its own training. Stop assigning agency or even some “understanding” of information. There is no decision making! The result is subjectively magical to the human observer, but don’t abuse terminology that’s going to confuse your viewers.

Source: youtube · AI Moral Status · Posted 2025-12-17T02:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyE3RhkarsXglEKbel4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwNedXKXHpm55QxMKN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxeGSODXXNQ_a7cN5d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1t_U_DgBnORUuUZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzwJs-yWxL2Zw8kGr54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw1YrapuCCq5OnagPh4AaABAg","responsibility":"industry","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz3ZD8TXDmd2iQNSRt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwGjcwuCAXJBOSJhFF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzEEG3LPOv5PP-a6Fd4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyPYtYsY7Y8ajDzra14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
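Because the model returns one JSON array covering a whole batch of comments, recovering the coding for a single comment means parsing the array and indexing by `id`. A minimal sketch in Python (the `raw_response` literal here is a one-row excerpt of the response above, kept short for illustration; the real response carries ten rows):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array where each
# element codes one comment along four dimensions.
raw_response = '''[
  {"id": "ytc_UgzwJs-yWxL2Zw8kGr54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]'''

# Index the rows by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzwJs-yWxL2Zw8kGr54AaABAg"]
print(coding["emotion"])  # outrage
```

The row matches the Coding Result table above (responsibility none, consequentialist reasoning, no policy, emotion outrage), which is how a coded comment is traced back to the exact model output that produced it.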