Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I'm gonna say this in the most respectful way possible: Fuck AI art and everythi…" (ytc_Ugx3K9iE6…)
- "Geoffrey Hinton had started so well but ended up with AI self awareness nonsense…" (ytc_UgwHUrOtd…)
- "we also have the choice to just not listen to this AI stuff. At the end of the d…" (ytr_UgyqOx2vP…)
- "I wonder how many of these pro-AI comments come from chatbots. It's like how E…" (ytr_UgwNNJNd-…)
- "Do not Trust AI 100%. Do some research and find multiple source. I got a friend …" (ytc_UgywfGhm6…)
- "What happened to the papers she was carrying? @ 7:09 Even with it's massive reso…" (ytc_UgwVHGVKk…)
- "AI needs to be stopped before it is too late same with robots and globalism…" (ytc_UgzNsC0Pc…)
- "get funding and cut jobs and then whenever u actually build better ai you keep c…" (ytc_UgxiaRBFi…)
Comment (youtube, 2025-11-29T23:3…)

> I would genuinely feel safer if every car on the road was an automated Tesla compared to actual human drivers in a city like Los Angeles. You should obviously have the option to take over control in any unplanned situation... but these cars are genuinely so much more reliable and intelligent than the basic human driver. (Plus, if the majority of cars were autodriving, then they would all be talking to each other, making collisions less and less likely when there's fewer and fewer human drivers doing random stupid things.)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzFpLDVB1cpNavKLz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvQTI-CllD4WMUzvJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXv79hkiNPKergW2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYLFpxg7CmAyvAWKV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5eBqpoaD5kber8OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVs97RCpw9RjWhBxx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwz-_EWCT_vfJoZRLt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwRsy_sMbzzxrEz61d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkipaSXGRIlwqdg6Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3J-dOjBT46x7JfNB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}]
```
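The lookup-by-ID step above can be sketched in a few lines: the raw model output is a JSON array of objects keyed by comment ID, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). This is a minimal sketch, not the tool's actual implementation; the function name `lookup_by_id` is hypothetical, and the abbreviated two-record `raw_response` stands in for the full array shown above.

```python
import json

# Abbreviated stand-in for the raw LLM response shown above:
# a JSON array of coded comments, one object per comment ID.
raw_response = """[
  {"id": "ytc_UgzFpLDVB1cpNavKLz94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx3J-dOjBT46x7JfNB4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

def lookup_by_id(raw: str, comment_id: str):
    """Parse the raw model output and return the coding dict for one
    comment ID, or None if that ID was not coded in this response."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_by_id(raw_response, "ytc_UgzFpLDVB1cpNavKLz94AaABAg")
print(coding["emotion"])  # → approval
```

Parsing the whole array once and filtering by `id` mirrors what the "Look up by comment ID" view does; for large batches a dict keyed by ID would avoid the linear scan.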