Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or click one of the random samples below.
They werent replacing humans to replace them - they put too much money into thes…
ytc_Ugw_rWD0Y…
Why does most these mf like support or are part of dogshit fandoms and other stu…
ytc_UgzGBob4C…
People arent so dumb that they wouldnt understand that the highest unemployment …
ytc_Ugz2bJK9i…
Maybe we could get AI Doctors and Dentists, because we have zero in England.
How…
ytc_UgwW2lHym…
I think one of the most important things I learned in art school is the importan…
ytc_UgyELaJQP…
The weird thing is not only is this a shared human fear/irrational hope but it h…
ytc_UgwaNlsrX…
He asked the robot do you want to destroy humans. The robot did not answer. Yes …
ytc_UgzV2xHBc…
They aren't safe alone. They are safer while you maintain awareness and are read…
ytc_UgyKTiU2R…
Comment
Dude please stop speaking truth and then ruining it by saying something horrible. Neuralink is obviously a bad idea. You out of all people should know that Mr musk.
And then you really think that truth is going to come out on top with the AI situation? No it's not
Ultimately truth will come out and truth will win.
youtube
AI Governance
2025-09-27T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzJiqgPZpV-byZpud54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzY_gYv2nt-sHoslFx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQWKpnUTJYvDIVn3Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzkYpc3_kGnqSFkC7V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyOj00zK4SKEhLvbIR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw4_Km70gWvzquOvix4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxQ--jgQbRtrgmdk9J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzVxb82nCEF6hJMY0Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwvXVivgdFN-zaVp1x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugxl5F50hc4N-tXwmBV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
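The lookup-by-ID feature above can be sketched in a few lines: parse the raw LLM response (a JSON array of coded records), index it by comment ID, and return the four coded dimensions for any one comment. This is a minimal illustration, not the dashboard's actual implementation; the two records below are copied from the response shown above, and the `DIMENSIONS` tuple simply names the four columns of the Coding Result table.

```python
import json

# Two records copied from the raw LLM response above (the real
# response lists one record per sampled comment).
raw_response = """
[
  {"id": "ytc_UgyQWKpnUTJYvDIVn3Z4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzkYpc3_kGnqSFkC7V4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_by_id(raw: str) -> dict:
    """Parse the raw model output and index coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}


def lookup(index: dict, comment_id: str) -> dict:
    """Return just the coded dimensions for one comment."""
    rec = index[comment_id]
    return {dim: rec[dim] for dim in DIMENSIONS}


index = index_by_id(raw_response)
print(lookup(index, "ytc_UgyQWKpnUTJYvDIVn3Z4AaABAg"))
# → {'responsibility': 'developer', 'reasoning': 'deontological',
#    'policy': 'liability', 'emotion': 'outrage'}
```

A real tool would also want to validate each field against its closed code list (e.g. that `policy` is one of `regulate`, `liability`, `ban`, `industry_self`, `none`, `unclear`) before trusting the record, since LLM output is not guaranteed to stay on-schema.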