Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "Very dangerous when robots get as smart or smarter than humans it’s just like th…" (ytc_UgwHwTX7v…)
- "They should have thought about all this beforehand; what's the use afterwards? First the nuclear bomb and …" (ytc_UgzE3hqqr…)
- "Elon Musk is an inhuman monster, scumbag who needs to be stopped and punished. …" (ytc_UgwxIp4cT…)
- "Does AI take over being senators and presidents, as they can make the best decis…" (ytc_Ugz3JfGAF…)
- "ai is not doing all the programming, and likely wont be. its doing some of the p…" (ytr_UgzrjVz6K…)
- "Elon Musk is high as a kite most of the time. So him being silent for 12 secs do…" (ytc_Ugwe3kfPU…)
- "That's the thing, Ai doesn't have a point of view. It's just a prompt machine.…" (ytc_Ugx2lwTuv…)
- "If idiots like ph work in government, it might be best if AI took over the think…" (ytc_Ugw3FrqwB…)
Comment

> Thankfully the first people to loose their jobs will be these traitorous TV reporters... And it couldn't happen to a nicer group of people... Look we could easily stop Ai ...but we won't... Instead... An intelligence way superior to our own will arrive soon... And it will decide our future... Lets all pray its nice to us

youtube · AI Governance · 2025-08-26T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxbCZ5rUTCvXJUPTfF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxs0gcyVzKWivoGr3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwvNb-xFRs9R4Gl2lh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-kJul4C6dqZU_N4x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwDHNq1O45zRNEX_-R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwX1Fi1YP-B5ztpcBF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyRc39gocaHOW9BBLR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwE7tqX7O3pGsv8AIx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzkIJMtudxinVq8xNF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjDWS35FZT7Ody6tp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
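The raw response is a JSON array, one record per comment, with four coding dimensions plus the comment ID. A minimal sketch of how such a response could be parsed and indexed by ID — the vocabularies below are inferred from the values visible in the responses above, not from the actual codebook, so the real allowed sets may be larger:

```python
import json

# Assumed dimension vocabularies, inferred from the responses shown above;
# the actual codebook may permit additional values.
SCHEMA = {
    "responsibility": {"government", "company", "developer", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError for any record that is missing a dimension or uses
    a value outside the (assumed) vocabulary.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return by_id

# Look up one of the coded comments from the response above.
raw = ('[{"id":"ytc_Ugw-kJul4C6dqZU_N4x4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugw-kJul4C6dqZU_N4x4AaABAg"]["emotion"])  # fear
```

Validating against a fixed vocabulary at parse time catches the common failure mode where the model invents an off-codebook label, so bad records fail loudly instead of silently entering the dataset.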