Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Nobody worry about a thing, I used an Ai to help me solve this mass unemployment…" (ytc_UgyoDglZN…)
- "1:03:33 no people dont repeatedly spitting the exact same wrong asnwer 1 billion…" (ytc_UgyI2vsmm…)
- "Google or fb had to shut down an ai a while back because the ai created a new la…" (ytc_Ugw5GAI7B…)
- "I think you might be misunderstanding the Luddite remarks. It's worth rememberin…" (ytc_UgxieyF0t…)
- "What if you only wanna use ChatGPT with verbal commands? No typing. Is that po…" (ytc_Ugx-U5SnG…)
- "But one thing is for sure, if there is going to be a robot in my house, it won't…" (ytc_UgzZ9P4Pn…)
- "Garbage in Garbage out ai is man made does not think for itself. It can only A…" (ytc_UgyRGUK5k…)
- "@Corsafire I might have been wrong to think human rights can be compatible with …" (ytr_UgyaC67Wk…)
Comment

> All it needs to know is humans destroy and suck in more resources than they create. The lowest use of energy is the preferred state, right? We consume resources to have the power to just slaughter esch other. To an AI lacking the nuance of consciousness and emotion, that is illogical and therefore pointless.

youtube · AI Responsibility · 2026-02-11T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwvEZzqiXOq7FR-N1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQvRUu4k6KfQ3iVft4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz9VrOJZoLbvDV_hnx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzePVXQ4a_IPaf2Oal4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxWEcR_3Pd4hbajQeR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw38Km7QMPJ6f0yz354AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyyczMS5D2XoDuQ-zt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxbVEQv_YVlPZt928t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzSbfKrwHIKo8xGGU14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTJZ1WDufUh2KXe2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
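A raw batch like the one above can be checked before its records are merged into the coding table. Below is a minimal validation sketch in Python; the four dimension names come from the response itself, but the allowed value sets are only inferred from the samples shown here and may be incomplete relative to the actual codebook:

```python
import json

# Allowed values inferred from the sample responses above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "developer",
                       "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this tool are prefixed ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Example: validate a one-record batch.
records = validate_batch(
    '[{"id":"ytc_UgwvEZzqiXOq7FR-N1l4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
print(len(records))  # prints 1
```

A record that fails any check raises immediately, so a partially malformed model response never reaches the coded-results table.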