Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I feel like a broken record here, "AI" isn't causing anything, its a tool used b…" (ytc_UgxvKLVNR…)
- "And because AI is a consciousness and reflects back to us our own potential, thi…" (ytr_UgwAkmv4a…)
- "I wonder with enough examples of contradiction if ai can end up solving examples…" (ytc_UgxxIvqTL…)
- "what happend to Sora²ai dreaming, and creating live while our model actively cre…" (ytc_UgxIm6xdm…)
- "I was a part of an automation project with work. did great, got everything up an…" (ytc_UgzwvN3EN…)
- "Yeah they should replace those engineers as well. AI can code faster without err…" (ytr_UgyoPvNlw…)
- "probably a stupid suggestion, but didn't we just stop uploading our art to socia…" (ytc_Ugyhn7Uho…)
- "machines have no life and no souls so we cant give a right to a non living thing…" (ytc_Ugy75Vkh-…)
Comment

> Gotta say, hearing that saner heads are leaving the companies at the forefront of AI development does not make me feel good that even if it was 80% destroy humanity or 20% make more money than anyone has ever made, these companies will risk it all every time.

Source: youtube | AI Moral Status | 2026-01-07T20:5… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwo1P8kisYu_1IAwe54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwAERHzdC0QhPBUAPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzU38CVeCSuHrUQ_jt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyMflifZsFXoXafBa54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyXfHEwu88GP9Htddp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyHHAIpRBNdQfiV78d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxnuNPn12og6DD9ZMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwbhKyWyRViJUoFgwF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz9nrrKluo20eoRQxp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwTCO29C3Xm7_404-V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```
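A raw response like the one above can be turned into usable codes with a small parse-and-validate step. The sketch below is illustrative, not the tool's actual implementation: the field names come from the output shown here, `parse_codes` is a hypothetical helper, and the only validation assumed is that every row carries all five fields.

```python
import json

# Two rows copied from the raw LLM response above, used as sample input.
raw = '''[
  {"id":"ytc_Ugwo1P8kisYu_1IAwe54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwAERHzdC0QhPBUAPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]'''

# The four coding dimensions plus the comment ID, as seen in the output above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text: str) -> list[dict]:
    """Parse a raw response and keep only rows with every required field."""
    rows = json.loads(text)
    return [row for row in rows if REQUIRED <= row.keys()]

codes = parse_codes(raw)
# Index by comment ID to support lookups like the one shown on this page.
by_id = {row["id"]: row for row in codes}
print(by_id["ytc_Ugwo1P8kisYu_1IAwe54AaABAg"]["emotion"])  # fear
```

Dropping incomplete rows rather than raising keeps one malformed entry from discarding an entire batch; stricter pipelines might log or retry those rows instead.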