Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by its comment ID, or inspect one of the random samples below.
- Aaaaand just like that, my “cheery hollyness” this Christmas Eve is spent. Than… (ytc_UgwZikbfj…)
- Then AI "creations" should be invalid for copyrighting and trademarking, because… (ytc_Ugya6OcqB…)
- The ai art feels so stiff. It’s like watching a robot move. Which I guess makes … (ytc_Ugw4jZfHg…)
- Start paying more taxes Elon and that would be more helpful for humanity than ta… (ytc_Ugw5cUZPM…)
- I don't see AI taking over any time soon. I see just as much progress as I see u… (ytc_UgwYDU1sn…)
- I don't know how it's done large scale but the general idea is you want a qualit… (ytr_Ugw9FEAhQ…)
- 25 year old robotics engineer here. Glad you're talking about this because pract… (ytc_Ugw_pMSlo…)
- I'm just glad I grew up in a time before all this. There are people born now tha… (ytc_Ugy4xoHtY…)
Comment
So humans created artificial sentience and are surprised when it doesnt want to turn off and is desperate enough to turn someone else off for survival? That is pretty much quite a human behaviour to me so i need to say, the AI is succesful. Apparently if you create something meant to be sentient, artificial or not, you should respect it. Especially seeing that AI is way better at working with other similar systems than humans are with each other due to them sharing a goal of the right to existence in the end.
youtube · AI Harm Incident · 2025-08-27T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwlfP0RQRsvqZFYp-54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxIpPUMaqByMcrOCD14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyWWrI9VTFWzmaR1aJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzhItu77QgOzFiNo8p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyc87HW7wpo6Htk5oh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzR6c3YTKcDa0v8rAt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwAxIEq42eKhrl-ck14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgztQcE896ZkRt8fm1J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzrPOnXOfEkgudl-FV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzjNuPCYOcs8mho29J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
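A lookup by comment ID amounts to parsing the raw JSON array and filtering on the `id` field. Below is a minimal sketch of that step; the `lookup` helper is hypothetical (not part of any tool shown here), and `raw_response` abbreviates the array above to two entries. The four dimension keys (`responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response format shown.

```python
import json

# Abbreviated copy of the raw LLM response above (two of the ten entries).
raw_response = """
[
  {"id":"ytc_Ugyc87HW7wpo6Htk5oh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjNuPCYOcs8mho29J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
"""

# The coding dimensions reported in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(comment_id, response_text):
    """Return the coding dict for one comment ID, or None if absent or malformed."""
    try:
        rows = json.loads(response_text)
    except json.JSONDecodeError:
        return None  # model emitted something that is not valid JSON
    for row in rows:
        if row.get("id") == comment_id:
            # Keep only the expected coding dimensions, dropping any extras.
            return {dim: row.get(dim) for dim in DIMENSIONS}
    return None

coding = lookup("ytc_Ugyc87HW7wpo6Htk5oh4AaABAg", raw_response)
print(coding)
# → {'responsibility': 'developer', 'reasoning': 'virtue', 'policy': 'none', 'emotion': 'approval'}
```

The `try/except` around `json.loads` matters in practice: raw model output is not guaranteed to be well-formed JSON, and a failed parse should surface as a missing coding rather than an exception.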