Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- This morning, I came up with a mobile app idea. I told ChatGPT about it and aske… (rdc_jhdn0ce)
- He’s like a dude who’d like to be a great artist, but he absolutely has no style… (ytc_Ugy1Jtf2a…)
- Brandon, I hope you don't mind if I say... I'm an atheist, and yes, sometimes yo… (ytc_UgyEOwHCk…)
- "What models have you trained?" At best he has copied and pasted from the docs a… (ytc_Ugyx-YyA0…)
- Human coders will be and already are being replaced by AI coders. Then AI will b… (ytr_UgzUYknAR…)
- I personally feel that AI is just a powerful tool, superb tool, that can help us… (ytc_Ugz_b_UO9…)
- why is everyone crapping it - when we have land - we can grow food and we can sh… (ytc_Ugy1BorjT…)
- I think AI is like the invention of Dynamite. Groundbreaking and opening a new e… (ytc_UgxOJO6da…)
Comment
In the future Ai robots could be a potential danger to humans, for example they could have a better memory and think much faster than us and seek to control us. One way i can think to offset that risk is to have safe gards in place for example compulsory lines of code that render the robot harmless. and also an easly accessable manual off switch.
Source: youtube · AI Harm Incident · 2025-05-20T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzRnRlwN4i5yPlzX794AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkDSFoy95kv_wHn0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzCvn1957KhiPf9LBp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyLQ7XS1ZhsximM2Jd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwB3cyOpxCstA-IwJZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxx2uDpzzC_zMgBKeh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz7MeCsTaETmzW7bjp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyFp_QdW0iQ8IMgZb14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_GTVng8rT7UhavsZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzTUUaAuId2PGbVvRx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
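A minimal sketch of how a raw batch response like the one above might be parsed and indexed by comment ID. The allowed vocabularies here are only inferred from the values visible in this sample; the real coding scheme may include categories not shown:

```python
import json

# Dimension vocabularies inferred from the sample responses above
# (assumption -- the actual scheme may contain additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID.

    Raises ValueError on a record with a missing ID or an
    out-of-vocabulary dimension value.
    """
    by_id = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Example with the first record from the sample above.
raw = ('[{"id":"ytc_UgzRnRlwN4i5yPlzX794AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgzRnRlwN4i5yPlzX794AaABAg"]["policy"])  # → regulate
```

Validating against a fixed vocabulary at ingestion time catches the common failure mode of the model inventing a new label mid-batch, rather than letting it surface later as a silent miscount.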