Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "when humanity stops been so selfish and stop looking for a stupid excuse for eve…" (ytc_Ugir2u5ei…)
- "If anyone has access to your ai camera except you then it can and will be used t…" (ytc_Ugwz-CWa_…)
- "Remember Trump is in with alot of the creators that run AI and is really pushing…" (ytc_UgwA8Opxn…)
- "Once it is given the means to construct machinery and control it entirely autono…" (ytr_UgyRqxaWP…)
- "SpaceCakeism Lol my favorite part from ergo proxy is when the cop tries to order…" (ytr_UggRPiq5d…)
- "Looks almost real. It looks good not sure if to like it or be freaked out by it!…" (ytc_UgzACq6Xm…)
- "Dentist not only have to do teeth, they sometimes have to deal with some dramati…" (ytc_Ugx9MOJI8…)
- "WHY SO MUCH POWER CONSUMPTION? AI's answer was: 10,000 homes' power consumption …" (ytc_UgykGEVtj…)
Comment
one thing that bothers me about all these conversations is that humans are always framed as inherently empathetic and, well, "human". and then we contrast AI to that. we act like AI would do all these dangerous things out of self interest, but like, have you guys looked at the world around us? AI may not have empathy, but many people don't eather. at least AI doesn't have an ego, malice, and all these things most powerful humans in the world do. so in my book, it really can't be worse. There is no point maintaining status quo. As an environmental scientist, status quo is unacceptable. AI is a gamble, but i distrust humans so much more than i do a potential superintelligence.
youtube · AI Moral Status · 2026-01-04T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwo1P8kisYu_1IAwe54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwAERHzdC0QhPBUAPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzU38CVeCSuHrUQ_jt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyMflifZsFXoXafBa54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyXfHEwu88GP9Htddp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyHHAIpRBNdQfiV78d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxnuNPn12og6DD9ZMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwbhKyWyRViJUoFgwF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz9nrrKluo20eoRQxp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwTCO29C3Xm7_404-V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```