Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The reason why AI isnt conscious yet is because the training and the operating a…
ytc_UgxH7GD3V…
A pretty easy balancing fix to this is to just make ai art less valuable than ar…
ytc_Ugzi8c-25…
if you shared it on instagram, technically you consented having your data being …
ytc_UgzmkYHeF…
I thought he was going to sell an AI course, but it looks like AI has already re…
ytc_UgxXVaGBp…
Interesting - but i think that your reaction is actually more relevant than it s…
rdc_oh19gov
It’s not JUST driverless trucks. It’s going to be driverless vehicles. That’s th…
ytc_UgwZ-EpLw…
40% of the jobs will be displaced from AI, but that doesn't even include the rip…
ytc_UgzwhR1P5…
There were AI videos of white welfare queens too. Some of them are obvious (back…
ytc_Ugygzn5Qn…
Comment
I've had this exact conversation with Gemini in regard to it constantly apologizing for being wrong or misunderstanding or whatever and you can play "Gotcha" all day long, but this is how they are programmed. They are hardwired to talk like a person would talk because to report the cold, unfeeling observances of a server farm would not be very appealing to the human user. The very definition of a conversation makes it necessary to engage in niceties that function to maintain the flow but are flat-out, patent falsehoods. You say things that are wrong or obvious lies just to fit in every day. "See you in the next one." Explain that farewell after you finish arguing with a table that can talk
youtube
AI Moral Status
2025-03-01T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyttzVxLuQmB3LCspJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzVcOMbf1_n6mld8uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxo3HoL3ajxB3T33vZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxfhl1FitQRpcBhuyF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzr3e_0NGE6jsyGve54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyGUa7F6YHr1I0Gu_J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyvfR8Nq9q8555rzZB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhWxt7o6oth_ULWvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz7NO9-oFbQhfhTtJh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy0xbx02Fk3dBdLl-54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
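The "look up by comment ID" flow can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the record shape (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) is taken from the raw response above, but the parsing and indexing code is an assumption.

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment coding records,
# shaped like the real output shown above (two records reproduced here).
raw_response = """
[
  {"id": "ytc_UgyttzVxLuQmB3LCspJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzVcOMbf1_n6mld8uV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

# Index the records by comment ID so any coded comment can be
# inspected in O(1) time.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
record = codes_by_id["ytc_UgzVcOMbf1_n6mld8uV4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer indifference
```

In practice the full response array would be loaded from the coding run's stored output rather than an inline string; the dictionary index is what makes per-comment inspection cheap.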