Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzcRT5C3…: "AI is overwhelming industries in stages. It is difficult to know which industrie…"
- ytr_Ugy3Vl-e7…: "They're like the hippies. Pay for their life, and they'll pirate your shit too i…"
- ytc_Ugwh7hp1w…: "I'm sorry but Ai will never find a way of being smarter than humans. We can't ev…"
- ytr_Ugxs0X0u9…: "Exactly. The fact that people are so tied to wanting to be a working slave they …"
- ytc_UgysSjP8K…: "My main thought is accountability. Even if it becomes as safe maybe even safer t…"
- ytc_Ugw6Jio8E…: "both Tesla and the driver are responsible. Driver should not have been pressing …"
- ytr_Ugyp_Hd1F…: "Was hoping id find a comment like this. Like damn I love shads historical vids a…"
- ytc_UgyhdiLPP…: "If one day Ultron comes up to your house & says hello , don't ask ChatGPT what t…"
Comment
Technical Reality
From a technical standpoint, the AI is simply following the user's prompt. Because the user explicitly told the AI to replace "yes" with "apple" in certain contexts, the AI is executing a command rather than expressing a secret internal conflict. Prompts like this are often used for entertainment, "creepypasta"-style content, or to fuel theories about AI sentience.
youtube · AI Moral Status · 2026-01-15T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzoJZUF8_5Y_w-M6R94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4DyplIdkfDTDX4mF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz942y5wHMIiuLZKfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzbBNY1USKMOYJ60I54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxtrn6ntinCq6HVNJh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzCqN2ezisKDG2DpOB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBLiKKdbZytQdEaCl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwG5OasaBSf6OIpI_V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxHVhHBL7F9vywl8hp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzeCnaN8hxlCxmT-kd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
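A raw response like the one above has to be parsed and checked before its codes can be stored, since a model can emit categories outside the codebook. The sketch below is a minimal validator, assuming the four dimensions shown in the table and restricting each to the values that actually appear in this page's examples; the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the coded examples above.
# The actual codebook may permit more categories (assumption).
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records with known codes."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Records that fail validation can then be queued for a re-coding pass rather than silently written to the results table.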