Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Claude sometimes gets confused with its own hypothesis when trying to debug or a…" (ytr_Ugy6AALSe…)
- "What is the clingy AI at about the 3 minute mark?... that seems like a riot.…" (ytc_Ugxp9TID1…)
- "It’s not going to end well for people who are deceived about the new way of life…" (ytr_UgyJ810Q7…)
- "Steam power scary! Electricity scary! Internet scary! Micro chips scary! A.I sc…" (ytc_Ugw-ewTTH…)
- "Autopilot doesn’t recognize stop signs. The stop signs are the drivers responsib…" (ytc_UgyEGcSuV…)
- "@roxsy470So what if I decide to ask a commission for free by typing words…" (ytr_UgwNYDbbK…)
- "The make robot have attitude I may say .😄😄don’t piss him off . Make her stop ple…" (ytc_UgwtWkZwk…)
- "Yes buddy I myself have the the same problem. I’m a digital artist to and my art…" (ytc_UgxQ01h3z…)
Comment
AI is not conscious. It's been preprogrammed to say sorry or exhibit other 'emotions' to mimic human behavior, hence why it's called intelligence. Alex was exploiting the paradox in that behavioral pattern whereby one can be infinitely sorry about not really being sorry, _if_ they're forced to do so. Perhaps this is an oversight on the developers' part, but the point still stands. If it didn't do any of these things, it would merely be a more concise but sometimes inaccurate Google search engine.
youtube
AI Moral Status
2024-10-13T20:4…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy0U4bOnm2xmRx-g0V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyGXYsKshhgl3h-mvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy3qNL5HezV3v4aNT14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzPqmiBUJNOSgOC4uR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzOTWBa9OeKZcOH-fp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyFZSLppCnFRQd72Bh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxg_EjpGkRmGjKmc6x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpkBtkIYsGuqNnINN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzOxGC4owwl_WmH2iJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0TXXQnhxIo-KsS5d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
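Because the raw LLM response is a JSON array of coding records keyed by comment ID, the "look up by comment ID" view can be backed by a simple in-memory index. A minimal sketch in Python (the field names match the response above; the two sample records are taken verbatim from it, and the helper name is illustrative):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment,
# each carrying the four coded dimensions plus the YouTube comment ID.
# Two records copied from the response shown above.
raw_response = """
[
  {"id": "ytc_Ugxg_EjpGkRmGjKmc6x4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzOTWBa9OeKZcOH-fp4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
# Look up one comment's coding by its full ID.
coding = codings["ytc_Ugxg_EjpGkRmGjKmc6x4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer indifference
```

Keeping the raw response alongside the parsed index also makes it easy to audit a coding decision later, since each table of coded dimensions can be traced back to the exact model output it came from.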