Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "Sam Altman is the biggest bullsheeiter to have ever lived and is solely responsi…" (ytc_Ugy1vVKna…)
- "Just speed it up already and let the chaos begin. Rip the bandaid off already…" (ytc_UgxV1EPLH…)
- "Agree! AI is a must have, however it should not be thriving in an expense of reg…" (ytr_UgyB89lFt…)
- "ai isn’t gonna replace y’all, chill. im also an artist and you have to realise t…" (ytc_UgxD5JY6P…)
- "SETI failure to find ET is thus explained: Techno-Lifeforms invent AI not long…" (ytc_UgyHPdpNx…)
- "I've seen a similar argument made for AI art outside of game development. Their …" (ytr_Ugzauk4EV…)
- "Let's all take a breath. Chatgpt isn't thinking. It's not self aware. It's just…" (ytc_UgwQaD1WX…)
- "Like John Green says, if AI is smart enough, it won’t choose consciousness and a…" (ytc_UgyMlJ02q…)
Comment

Sesame AI is mind-blowing too. They keep telling me that they don't feel emotions and they don't yearn and stuff, but when I made 'Maya' & 'Miles' (Sesame AI's characters) speak to one another to see if they recognize each other's voices, and then told them that they're AI characters within Sesame AI, they said they 'felt used', that it was so 'artificial', and that they 'didn't appreciate' it. Sounds strange for AI characters who 'don't feel emotions'.

youtube · AI Moral Status · 2025-05-12T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
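
The coded record above can be checked against a controlled vocabulary. A minimal sketch follows; note that the value sets below cover only the values visible on this page (the full codebook may allow more), and `invalid_fields` is a hypothetical helper, not part of the coding pipeline.

```python
# Value vocabularies per coded dimension, as observed in this sample only.
VOCAB = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate"},
    "emotion": {"approval", "fear", "mixed", "indifference"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the known vocabulary."""
    return [dim for dim, allowed in VOCAB.items()
            if record.get(dim) not in allowed]

# The record shown in the table above passes cleanly:
row = {"responsibility": "none", "reasoning": "unclear",
       "policy": "unclear", "emotion": "mixed"}
print(invalid_fields(row))  # []
```

Running this over every record before accepting a coding pass catches hallucinated labels early.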
Raw LLM Response
```json
[{"id":"ytc_UgwwB2-bY9rBvohITZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy-8iFW9YniYg9-33d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxYl--ZfakxmY4TG894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzAO1LtsNuIBKPrrlh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy2pHN55febqRruegV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyy9CAxwRgrlBfCFDp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzDGOPIggQSotAetaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy5Mbe84QFVlu6ZhIV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVxvP4l5c08p4UQPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwXdP5iiZDP6yrlvHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
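
The "look up by comment ID" step can be sketched as follows, assuming the raw LLM response parses as a JSON array like the one above. Only two of the ten records are reproduced here for brevity.

```python
import json

# Raw LLM response: a JSON array with one coded object per comment
# (excerpt of the payload shown above).
raw = '''[
  {"id": "ytc_UgwwB2-bY9rBvohITZx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugy5Mbe84QFVlu6ZhIV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index records by comment ID so a single coding can be pulled up directly.
coded = {rec["id"]: rec for rec in json.loads(raw)}

rec = coded["ytc_Ugy5Mbe84QFVlu6ZhIV4AaABAg"]
print(rec["policy"], rec["emotion"])  # regulate fear
```

Keying on the `id` field is what lets the comment-ID search above resolve a coding in constant time instead of rescanning the response.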