Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugzahnbyj…: "A bunch of AI bossing around humans sounds very concerning. And how are emotionl…"
- ytc_Ugwh2AzxB…: "I think the only way would be to "hack" the reality... which AI with iq like 400…"
- ytc_UgxAR_m4e…: "@exurb1a glad you mentioned the poss of an AI hiding its sentience; seemd like a…"
- ytc_UgzCzGiB9…: "Show Me How: Really awesome deep dive. I was thrilled so many historical example…"
- ytc_Ugypro70l…: "Dear AI, please extrapolate this out as far as you can: What will you do, what …"
- ytr_UgzhD6PRg…: "but people will prefer a human actor n singer.. not robot.. though a. i can ha…"
- ytc_UgzRkkOVk…: "vfx is correct, Ai take on moving the camera close to your face to see the react…"
- ytc_UgxfH44je…: "Yeah the second copilot needs to do something more complex than boiler plate or …"
Comment
The same hand-wringing over the potential hazards of AI aren't so removed from other major advances of technology, from the invention of fire, to the automobile. Imagine life without technology, medicine, tools, etc. There will definitely come a time in the near future where it will be highly illegal to be caught using ANY AI not directly approved by the powers that be, you can count on that!!
youtube | AI Moral Status | 2026-01-30T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
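A coded row like the one above can be sanity-checked by validating each dimension against an allowed-value set. Here is a minimal sketch in Python; note the category sets below are an assumption reconstructed only from the values visible in this export, and the real codebook may define additional categories:

```python
# Allowed values per coding dimension, as observed in this export only
# (assumption: the actual codebook may include more categories).
CATEGORIES = {
    "responsibility": {"none", "developer", "ai_itself", "government", "company"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "fear"},
}

def validate_coding(row: dict) -> list[str]:
    """Return the dimensions whose value is missing or not in the allowed set."""
    return [dim for dim, allowed in CATEGORIES.items()
            if row.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
row = {"responsibility": "government", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "fear"}
print(validate_coding(row))  # [] means every dimension is valid
```

An empty list means the row conforms; any returned names point at dimensions the model coded with an out-of-vocabulary value.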
Raw LLM Response
```json
[
{"id":"ytc_UgwGKcm2ybSTRfn718d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5FHO6CmReYCS29QF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyETgHr4DWhzwEaahl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMqd62o-XcHw9j-1l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxuwgtghtG9vOV2tgF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDB6WluvGPywRSqsV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqUokl3OZiIq_FY8x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCMN4wsZtAEZy1Jep4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaoMYJegby_khXa694AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMZ7UKWfAWyHSogbF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
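Since the raw response is a JSON array of per-comment codings, looking up a coded comment by ID reduces to parsing the array and indexing it. A minimal sketch, assuming the exact array format shown above (the two rows below are copied from that output):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, in the format shown above.
raw_response = '''
[
{"id":"ytc_UgxaoMYJegby_khXa694AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMZ7UKWfAWyHSogbF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)

# Look up the comment selected above; this reproduces the Coding Result table.
coding = codings["ytc_UgxaoMYJegby_khXa694AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

A production version would also need to handle model output that is not valid JSON (e.g. catch `json.JSONDecodeError` and re-prompt), which this sketch omits.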