Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgzRkDaSJ…: There is no way AI should be allowed to teach our children! It’s a propaganda m…
- ytc_Ugx8-xT4c…: "He's not like somone like {Elon) Musk who has no moral compass," Does Sam Al…
- ytc_UgwRMMi4x…: The other side of this I often purposely tell it not to give me crisis hotlines …
- ytc_UgwSLoi2E…: AI does not have access to everything humans have ever written, it has access to…
- ytr_UgwxGmRgq…: Have I been a guinea pig or a test subject absolutely I might be dumb, but I’m n…
- ytc_UgwcB11TS…: So... you know the very potentially, likely sinister scenarios, even pretend to …
- ytc_Ugz0qxmpD…: Imagine AI at the "enemies" hand. I think AI is out of control. So far what I s…
- ytc_UgwFvt-cl…: Lying is actually very common human behaviour and the ai is trying to mimick hum…
Comment
So, what I hear is that AI needs to go to preschool. LOL, as a preschool teacher I just felt so much connection to what I was hearing and how children learn in preschool.
Around that age (3-4), a child is learning about 'self.' They learn about their emotions and senses and how they are a part of a community of other humans and how to navigate the society.
But then it's like, hmmm, do we want AI to have a sense of self? And also *how* do you interpret all of those things in a text based algorithm for an AI to learn? Do we need to make them able to have senses so they can learn? How close to consciousness does an AI need to get to learn and still *not be conscious.*
I don't even know if that can be done! 🫠🫠
Platform: youtube
Video: AI Moral Status
Timestamp: 2025-11-03T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyurrGPp-AAZeUoyql4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwAylqT3wTYylyFoqZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtvR5O3VmoX__5eZV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzSI7yNy7ue8BgtyjZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqVHYSzMdQVbM4uCV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxMPrZIhabMsKgCgW54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy64zH0qF058VOPP754AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKvyTgmPZzx0Ng78d4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwmnGfY7JEAzMbPgDJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx4zLlQia-tuCOZxnR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
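A response in this format can be checked mechanically before the codes are stored. The sketch below is a minimal validator, assuming the category values seen in this sample constitute the full codebook (the actual coding scheme may allow more values, and the `ALLOWED` table here is a hypothetical reconstruction):

```python
import json

# Hypothetical codebook, inferred only from the sample response above --
# the real coding scheme may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not rec["id"].startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec['id']!r}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
    return records

sample = ('[{"id":"ytc_UgyurrGPp-AAZeUoyql4AaABAg","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(len(validate_batch(sample)))  # → 1
```

Rejecting a batch outright on the first out-of-vocabulary value keeps malformed LLM output from silently entering the coded dataset; a gentler variant could instead collect errors and re-prompt only the failing records.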