Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Unpopular opinion, but this death is the parents fault, because he was suffering…" (ytc_UgzqZGDPo…)
- "These things are already smarter than us and are not in our control. Everything…" (ytc_UgxPIptvB…)
- "Level 5: the car will sacrifice your life to ensure self-driving statistics rema…" (ytc_UgzIw6-Fs…)
- "You can't replace people with AI. But your example of AI fails are inaccurate an…" (ytc_Ugzke88wi…)
- "And this is how I got an A in auto shop. Just write an essay about how the AI in…" (ytc_UgzJ0C27l…)
- "People fighting new technologies - let me grab a popcorn bag. It's like handwrit…" (ytc_UgzWvOAhh…)
- "Some people are so amart they lack common sense. I can see the danger of AI.…" (ytc_Ugw9fB1L6…)
- "Could a programmer look so biased as a fat unhealthy weirdo? Wonder if the AI k…" (ytc_UgyPySJrM…)
Comment

> I personally think, consciousness develops with "thinking abilities". So the more intelligent AI gets, the more likely it will show signs of consciousness. But it's a scale, as with humans. Small children, I would argue, are "less conscious" than an adult for example.

Source: youtube · Video: AI Moral Status · Posted: 2025-07-04T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
{"id":"ytc_UgwNrX93jv1CEv_o2o54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGcDVGsG9w2J4JLpt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLG4k9N-GTbDnvlNN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTPX_D4pXiyswjeMJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXn1aYQYopQx_9pqx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0HpA2tEpOCJnd0V94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXgxwkku2b0z8cgZh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwzzAQr_uIuqXJOOgR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwaywHw_DmQXurOMlJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwm37Z_qj-QAl1kThd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
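A batch response like the one above can be parsed and sanity-checked before it is written back to the coding table. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed values per dimension are inferred only from the entries shown on this page (the real codebook may define more categories), and the `ytc_example` ID is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above — an assumption, since the full codebook is not reproduced here.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index codings by comment ID,
    dropping any entry with a missing or out-of-vocabulary value."""
    codings = {}
    for entry in json.loads(raw):
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            codings[entry["id"]] = {dim: entry[dim] for dim in ALLOWED}
    return codings

# Hypothetical single-entry batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(parse_batch(raw)["ytc_example"]["emotion"])  # indifference
```

Dropping malformed entries rather than raising keeps one bad row from failing a whole batch; the discarded IDs can then be re-queued for coding.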