Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I’d scream if someone found out that I’m actively torturing my ai’s well now y’a…" (ytc_UgwfAqCYT…)
- "Ai can turn people into vegetables that use it. At least one study--a study of …" (ytc_UgxiUzzSb…)
- "Yuval's analogy of money as a driver of story is powerful and how AI intertwines…" (ytc_UgxD_oFIC…)
- "I brought AI art to my school a while back. People generated it in art class. …" (ytc_UgwCeFV8j…)
- "@halespirit68 I just think it’s really uncalled for for the people who disagree…" (ytr_Ugws6KLlL…)
- "on your last note...how about truck only roads be built for the self-driving tru…" (ytc_UggY3kktF…)
- "not blaming this child because he had obvious mental health issues, but if you'r…" (ytc_UgxSB8beG…)
- "This maybe an AI video and not a real Elon Musk. So, it's all in vain. Scary!…" (ytc_UgyoUdZVd…)
Comment

> While it's a philosophical perspective, some argue that the human mind can be seen as no more than an advanced chatbot, and that consciousness is an illusion or imitation. This view is based on the idea that human thoughts, emotions, and decision-making can be reduced to complex algorithms and neural processes, much like how chatbots simulate conversations using predefined patterns and data. The reasoning behind this notion lies in the belief that if we can someday replicate the functions of the human brain with artificial intelligence, it would suggest that our consciousness is ultimately a product of computational processes and patterns.

youtube · AI Moral Status · 2023-11-12T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugw_Wrld4GelXnD_5uR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwzxcomJD6bKGrCrCx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwKHFHbJTafwhPXxPZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzgcpv2kZX_7iAQXsp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzD1SjWpTG6TwX54494AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQnl1dNfeT7byKpXp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx1CJwX0hzaiDn6lqN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy-fCtTM-n34hVDytB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyNmXQN5RrRGcOJRYB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwkzjovCI4OCxkJLlt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
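The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a batch response could be parsed and a single comment's coding looked up by ID (the function name is illustrative, not part of the tool; the sample below is trimmed to two entries from the response above):

```python
import json

# Trimmed sample of a raw batch response: one JSON object per coded comment.
raw_response = """
[
  {"id": "ytc_Ugw_Wrld4GelXnD_5uR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzgcpv2kZX_7iAQXsp4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one
    comment ID, or None if that ID is absent from the batch."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugzgcpv2kZX_7iAQXsp4AaABAg")
print(coding["reasoning"], coding["emotion"])  # consequentialist indifference
```

Returning `None` for a missing ID (rather than raising) makes it easy to detect comments the model silently dropped from a batch.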