Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- I have to respectfully disagree. We don’t even know what consciousness actually … (ytc_Ugybn-JSK…)
- "LAMDA is basically when they glued a hundred different AI systems together. One… (ytc_Ugy2_j1S8…)
- I think if AI art as Frankenstein (the doctor) thinking he is god. He stitched a… (ytc_Ugzl8N-9D…)
- If AI is trained on the peoples work, it should be own by the people.… (ytc_Ugw-Z4rza…)
- I don't understand how this doesn't go directly against my rights to unreasonab… (ytc_UgyeOneUp…)
- Absolutely disingenuous. The new jobs created were from profitability opportunit… (ytc_UgxskpTOH…)
- The AI conversation needs to explore the distinction between Morality and Ethics… (ytc_Ugyv_sZjk…)
- feelings often betray us and give us incorrect information, when in a debate, pe… (ytr_Ugz5VQ6Z0…)
Comment
You commit a categorical error by treating the non-injectivity of human emotional interpretation (multiple meanings from one facial expression) as proof of non-surjectivity of AI detection (that no facial expression maps to any specific emotion), which is a logical fallacy: ambiguity in some cases does not imply impossibility in all cases.
youtube · AI Moral Status · 2025-07-11T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwheKz6QyMsQbMQqlF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzoV476ErO8VYi7aNV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9XBI-VKR6r98mB9Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHehr_xmfkvKzNZfZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGdqLeAD7M0odPuo94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHe73kd_u_7KTJfF54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7xvyMAkgKvZHZ8G54AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwOILDAlZyff12HqsF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0aW4kRkO47tZC0IV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzL815BWImGiYMka94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
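
A batch response like the one above can be parsed and sanity-checked before the per-comment rows are stored. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the entries visible in this dump, so the real codebook may contain additional labels, and the function name `validate_batch` is illustrative rather than part of any actual pipeline.

```python
import json

# Allowed labels per dimension, inferred from the sample output above.
# ASSUMPTION: the actual codebook may define more values than these.
ALLOWED = {
    "responsibility": {"none", "company", "government", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none"},
    "emotion": {"outrage", "indifference", "fear", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with out-of-codebook labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row batch in the same shape as the dump above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = validate_batch(raw)
```

Validating against a fixed label set catches the common failure mode where the model drifts outside the codebook (for example, inventing a new emotion label), so malformed rows fail loudly instead of silently entering the coded dataset.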