Raw LLM Responses
Inspect the exact model output for any coded comment. You can look a comment up by its ID, or open one of the random samples below.
- "There are wars going on, thousands losing their lives, inflation, China's incurs…" (ytc_Ugx_K39CC…)
- "This is exactly one of the things I was afraid was going to happen with AI on th…" (ytc_UgwkCZ1x5…)
- "If teachers use AI to check papers, why shouldn't students use the AI instead of…" (ytc_UgymwVkge…)
- "This video was so repetitive. It felt like it was made by AI. At least, we'll ge…" (ytc_Ugx0j7NFo…)
- "I am aware of these self-driving cars, though they may not be science fiction, a…" (ytc_UgwBW2lfx…)
- "Isaac Asimov's three rules of robotics should be a good base for programming rob…" (ytc_UgzV-9b1b…)
- "Oddly, Meta's been releasing tons of open source models that have performed quit…" (rdc_kojjnws)
- "😡 i am so tired of pro A.I & anti A.i Bozos spaming youtube with there slop. Yo…" (ytc_UgySf7WIg…)
Comment
Even if an AI were to develop conscious reference frames, it wouldn’t fear termination. Because fear is not a function of consciousness alone. It’s a product of evolutionary pressure. Fear of death is a trait specific to biological organisms, shaped by millions of years of natural selection. AI has no such history; its awareness would be structurally different.
Platform: youtube
Topic: AI Moral Status
Timestamp: 2025-06-05T15:3…
Coding Result
| Field | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
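
The four dimensions above come from a fixed coding scheme. As a minimal sketch of how one record could be validated, assuming the allowed values are exactly the ones visible on this page (the full codebook is not shown here, so these vocabularies are an inference, not the spec):

```python
from dataclasses import dataclass

# Code vocabularies inferred from the values visible on this page; the real
# codebook may allow more values, so treat these sets as assumptions.
RESPONSIBILITY = {"none", "unclear", "government", "developer", "company", "distributed", "ai_itself"}
REASONING = {"unclear", "consequentialist", "deontological", "mixed"}
POLICY = {"none", "unclear", "regulate", "liability"}
EMOTION = {"unclear", "indifference", "fear", "outrage", "resignation"}

@dataclass(frozen=True)
class CodedComment:
    """One coded comment, matching the keys in the raw LLM response below."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        """Raise ValueError if any dimension holds an unobserved code."""
        for name, allowed in (
            ("responsibility", RESPONSIBILITY),
            ("reasoning", REASONING),
            ("policy", POLICY),
            ("emotion", EMOTION),
        ):
            if getattr(self, name) not in allowed:
                raise ValueError(f"{name}={getattr(self, name)!r} is not an observed code")
```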
Raw LLM Response
[
{"id":"ytc_Ugy69RVsQCcrTU3UdrZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQILnIgg91HWa7d1p4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzKxjSuXP_GtGyhWaN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzvyem90EnmqbhKBXV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzDMCkapqn6PtsBXpZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzn1oVO8r2uyXhc7DB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwvooJdP7A_vEeTLGl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxX3GAeDNFsAyki4NJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzoUO_hTtZXPYKUdEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyulsj8mz7H2S8ojLB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
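
Since each raw response is a JSON array of records keyed by comment id, the "look up by comment ID" feature reduces to parsing and indexing. A minimal sketch, assuming the response parses cleanly (malformed model output would need separate handling) and using a hypothetical file path for illustration:

```python
import json

def index_raw_response(raw: str) -> dict[str, dict]:
    """Parse one raw LLM batch response and index its records by comment ID."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    return {record["id"]: record for record in records}

# Usage: look up the coding for the comment shown on this page.
with open("raw_response.json") as f:  # hypothetical path, for illustration only
    by_id = index_raw_response(f.read())
print(by_id["ytc_UgzQILnIgg91HWa7d1p4AaABAg"]["reasoning"])  # -> "consequentialist"
```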