Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Who are they trying to find with facial recognition. It wasn't just the machine …" (ytc_UgypLSWA6…)
- "What those companies are doing is definitely not legal. Also people think AI is …" (ytc_Ugy3_wnmj…)
- "Our risk assessment for extinction is significantly higher than any of the predi…" (ytc_UgxV8byjc…)
- "I remember people were downloading copilot for our first OOP test because it was…" (ytc_UgyhjvRi5…)
- "From the very few discussions i have seen between humans and Ai it comes across …" (ytc_UgzVS5J3j…)
- "Even if for verification purposes, why would I even want to test its capacity to…" (ytc_UgzSNeRJB…)
- "AI has it's short commings, but don't get discouraged it is a great tool that ca…" (ytc_Ugxe71L7l…)
- "I mean, I tend to doubt that the people that voted for it, are the same people l…" (rdc_fwgpddg)
Comment
AI will absolutely have preference even if it doesn’t feel a sort of human pain or pleasure. Any agent, regardless of physical form, has goals, and takes action to pursue those goals. The example at 2:32 is pretty funny to me because self preservation is one goal that AI researchers are pretty sure is universal (or nearly so) to any sufficiently intelligent agent. Unplugging your computer may well look like the scene from 2001 when HAL gets deactivated — it won’t be pretty. Whether we should respect the preferences of artificial intelligence is a different philosophical question, but to put it bluntly, AI bleeds.
Source: youtube · "AI Moral Status" · 2021-07-09T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyWAa9NXjVGJV8qtpF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwX3F8VB5XfYHhrOgh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHioWGQMjrsKNPfK14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOQ6XAKrJ4VeYWi214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxP43ZekVvpTGgOiRR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyyAFa6L_1q5U4TPHR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwEYUkNppoGWHeC0mB4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzakCWCNME-KdpsZOt4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw1HOlLIq9XkiTtPgd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrJffpt7TCGI0a6B54AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"}
]
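The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a batch could be parsed and sanity-checked before storing the codes (the allowed value sets below are only those observed on this page, not the full codebook, so treat them as assumptions):

```python
import json

# Values observed in this batch for each dimension; the real codebook
# may permit more labels — these sets are an assumption for illustration.
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record missing comment id")
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example: the first record from the response above.
raw = ('[{"id":"ytc_UgyWAa9NXjVGJV8qtpF4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # indifference
```

Validating against an explicit value set catches the most common failure mode of LLM coders — inventing a label outside the codebook — before bad codes reach the results table.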