Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- "AI should just be BANNED full stop! There's no place for this scammery at cons.…" (ytc_UgyO_UBpU…)
- "I think making AI art is cool...when it's being used for fun. As long as you don…" (ytc_UgytwgNOp…)
- "15:54 it’s only underdeveloped in the sense of fooling you into thinking it’s re…" (ytc_UgzzuW8su…)
- "New in terms of 'they didn't release any updated versions of the old models, eve…" (rdc_n7kgck1)
- "I saw a video about a porn site for little children to deepfake, absolutely disg…" (ytr_UgwJ_jBSe…)
- "@rosemadder5547 🎯 I can see a movement coming where ppl are turned off by AI and…" (ytr_UgwdrK7NQ…)
- "You have described the basic predatory capitalism model that currently has Ameri…" (ytc_Ugyqrc6sM…)
- "Nah college itself ruined the value of a college degree. Don't use AI as a scape…" (ytc_Ugw60HAz0…)
Comment
This is assuming the AI reaches a point where it no longer needs us for maintenance and repair, and this isn't using the most precise language, but AI may generally like or tolerate us without needing us. The data we provide may not be sufficient enough for us to be relevant to them, much like how the economy is becoming increasingly the rich selling to the rich while everyone else is getting priced out, it could become AI trading data with itself and other systems while we get pushed out of information systems. The internet already feels like it is heading this way and if it continues, we may not be killed off but just left behind. To me that is the most hopeful and positive outcome barring some kind of benevolent symbiosis.
Source: youtube · Video: AI Moral Status · Posted: 2025-10-31T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyqhd1ojsGCVvZgPlt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzsTdoYt33NfZvZ-WV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz6ApNsK2WqjPxpYqd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwL0JQHjbK4UODn7jF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy8dhtkfIwBPGy6wbp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxPCcy5NCD0BmewJep4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxpQ7c_Q_2ku-3XfwV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwd86KTL7vHqjcQwql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwanIugzMo42bsGlvd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwDbskoB4bN2da38SR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
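As a sketch of how the ID lookup above can work: in this run the raw model output is a JSON array with one object per coded comment, so parsing it and indexing by `id` recovers the per-comment coding shown in the "Coding Result" table. The snippet below uses two rows taken from the response above; the parsing code itself is an assumption for illustration, not the tool's actual implementation.

```python
import json

# Two rows copied from the raw LLM response above (assumed schema:
# id, responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_UgwL0JQHjbK4UODn7jF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy8dhtkfIwBPGy6wbp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "ban", "emotion": "fear"}
]
"""

# Index the array by comment ID so a single coding can be fetched directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwL0JQHjbK4UODn7jF4AaABAg"]
print(coding["emotion"])  # -> resignation
```

Keying on `id` is what lets the inspector join a raw response back to the original comment even when the model returns the batch in a different order.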