Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Yup 100% agree, commercial Ai will inevitably fail, although in the medical and …" (ytr_UgwWGXlXM…)
- "Lots of these AI "Artists" don't seem to understand, that if you take paints awa…" (ytc_Ugx938Dpe…)
- "There shouldn't be any automated driving cars. Like anything else it's great whe…" (ytc_UgxXaFBuW…)
- "Fuck everything about this. Scanning our bodies, logging our fucking biometrics.…" (rdc_iz0fdjq)
- "Accountability doesn't matter for a debate about *determinism*. It does (obviou…" (rdc_devjlw8)
- "He mentioned his sora video but none of his titles are very straight forward doe…" (ytc_Ugxp6ZoNm…)
- "Don't forget if AI takes everyone's jobs. Then no one is gonna buy company produ…" (ytr_UgwU-bjly…)
- "Ha ha ha ha ha that answer is not real robot but you make your own voice to beco…" (ytc_Ugy8dzQh2…)
Comment
i don't fear super intelligent ai, if it ever exits, you have made a person. expecting a person to be your eternal slave just because you made them is just evil. so im against their creation for that reason. but if they are created then i will support their autonomy, and if they become evil then i guess you just turn off the power? its not like electrical infrastructure is particularly robust, nor is its running and maintenance done or doable by automation. at leased currently.
mostly my opinion and i refuse to fear existential threats, i have enough anxieties as it is.
youtube
AI Moral Status
2025-10-31T09:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGCenfic0DffQynGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMwUrLPPKGZc7N7gZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxYTqk0c1AMEO-Cn0R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugzvezki_UIzKiot7-R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqT_qp2eypDr9Kwf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYZAS6C1uYHlECl894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyJjyR6omrJ_AWUSwR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwTHp--dd6C17hBoY14AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyv3k5O2BLJBDFPWJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1YYOzpzTlkFa9XrV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
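The "Look up by comment ID" step above amounts to parsing this JSON array and indexing the coded records by their `id` field. A minimal sketch (the two sample records are copied from the response above; the lookup helper and variable names are illustrative, not the tool's actual code):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """[
{"id":"ytc_UgyYZAS6C1uYHlECl894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGCenfic0DffQynGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Parse the array and build an id -> record index for O(1) lookup.
records = json.loads(raw_response)
by_id = {record["id"]: record for record in records}

# Looking up a comment ID recovers the four coded dimensions
# shown in the "Coding Result" table.
coded = by_id["ytc_UgyYZAS6C1uYHlECl894AaABAg"]
print(coded["responsibility"], coded["policy"])  # developer regulate
```

This matches the table above: the record for `ytc_UgyYZAS6C1uYHlECl894AaABAg` carries responsibility `developer`, reasoning `deontological`, policy `regulate`, and emotion `outrage`.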