Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Comment
If humans make sentient robots with specific purposes they essentially brainwash them
This topic is to complicated to talk about until it happens because then we have both sides and humans cant make excuses if it is that way my head hurts
Plus it's not like humans will make them completely aware they will make them have specific goals which is sad because it's not really natural if they crossed that barrier tho they would make ai natural which is weird to think about now
But it will probably get to the point were they arent programmed to have a goal and are free to be themselves because it only takes one curiosity on a person of what would happen to make it happen after all that's how the idea started isnt it
Source: youtube · Video: AI Moral Status · Posted: 2021-06-12T12:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
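The coded dimensions above are one row of a batch response keyed by comment ID. A minimal sketch of the lookup step, assuming records shaped like the raw LLM response (the record and ID here are illustrative, not real data):

```python
import json

# Illustrative record in the shape of the raw LLM response;
# the id is made up, not a real YouTube comment id.
raw = '''[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]'''

# Index the batch response by comment id for O(1) lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

print(codes_by_id["ytc_example1"]["responsibility"])  # → developer
```

Looking up a coded comment then reduces to a single dictionary access on its ID.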
Raw LLM Response
```json
[
  {"id":"ytc_UgwB829cGMO6IugeH9R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwAnZPFjHZj68J7sHR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz4whSHZ3YDv-kbMNd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgysT2Ss4DwG6flSvU14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwdVvkVKAM2iyx7zKp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-LNOj7B6NOVQmQ-N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwW3H177NbWHnDSLtt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxpEGtnzz7hdBoS7X14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx_uifIODEWb-Ahvil4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwqU2ED8kLjNdaZfOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
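A raw response like the one above can be checked before its codes are accepted. A minimal validation sketch, assuming the value sets below; they are only the values observed in this sample, not a confirmed codebook:

```python
import json

# Value sets observed in this sample's response (assumption: the
# actual codebook may allow additional values per dimension).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "approval", "mixed", "fear"},
}

def validate(records):
    """Return the ids of records with a value outside the observed sets."""
    bad = []
    for rec in records:
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                bad.append(rec.get("id"))
                break
    return bad

# Illustrative record; the id is made up, not a real comment id.
raw = ('[{"id":"ytc_example1","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]')
print(validate(json.loads(raw)))  # → []
```

Any id returned by `validate` marks a record the model coded with an unexpected value, which is worth inspecting on this page.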