Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "The fact a real human did this is way more impressive than the AI edit 😳…" (ytc_UgwHlQebV…)
- "I think it is overwhelmingly negative to imagine what could happen in the future…" (ytr_UgwfwbP9Y…)
- "I hate the fact that people can literally make money out of stuff they “made” wi…" (ytc_UgzOEN-QN…)
- "Now is a good time to start a movement that values authentic human connection, p…" (ytc_Ugy8644Wp…)
- "Tesla is getting millions of miles of data with human beings as guinea pigs. Wha…" (ytc_Ugwy8mfNK…)
- "This is so much bullshit, when i see our factories in germany the one of the wor…" (ytc_UgwD3Q-qk…)
- "AI is the reason I stopped posting my art 10 years ago. I knew this bullshit was…" (ytc_UgzrCFARl…)
- "Yep but it's a reason you must ship 10x faster. Even if you can't review code as…" (ytc_Ugz7QrxK8…)
Comment
@someonesgoat regardless of robots, they are programmed and programmable. Just as Isaac Azimov invented rules for robots in his sci fi, it is a reality that such engineering is programmable. It would be idiotic and ridiculously irresponsible to program a code that could threaten humanity through lack of control. Even humans are programmed by society to follow rules, so this notion that an 'AI' would be given a clean slate and unbridled autonomy is silly.
youtube · AI Governance · 2023-04-18T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugw_ZmgXgBrfzAiRFwR4AaABAg.9odGzFJyXKn9odTf9p2N8S","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwMAUAKOGqPjkqchWV4AaABAg.9odFZ21oXCD9odT-_0P3ke","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyfIOcfg7taWn8Mw-x4AaABAg.9odA5SKjPYg9odQYcUAGKU","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgyfIOcfg7taWn8Mw-x4AaABAg.9odA5SKjPYg9odUfmmD_RU","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugzyj51Yy96YXMIAW-V4AaABAg.9odA3nmhtUa9odBGa2DZLk","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgzQUPOodNBZTN9DLdp4AaABAg.9od6C5GlhJs9od7MoWgiCY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odHE09IhkJ","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odNR650exH","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9oe0I17T-TR","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9oey1bYYGGv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
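The "look up by comment ID" action above amounts to parsing this JSON array and filtering on the `id` field. A minimal sketch, assuming the raw response is available as a string; `lookup` is a hypothetical helper, and the sample rows are copied verbatim from the response above:

```python
import json

# Two rows copied from the raw LLM response shown above.
raw = '''[
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odHE09IhkJ","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odNR650exH","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

def lookup(raw_json: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if absent."""
    return next((row for row in json.loads(raw_json) if row["id"] == comment_id), None)

coding = lookup(raw, "ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odHE09IhkJ")
```

The returned dict carries the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion); an unknown ID yields `None` rather than raising.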