Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by their comment ID; the random samples below illustrate the range of coded comments.
- "No bro, you just used a much better tool than pencils, but your AI art is still …" (ytc_UgyTfJIS9…)
- "In my opinion, I think using Ai is ‘fine’ but only as long as it’s EXPLICITLY Sa…" (ytc_UgyJY3LVZ…)
- "These bloody people don't know how al works these are the people that create the…" (ytc_UgxW5I-AY…)
- "The programming and algorithms are no that impossible to do nowadays. Any govern…" (ytc_UgxDS7Q6p…)
- "I think the biggest issue with any AI we create will be the inherent flaws of an…" (ytc_UgymoEQMI…)
- "I use AI but I use it for myself. I don't profit off of it and I don't even post…" (ytc_UgzULEy6A…)
- "There's something missing. Do you think with capitalism the rich will sell their…" (ytc_UgyJUDl3R…)
- "thanks, I just started a channel and started reposting every AI work I saw! even…" (ytr_Ugw_QQDOU…)
Comment
> Does she have the 3 laws programmed in her? A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law

Source: youtube · AI Moral Status · 2024-04-18T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwcusJV54vditB8MoJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyPB-jLQbsppFt6yuN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwa5rcmiCKlWYytzJ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzPR76kB6Gy1x3_nVF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxSec0Kpi4zol0krEV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZQrVs4XwJApCGYNJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwXJTlrQtfNS4zePxN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxUBeFTf7IZBuMWcMx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyseRG3qrtONhqkeCd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyBz5P0yFQ_8AHx1zB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
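A minimal sketch of how a raw batch response like the one above could be consumed downstream. The function name, the two sample records, and the fallback to "unclear" for a missing dimension are illustrative assumptions, not part of the actual pipeline; the four dimension keys come from the coding table above.

```python
import json

# The four coding dimensions shown in the Coding Result table and the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Two records copied from the raw response above, standing in for a full batch.
raw = '''[
  {"id":"ytc_UgwcusJV54vditB8MoJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwXJTlrQtfNS4zePxN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]'''

def parse_batch(text: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: {dimension: value}}."""
    records = json.loads(text)
    coded = {}
    for rec in records:
        # Keep only the expected dimensions; default a missing one to "unclear"
        # (an assumed convention, mirroring the model's own fallback label).
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codes = parse_batch(raw)
print(codes["ytc_UgwXJTlrQtfNS4zePxN4AaABAg"]["policy"])  # regulate
```

Keying the result by comment ID makes it straightforward to join the codes back onto the original comment records for lookup by ID, as in the inspector above.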