Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "IMO, it's not about fooling an AI, it's more so the humans that need to be foole…" (ytc_UgwrjC4pO…)
- "@8xottox8 If I were seeing honest, informed criticisms of AI art, pointing out …" (ytr_UgyoPUrHl…)
- "Just give people money to pay for shit and make robots and AI work. cause if peo…" (ytc_UgxvzskVQ…)
- "There's AI customer service and I'll pick human anytime, and sadly i have to pay…" (ytc_UgzPYOJnq…)
- "I hope this family is able to get the answers they seek. Usually where there is …" (ytc_Ugz4-Rl9A…)
- "I feel like we're unironically just destined to go this route, intelligence inha…" (ytc_UgyCxlrrW…)
- "Science is always wrong when it's first implemented. Always. It develops over ti…" (ytr_UgyuEZqal…)
- "99% of art is in public use you cannot complain that it is used without your agr…" (ytc_Ugy4_5EgZ…)
Comment

> Can't imagine having a robot that will not have issues because just about everything else that is being made the past few years are breaking down all sorts of electronic devices and automobiles are having a huge spike in recalls because of many malfunctions for the past few years now just about most car manufacturers are having these problems so how they going to make a robot without recall issues

youtube · AI Moral Status · 2024-04-08T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyBjA1rXYvhWzlvPp94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzx8eBB1duO_-TH8LB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxvOHG7l_aYzODRQeR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"unclear"},
  {"id":"ytc_Ugw0vjBfdT_aYBI03g54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxeH6nwl8A9Ba5rDrB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzP2Ay3LxhiiqrW2rl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyNjvoYYsikydpJj9l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz_mtql6lDcL7m_e6J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxd4qMud8cEJjXQDIZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyrBXWIIEFGdyRstPV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
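A lookup over a raw response like the one above can be sketched as follows. This is a minimal illustration, not the tool's actual code: `lookup_coding` is a hypothetical helper, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON shown above. The sample data here repeats two entries from that response.

```python
import json

# Two entries copied from the raw LLM response above, used as sample input.
raw_response = """[
  {"id":"ytc_Ugz_mtql6lDcL7m_e6J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyBjA1rXYvhWzlvPp94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response (a JSON array of per-comment codings)
    and return the coding dict for one comment ID, or None if absent."""
    codings = json.loads(raw)
    by_id = {entry["id"]: entry for entry in codings}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_Ugz_mtql6lDcL7m_e6J4AaABAg")
print(coding["responsibility"], coding["policy"])  # company liability
```

The dict built from the array makes repeated ID lookups O(1), which matters when the same response is inspected for many comments.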