Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.

Random samples:
- `ytr_UgzM93K-e…`: "@CorporateCatalystno it won't only html and css and basics will be wrotten by i…"
- `ytc_UgwE1Ilje…`: "If there is a recording of someone speaking about AI, wouldn't AI be able to pic…"
- `ytc_UgwN3Y3tr…`: "I don't believe that every major "music-generating" AI limits its references to …"
- `rdc_dv5wepx`: "What confuses me about these stories is that I'm assuming these ground up horns,…"
- `ytc_Ugx35WVwW…`: "NNs that are trained on human drivers, by definition, mimic human driver behavio…"
- `ytc_UgyNpYH8L…`: "It's like phentanyl. It is not meant to be consumed, it's meant to be sold to ot…"
- `ytc_UgysemamE…`: "AI is already is already quantum and it already took over years ago and none of …"
- `ytc_UgyPFToCQ…`: "Why are people worried about not having a job in a world where nobody has a job?…"
Comment

> machines have no life and no souls so we cant give a right to a non living thing i love robots but giving it rights is just weird,because the feeling that you claim that they might have are just them acting the feeling at the time they saw us acting it for example lets say ama intelligent robot ill observe you and learn from u the concept of feeling and learn when do u use that type of feeling then after learning it ill use that feeling in the situation that i found similar to yours when use it at that time it doesn't mean that i feel it am just acting what i learned to be a suitable expression at that time but i dont actually feel it

Source: youtube · AI Moral Status · 2018-07-04T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugx7YznFYEUKkMe1iBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzahW5WKawAqoKCB7t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRHWKvJT8IhKO-_qF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbgNKJMW57e2gSy1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzYCJpRzmrEA7SN_ll4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7dI6ViiYSCEbnzft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzinrD6hweefSHzu-x4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy9MR1jF5P4ZT51IHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy75Vkh-6d8zWFeqFZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
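A raw response like the one above is a JSON array of per-comment coding records, so looking up a comment by ID is a simple parse-and-scan. A minimal sketch, assuming only the record shape shown above (the `lookup` helper name and the two inlined sample records are illustrative, not part of the tool):

```python
import json

# Two records copied from the raw response above; a real response
# contains one record per comment in the coded batch.
raw = """[
 {"id": "ytc_UgzahW5WKawAqoKCB7t4AaABAg",
  "responsibility": "developer", "reasoning": "deontological",
  "policy": "regulate", "emotion": "fear"},
 {"id": "ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg",
  "responsibility": "developer", "reasoning": "deontological",
  "policy": "ban", "emotion": "outrage"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw_json, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent.

    Missing fields fall back to "unclear", matching how uncoded
    dimensions are displayed in the table above (an assumption).
    """
    for record in json.loads(raw_json):
        if record.get("id") == comment_id:
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    return None

print(lookup(raw, "ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg"))
# → {'responsibility': 'developer', 'reasoning': 'deontological',
#    'policy': 'ban', 'emotion': 'outrage'}
```

Note that the response must be valid JSON for this to work; a truncated or unbalanced array (e.g. one ending in `)` instead of `]`) raises `json.JSONDecodeError`.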