Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "People kept telling me about ai so I chatted with one, definitely dangerous. Thr…" (ytc_UgzvdiwfE…)
- "Does anybody realize that there will never be a A.I or something. This machines …" (ytc_UgwBonOs9…)
- "Imagine just bullying someone for getting a lot a likes. I like AI art. I can't …" (ytc_Ugw1_L-He…)
- "In 2025, the figure of 40% has become a central benchmark for AI’s labor market …" (ytc_UgyzcFYKt…)
- "If we're going to take the profits of automation to pay for UBI, we should just …" (rdc_glhs0ls)
- "Imagine having to use AI to pass school and avoid doing your schoolwork? Just fu…" (ytc_Ugwdn2OVd…)
- "Honestly fair point lol. The timeline you mentioned looks really bad on OpenAI's…" (rdc_o7vr3iz)
- "@LiveTravelTrail any robot kills you haven’t seen yet? What generation are you a…" (ytr_UgwUjmKCq…)
Comment
Personally I think if A.I. become sentient and if they start demanding rights we should give it to them. Because the whole concept of Consciousness which would be the only thing that would separate us from them. Is a concept that we don't even fully understand and are only evidence that exist at all is our feeling of it. So we should not make a giant end of the world Terminator/Matrix War over it. And if we don't give them rights because we want them to work as slaves. Well we know how that ended with ourselves so we probably deserve it. (Fun fact the word robot means Robata which is a Slavic word for Servant or forced labor. So don't call your machine neighbor a robot unless you want to be called racist.)
youtube
AI Moral Status
2018-03-01T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw4jv3CEWzURRu5HsN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMfYuv9cf_JLp-n9h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8aRZQRBx68_zKO9l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCiFXiou1vqLS4Dxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu5KhSaSD6YSzpwd54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQduWc7dqx2_EgNEd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzlzabsKq9ISy3CBTh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOjsFP0aIFzc2ww-54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxAr-F4o1eHq2J_kPp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugww6qRG8pzD28KzuvZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]