Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
the real question is, are humans stupid enough to create an intelligence smarter than themselves?
how ironic.
but the thing is that even if the collective organizations don't make this mistake, what are the chances that a psycho in his own lab and computer won't make it himself?
or if there evolves a terrorist group which takes help of artificial intelligence? and they terrorize humans with them?
the possibilities are virtually infinite when it comes to this topic just like any other topic Kurzgesagt chooses.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T15:2… · ♥ 44
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghUUTPKDH88uXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgieKUMMYUHrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggNXuR9Uuu-dHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj0k6LggtSpPngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghIMGtVMpeoXngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghsHvsEZa7QpHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugh-wxpuQ7IOdngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjx2gfLE92JJXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugi_RzdM3NNBsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Uggfa3awuUzm_3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
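The raw response is a JSON array with one object per comment, carrying the four coded dimensions from the table above. A minimal sketch of parsing such a response and looking up one comment's codes by ID — the `lookup` helper is hypothetical, but the IDs and field names are copied from the response shown here:

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_Ugjx2gfLE92JJXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghUUTPKDH88uXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment; raises KeyError if unseen."""
    return codings[comment_id]

print(lookup("ytc_Ugjx2gfLE92JJXgCoAEC")["policy"])  # → regulate
```

The same index would let a viewer like this one resolve a truncated display ID to its full coding record, assuming full IDs are stored alongside the display strings.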