Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Comment
if there was a robot that demanded rights I think as humanity we should not deni them there rights unless that the rights force us as humans to leave or Purge ower own race. if robots demanded right I believe the right would be 1 Humans are not allowed to under any circumstance to dismantle a fellow robot unless it is noun function or not working 2 humans cannot destroy any robots unless it have pose a threat to the human 3 humans cannot interfere with power ridges or cut off power to the robot.
Platform: youtube
Topic: AI Moral Status
Posted: 2017-02-23T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg8BCQ6vQeVy3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugh5HuNQx_2RBXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh1IdnbNn4US3gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugh4EvEz2oN5d3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj-_TJm1YcyDHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjJkMb0npbVNHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugi0dNvgndhWbXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghbLR35UtWqVXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugho7QY63NVatHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgjNsVEAgYa52ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
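The raw response is a JSON array of records, one per comment ID, each coded on four dimensions. A minimal sketch of how the lookup-by-ID step might parse and validate such a response is below. The `DIMENSIONS` sets are an assumption inferred only from the values visible in this sample; the real codebook may allow additional categories, and `lookup_coding` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per coding dimension -- ASSUMPTION: inferred from the
# sample response above; the actual codebook may define more categories.
DIMENSIONS = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "virtue", "mixed", "consequentialist",
                  "deontological", "contractualist"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "fear"},
}

def lookup_coding(raw_response: str, comment_id: str):
    """Parse a raw LLM response (a JSON array of coded comments) and
    return the record for `comment_id`, checking each dimension against
    the known categories. Returns None if the ID is not present."""
    records = json.loads(raw_response)
    for rec in records:
        if rec.get("id") != comment_id:
            continue
        for dim, allowed in DIMENSIONS.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"unexpected {dim!r} value: {value!r}")
        return rec
    return None
```

For example, calling `lookup_coding(raw, "ytc_UgjJkMb0npbVNHgCoAEC")` on the response above would return the record coded `deontological` / `regulate` / `approval`, matching the Coding Result table.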