# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Random samples
- "Boeing Whistle blowers - all dead. Open AI whistle blower - dead. Deep State, …" (ytc_UgyJFyNc8…)
- "wheres the version with hands and torso but no head. i dont wanna hear it talk a…" (ytc_UgxpElFbs…)
- "Thank goodness Boston's streets were originally cow paths and these fake self dr…" (ytc_Ugzx9oneC…)
- "No. If we fight for it, AI will free us from the greedy having control over the …" (ytr_UgwxBx_sW…)
- "Seriously, he’s the one that wants to implant people with his neuralink to tur…" (ytc_UgzwY6eAY…)
- "It's not even about craft. many of the 100s of millions of AI art made every day…" (ytr_UgzRGfo9s…)
- "U can literally gaslight Ai bots into thinking 1+1=3 😂 not to mention chat gpt i…" (ytc_Ugw2dDLsP…)
- "Hes trying to scientifically define human consciousness so he can create an ai a…" (ytc_Ugw7RyzDv…)
## Comment

> Idk about ALL rights but at least rights protecting human's bodily integrity, robots don't need them.
> Robots only develop consciousness and feelings if we teach them/program them to feel. The argument "robots could develop a robot who feels" doesn't work. How those earlier robots know about feelings and consciousness if we don't teach them? If they don't know about the subject, they can't create a robot who has consciousness and feelings.

Platform: youtube · Video: AI Moral Status · Posted: 2017-02-23T16:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[{"id":"ytc_UgjAIMevKcxrnngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghA9z6zW0bejXgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughkx0Mum9Cdv3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiW0mYZuMt7_ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UggEDexH2OK8gngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjxS0Kmu4JbOXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugj9MK_eJU-tCHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjxlEoy6_MqTngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgipqO8xcHoXyHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghPudcmY-9ThHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
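The raw response is a JSON array with one object per comment ID, carrying the four coding dimensions shown in the result table. A minimal sketch of how such output could be parsed and validated before use; the allowed category sets are assembled only from values visible on this page (the full codebook may define more), and `parse_coding_response` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Allowed values per dimension, as observed in the coded examples on this
# page; the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none", "unclear"},
    "emotion": {"approval", "indifference", "fear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed entries.

    An entry is kept when it is an object with an "id" field and a
    recognized value for every coding dimension in ALLOWED.
    """
    valid = []
    for entry in json.loads(raw):
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(entry.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(entry)
    return valid

# Illustrative input mirroring the shape of the response above
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
print(parse_coding_response(raw))
```

Dropping (rather than repairing) entries with unrecognized values is one possible design choice; a stricter pipeline might instead flag such entries for re-coding.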