Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Robot translation! *mystery missing fnaf type music* *working* *notices* *ag…
ytc_UgzzwoZi7…
5 persons doing 1 jobs change to 1 person . 4 persons jobless. Lesser 4 persons …
ytc_Ugw7pkJbs…
It's not quite there yet, but will be. The AI development transactions must be s…
ytc_UgySC3uM_…
@eagle-el6px while llms are clearly not replacing devs anytime soon, they're mor…
ytr_UgyB6f5np…
people can get ai to program ai already. it will get a stupid human to circumven…
ytc_UgzURb6He…
It is an impossible challenge for AI-imigery-users to defend their AI slop witho…
ytc_UgxMQoGsd…
@Midoro Gurin Or we could just not give robots the ability to think for themselv…
ytr_UgzuZtDlq…
@teoomitai8078 ok, stop right there. The situation is really funny, not to say sa…
ytr_UgzFEaKTf…
Comment
I think the topic of AI/Robots having rights because they are conscience, is interesting and we must stop to consider whether or not what we call "Life" is really just the same comparison as bio-organic or none-organic. In any case, I think the term "rights" is more like the word "permission". So, in that case "We", being the creator of the AI should by definition given by natural examples like children should have the right to demand actions from the AI as a reciprocation for the creation of that life. This is the most basic form of a sociological relationship with AIs.
youtube
AI Moral Status
2025-04-28T13:3…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyXTyObjAANjXCxEgZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyARUgOLkV_qPTfkFF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxTvOlcbw0qn4LfvlR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWB7ADIvo0_kchDXR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFc0TbgEmit6t1UgR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrE6iPqqsV1ld19ex4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-t2c8gwo7glwJgtF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5rZthk32xyFH94T54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwFjxlUpJ7AwDLr3MV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy4FC0WT32aKm7Qp5F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
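A raw batch like the one above can be parsed and checked against the coding scheme before the per-comment results are displayed. The sketch below is a minimal, hypothetical example: the field names match the raw response, but the sets of allowed values are assumptions inferred from the values visible on this page, not a definitive schema.

```python
import json

# Assumed coding scheme, inferred from the values shown in this page's
# result table and raw response; the real scheme may allow more values.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    fall inside the assumed coding scheme."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Usage with a one-row batch in the same shape as the raw response above
# (the comment ID here is a placeholder, not a real one from the page):
example = (
    '[{"id":"ytc_EXAMPLE","responsibility":"developer",'
    '"reasoning":"deontological","policy":"none","emotion":"approval"}]'
)
coded = parse_batch(example)
print(len(coded))  # 1
```

Dropping out-of-scheme rows (rather than raising) is one possible design choice; a stricter pipeline might instead log the offending comment ID and re-query the model for that item.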