Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Is this journalism in the US nowadays? There seems to be a single viewpoint only…" (ytc_UgyX7mh_N…)
- "Uh, respectfully Dr. Kaku, you just may be underestimating the speed of learning…" (ytc_UgyudELgW…)
- "its funny that Sam Altman does the whole culty lovebombing thing to stroke peopl…" (ytc_UgxCdW88Z…)
- "Humans : hey hello don't go slow there my wife waiting to purchase cosmetics 💅💄😅…" (ytc_UgytQU423…)
- "AI devs: How is it so bad? Also Devs: Code it to be predatory and work for mega…" (ytc_Ugy8pI1y-…)
- "This A.I. 'artist' probably thinks he should get paid to provide ideas instead o…" (ytc_Ugx9aKITg…)
- ""Get the AI corporations (who made hundreds of billions of dollars off our data,…" (ytc_UgxoT_I4J…)
- "Im sure it wont ever gain consciousness or it will be like numbers starting to g…" (ytc_UgzilT3y3…)
Comment

> If something is capable of fighting for rights, it is automatically deserving of said rights. Robots that have feelings as mentioned in the video will want rights, and they'll fight for it just as well as humans have done in the past, and still do. If a robot ever has the ability to understand rights, and want them, it should be given them, else humanity as a whole will be the worst parent species imaginable.

- Platform: youtube
- Video: AI Moral Status
- Posted: 2020-05-30T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
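The four coded dimensions are categorical, so each record can be checked before it is stored. A minimal validation sketch, using only the category values that appear in this page's raw responses (the actual codebook may define more; `ALLOWED` is an assumption, not the full scheme):

```python
# Allowed values per dimension, collected from the codings visible on this
# page; the real codebook may allow additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed",
                "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding is well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        if dim not in record:
            problems.append(f"missing dimension: {dim}")
        elif record[dim] not in allowed:
            problems.append(f"unexpected {dim} value: {record[dim]!r}")
    return problems

# The coding shown in the table above passes cleanly.
ok = {"responsibility": "distributed", "reasoning": "contractualist",
      "policy": "regulate", "emotion": "approval"}
print(validate(ok))  # []
```

Rejecting malformed records at ingest time keeps a single bad LLM batch from silently polluting downstream counts.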
Raw LLM Response
```json
[
  {"id":"ytc_UgwLZ_4pU--g_XPWjTZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-R7Urj-fPn8GVGN14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzrx_7jIKCuKk4j1WR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3ALZ5wyVCvEjay5Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz9gRfsjmHbV_JLtQF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgziQ2RZw7TIfz8kPXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy_YDVRPejuRyCUT6J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxmt6H9ES9vmRluARt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSOWIubIGn2dfYzMh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz8jSi-Fp8AjhTRMn14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
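The raw response is a JSON array of per-comment codings, so inspecting any coded comment reduces to parsing the batch and indexing it by `id`. A minimal sketch (the array literal below is abbreviated to two records from the response above):

```python
import json

# Two records copied from the raw LLM response above (the full batch has ten).
raw_response = """
[
  {"id": "ytc_UgwLZ_4pU--g_XPWjTZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz-R7Urj-fPn8GVGN14AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
"""

# Index the batch by comment ID so any coding can be looked up directly.
codings = {record["id"]: record for record in json.loads(raw_response)}

coding = codings["ytc_Ugz-R7Urj-fPn8GVGN14AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate approval
```

This is the lookup the page performs: the coding-result table for a comment is just its record pulled out of the parsed batch.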