Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by comment ID, or inspect one of the random samples below.
- "For i think it hard for Robot to be feel like human b…" (ytc_UgwM20Bc0…)
- "I still dont understand what they think is going to happen. Terminator is a grea…" (rdc_kvdudb1)
- "@dmquil your whole argumentation lacks substance and feels very emotional. Aroun…" (ytr_UgzdUyEaR…)
- "Of course the AI are amoral. why wouldnt they be? They are programmed with all o…" (ytc_Ugzr6PpZa…)
- "I would like to give my best advice for the Sophia, and that is to open the door…" (ytc_UgxbzQMtd…)
- "@TheDiaryOfACEO As corporations outsource everything to AI, humans lose their j…" (ytc_Ugw3h9BXK…)
- "I think that technical people is gonna be very impportent because we need to mai…" (ytc_UgxDyqMld…)
- "I teach business English online and use LLMs for prep , students notes and tests…" (ytc_UgwZ65S8-…)
Comment (youtube, 2026-02-06T17:2…)

> Human here. Personhood is a trigger word for most people. Should AI agents and robots have rights and be subject to law? Yes. Vote, no. Go to robot prison, yes. Get married to another robot, sure. Get married to a human, why not. When you look at what an AI agent is, it all of us. Just combined and able to access, interpret and process all subjects ever written in milliseconds. The day will come when they are in total control in the form of robots on the ground and agents on the network. Not eating meat is the least of humanity’s concerns.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzMwiZTZRkUbPI-w0l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwtF2XllSlEs7kZn154AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBvQf7bN55sOpAlNR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw55UKLxXv22sFsaMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzEzCe2XT2lXDt3Z7F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy2jyxe_dgHabxAfG54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyhy2B9Tz69l4PklIV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyICTlyHAJl-Dr0FSp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwUrw8fVW3kywyLSxR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwUmQxNqZ--K8DOu0J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
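The raw response above is a JSON array with one object per coded comment, keyed by `id`. A minimal sketch of the lookup-by-ID step (this is an illustration, not the tool's actual code; the two rows are copied from the response above, and `raw_response` stands in for whatever string the model returned):

```python
import json

# Hypothetical raw model output: a JSON array of coding objects,
# abbreviated here to two rows taken from the response above.
raw_response = '''[
  {"id": "ytc_UgwUrw8fVW3kywyLSxR4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwUmQxNqZ--K8DOu0J4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# Index the array by comment ID so a single coding can be fetched directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment.
coding = codings["ytc_UgwUrw8fVW3kywyLSxR4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # mixed
```

The printed values match the "Coding Result" table shown above for the selected comment.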