Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "WE ARE DESTINED FOR EXTINCTION. BECAUSE OF AI, WE ARE A SPECIES THAT CONSUMES A …" (ytc_UgyK8HJFk…)
- "The one who caused the algorithm to take the decision should suffer the negative…" (ytc_Ugy9uUIfq…)
- "They're doing it because of greed. It's not safe- many are going to die, and no …" (ytc_UgwkY3k3F…)
- "Human kind eh!!... We got so tech smart over the last 80 years? First it was the…" (ytc_UgxUETNpz…)
- "Friend, it's just a code error, you make a mistake and the chips randomly activate…" (translated from Spanish; ytr_UgxiJSMvU…)
- "The scariest part isn't that AI will take jobs, it's that it will take away the …" (ytc_Ugy197zGx…)
- "Actually the people that work for ChatGPT are asking people not to say "Thank yo…" (ytc_UgxbsJJTe…)
- "The people with the money will never stop looking for ways to cut costs. Replaci…" (ytc_Ugx9rNmyP…)
Comment

> Well, if robots demand rights, then I say we ask about what rights they want. And if they are reasonable, they shall be granted and that's that. I mean, if a robot asks for the right to kill people that would be unreasonable but tbh I don't see that happening so I think we would get along just fine.

Source: youtube (AI Moral Status), posted 2017-08-14T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz7uG2wEC19S49oP-94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxwsoWcZL6vvWs1sU54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzAxBYGDkKt5sS06Ql4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwkm27kBj-Nko0hqed4AaABAg","responsibility":"society","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxaCe8v2icP1o2wVtp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzbd6o3_ChC_IAdGUh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxN08ESQaXfpdIzaad4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxdCiXaINfQ8-FMuc54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyEDXQOHqCotJGpdh14AaABAg","responsibility":"society","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz5AW7EfnUyBlxhh2Z4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
```
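Because the raw response is a JSON array in which each element carries its comment `id`, looking a comment up reduces to parsing the array once and indexing a dict by that `id`. A minimal sketch, assuming the field names shown in the output above (the sample payload here repeats one row from it):

```python
import json

# One row copied from the raw LLM response above; a real payload would
# contain the full batch of coded comments.
raw_response = """
[
  {"id": "ytc_Ugz5AW7EfnUyBlxhh2Z4AaABAg",
   "responsibility": "ai_itself", "reasoning": "contractualist",
   "policy": "liability", "emotion": "approval"}
]
"""

# Build the comment-ID index: one dict entry per coded comment.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up a single comment's coding result by its ID.
record = codes_by_id["ytc_Ugz5AW7EfnUyBlxhh2Z4AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself approval
```

A `KeyError` on the final lookup would signal that the model dropped or mangled an ID in its response, which is worth checking before trusting a coded batch.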