Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgzjpGFQC…`: Ai art is better for people who arent good at art its for people who want to loo…
- `ytc_Ugx8GJt9s…`: Deep fakes including porn should should be counted as sexual horrasment. (When a…
- `ytr_UgxwuAol1…`: @RazorbackPT Tell us you don't get the point and are cooked by ai rhetoric and b…
- `rdc_dcwmz1t`: > there's a lot of alternative facts being spread around. Why not call lies …
- `ytc_Ugy9vJSrl…`: This is why I can't bring myself to even acknowledge the quality of AI "art." Th…
- `ytc_UgzWeDK2d…`: Great video. On the points about software engineers understanding needs and requ…
- `ytc_UgwRLC71M…`: “a painting that AI could never create” is a claim you need to backup with evide…
- `ytc_Ugzth7HIF…`: Fear of the unknown...a basic human trait. Why is that? I would say it's becau…
Comment

> Just like in human societies, robots must be taught ethics such as “love your neighbors as yourself”. That’s the way humanity has survived to this day without killing each other to extinction. This is a problem that Isaac Amisov has long raised in his robot sci-fi novels, so he proposed the 3 robot laws that stated “no robots can harm humans…” Every AI systems must have these laws built in and there must be ways to enforce them. Any AI system failing the ethical tests will be banned. The time to it is NOW, when we humans still have control, before it’s too late.

youtube · AI Harm Incident · 2025-09-10T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzEY0yU1dzfb1R-aJZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzuu-STTy7jObsp-5N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugw3GeaR99a240lYSLt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzfz6ujfvlze9RAUgx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWcqfU3f-gqW2T16Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy7dE1_R27qI-Hj1MZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwV5VNsp7Qyg1cTkiZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnmxfSsvlqUV4ZdFt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzRN_kI3P5JdFnqQp94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy13_P-cEdTqIEhyEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
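The raw response is a JSON array of per-comment codes, one object per comment with the four coding dimensions. A minimal sketch of how such a response could be parsed and validated before storing it (the allowed category values below are inferred only from the samples shown on this page, not from the full codebook, and the row id in the usage example is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, rejecting rows with missing ids or unknown values."""
    rows = json.loads(raw)
    for row in rows:
        if not row.get("id"):
            raise ValueError("row is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} value {row.get(dim)!r}")
    return rows

# Usage with a hypothetical single-row response:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]')
codes = parse_codes(raw)
print(codes[0]["policy"])  # regulate
```

Validating against a fixed value set at ingest time catches the common failure mode where the model drifts from the requested labels (e.g. "corporation" instead of "company") before bad codes reach the results table.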