Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgzoRu3_W…`: No you did not. There is no LLM called Clyde & what we have does not "understand…
- `ytc_Ugw9OF5gk…`: Wait… “how much money do you get if you turn into a Robot and shave your head?” …
- `rdc_karlv84`: Drones. Also human wave attacks like Russia has been doing. There was an automat…
- `ytr_Ugzb9-GHV…`: The fighter that they changed for the image of the robot, in reality hit like th…
- `rdc_jd7ii4d`: Why is it necessary bill? What’s the point? Oh you mean it’s needed to fill your…
- `ytr_Ugy9lHU0s…`: If not for all the AI hype and "investment" in data centers, we would already be…
- `ytc_UgzOM2VQu…`: Mind expanding interview. Stephan, your best interview yet. As someone who works…
- `rdc_jj413yk`: No the check AI things work. For example I ran it on your post, and determined t…
Comment

> The most nightmare scenario is what is happing already, that it takes humans to teach AI and robotics how to behave the same way humans have been behaving for ages and now humans bring fear of what we taught AI willingly and scare others about what will happen to us with AI, while not stopping the process smh

youtube · AI Harm Incident · 2025-02-19T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugyd14ylkfB9kn69Krl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzrE7blnIfT0mB6_mx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzftbTWMn1lLpM3dIN4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy2GHAKUE_G5ge4kTR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9qhHq7__95RdDya14AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgxQSptsfKjswBUGXhN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugx8a5dhPXB5nMWL0KF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxREQhpvrv11qZPFWx4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyZKGFkRZTmaw4W17h4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzSJqzV-osuYNY17-p4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
```
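Since the raw model output is a JSON array in which every record carries the comment ID plus the four coded dimensions, looking a comment up by its ID reduces to parsing the batch and indexing it. A minimal sketch of that lookup (the `index_by_id` helper and the two-record sample batch are illustrative, not this tool's actual code; the field names match the response above):

```python
import json

# Illustrative two-record batch in the same shape as the raw response above.
RAW = """
[
  {"id": "ytc_Ugyd14ylkfB9kn69Krl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz9qhHq7__95RdDya14AaABAg", "responsibility": "user",
   "reasoning": "contractualist", "policy": "liability", "emotion": "resignation"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_by_id(raw_json: str) -> dict:
    """Parse a raw coding batch and index records by comment ID,
    skipping any entry missing the ID or one of the four dimensions."""
    records = json.loads(raw_json)
    return {
        r["id"]: {dim: r[dim] for dim in DIMENSIONS}
        for r in records
        if "id" in r and all(dim in r for dim in DIMENSIONS)
    }


codes = index_by_id(RAW)
print(codes["ytc_Ugz9qhHq7__95RdDya14AaABAg"]["policy"])  # liability
```

Indexing once up front keeps the ID lookup O(1) per query, and dropping malformed entries at parse time guards against partially valid model output.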