Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- When a robot gets “scared” it won’t have a pounding heart and it won’t be sweati… (ytc_UgxO18pV5…)
- Dude we need a talk why are you blowing up anything you see in this ai app… (ytc_UgyrGMAmr…)
- Catching child pornographic stuff or certain individuals should be people not AI… (ytc_UgwiAvXha…)
- Introducing the 'Artificial reproductive system' upgrade that allows your robot … (ytc_Ugw6whUbo…)
- Even if you "learn to draw" it would still take you 6 hours to create something … (ytr_Ugz87xuaM…)
- I genuinely think using AI for anything creative, be that painting, writing or p… (ytc_UgwRlqTpq…)
- The thing is, ai can’t do it now. But give it time. But for me, I always thought… (ytc_Ugx5p2gQs…)
- Well probably put a human consciousness in a machine via neural link before crea… (ytr_UgyZFVurZ…)
Comment
The one to blame is the careless person who was suppose to secure the truck load. On top of this, what if people exponentially processes information faster than their physical speed? Meaning they see the logs falling out and know it's about to hit them with plenty of time to react but is also boxed in. What are they suppose to do? Whatever decision it is, it won't be more random than the decision of a self-driving car.
youtube · AI Harm Incident · 2017-06-24T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugjp-pcf8PXx8HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiIjTXIJ5B6wHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgicjfJMB8sNk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugghpi3fGQwm63gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghvDWbrWZGpnngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugjd8L8jNpzFtHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghC4VdT2HTcDHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugg4Bv06oKkXP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghZnL5q_KLZvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggPbviiDUEtwHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
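A raw response like the one above is only usable if every record carries a valid value on each coding dimension. Below is a minimal validation sketch; the allowed value sets are assumptions inferred from the labels visible on this page (the actual codebook may define more values), and the sample `raw` string is a hypothetical single-record response.

```python
import json

# Allowed values per dimension, inferred from this page's coding table and
# raw response. ASSUMPTION: the real codebook may include additional labels.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "industry_self", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and keep
    only records whose value on every dimension is in the allowed set."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_batch(raw)))  # → 1
```

Dropping invalid records (rather than raising) lets a batch run continue when the model occasionally emits an off-schema label; the rejected IDs can then be re-queued for recoding.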