Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @RizzlerRecordsOfficial That would work if all the nations did it at once. They'… (ytr_Ugz8cdRJr…)
- Comparing this to the Nth rooms is down right disrespectful to the victims of th… (ytc_Ugwx_wPTe…)
- sometimes I wonder if these ai art stands even watch the video they are disagree… (ytc_UgxwCMHh7…)
- @gilemee i think their point is, claiming you’re a chef because you ordered food… (ytr_Ugz2wqsEJ…)
- Mfw the "Bad beginner art" is still 20x better than mine (I still don't use AI b… (ytc_UgzRn3RYi…)
- AI is going to assist in improving technology until it no longer requires massiv… (ytr_Ugyxrenfd…)
- If I wanted to represent myself, I'd use AI to compile previous cases similar to… (ytr_UgwUz3Den…)
- florianschneider3982 People use AI too much and make themself even more stupid … (ytr_Ugxl35wFe…)
Comment
This is why self driving vehicles of any kind is a bad idea. You can't program a fear of death, self preservation, or love and concern into a machine. Real live people use logic, reason, facts, as well as reactions and emotions to make decisions. Machines would be poor decision makers during emergencies. They just don't have the ability to understand these concepts.
Or you could strap a rocket engine to a car! If there's trouble ahead, don't drive into it, take off straight up and avoid it! I'm kidding...or am I...I don't know what the answer is. I just don't feel comfortable with automated transportation.
youtube · AI Harm Incident · 2015-12-09T20:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
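For downstream analysis it can help to treat each coding as a typed record. A minimal Python sketch, where the field names mirror the table above and the example labels in the comments are only those visible on this page (the full label sets are an assumption):

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coded comment, mirroring the Coding Result table above.

    The example labels in the comments are the ones visible on this
    page; the real codebook may define additional categories.
    """
    id: str              # comment ID, e.g. "ytc_Ugj-Xh3Fxwz1RXgCoAEC"
    responsibility: str  # "none", "user", "ai_itself", ...
    reasoning: str       # "consequentialist", "deontological", "virtue", ...
    policy: str          # "none", "ban", ...
    emotion: str         # "approval", "indifference", "outrage", "fear", ...
    coded_at: str        # ISO 8601 timestamp, e.g. "2026-04-27T06:24:59.937377"
```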
Raw LLM Response
```json
[
{"id":"ytc_Ugg0f6gYDoM2u3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg8lvN9vbzqmngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiOKnl1PCwC5XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghcyXISlo02pngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi2qolU0hnvF3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghXbi7fYusSUngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj2jxTP45dbIngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiMqqVdmeG89HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj-Xh3Fxwz1RXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugh3GHPd7ug6D3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
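Because the model returns one JSON array per batch, with each object keyed by comment ID, the ID lookup above is easy to reproduce offline. A minimal sketch, assuming the raw response has been saved to a file named raw_response.json (a hypothetical path):

```python
import json

# Load one saved raw LLM response (hypothetical file name).
with open("raw_response.json", encoding="utf-8") as f:
    codings = json.load(f)

# Index the batch by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in codings}

# The comment shown above resolves to its coding result.
coding = by_id["ytc_Ugj-Xh3Fxwz1RXgCoAEC"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# -> ai_itself ban fear
```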