Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by its comment ID.
Comment
If that self driving car was following any one of a number of "distance between cars" rules then it would have been able to stop in time so I'd say the first step is programming cars not to ride too close to large vehicles directly in front of them and then we'll sort out the ethics of actual impossible situations.
youtube · AI Harm Incident · 2016-01-19T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjQFdEz8fzO-ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UggF86o_OEFCZHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugh-bk-TAV7aFXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgileDub0CwddngCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjpqrVAg7rgYngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghQCXhv7515e3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgisOSWSkQ0bTXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
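The batch response above is a JSON array in which each element codes one comment on the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the two sample rows are copied from the response above; the helper names are hypothetical, not part of the actual pipeline):

```python
import json

# Raw batch coding response as returned by the model: a JSON array,
# one object per coded comment. Two rows copied from the response above.
raw_response = """[
  {"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer",
   "reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch coding response and index the rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
# Look up the coding for one comment by its ID.
print(codings["ytc_UgjwZCpf6uJ5EngCoAEC"]["policy"])  # → regulate
```

Indexing by `id` is what makes the "look up by comment ID" view possible: each coded dimension for a comment is one dictionary access away once the raw response is parsed.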