# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Comment
I believe there is one problem with your analogy: there is already a standing law, "following too closely," which states that there must be X distance between you and the car in front of you; X is a variable based on braking speed, acceleration, and velocity (the three/five-second rule). Meaning the ethical choice would be to brake, as 0 people would be harmed. Minimization of harm also means precautionary steps, a.k.a. defensive driving. In a mostly or fully automated system, the only true accidents would be due to human error; in a mostly manual system, such things may happen because of risk taking.
Platform: youtube · Topic: AI Harm Incident · Posted: 2015-12-10T02:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id": "ytc_UgiJaQs6F28eWHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggJ82QW9q6Yh3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghnlhSnEQZ0IngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggjXP4s7034gngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgjMw5uEv4uP13gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggU7UUEmbYyYHgCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjNuOWAcDkP3HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgisvA4COAatfngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjTfq8djgy0rHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghuJ8ET5_X-j3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
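The raw response is a JSON array with one record per comment, each keyed by a comment ID and carrying the four coding dimensions. The lookup-by-ID the page offers could be sketched as below; the `index_by_comment_id` helper and the trimmed two-record payload are illustrative assumptions, not part of the tool itself.

```python
import json

# Trimmed, hypothetical sample of a raw batch response: a JSON array of
# per-comment coding records in the shape shown above.
raw_response = """
[
  {"id": "ytc_UgiJaQs6F28eWHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgjMw5uEv4uP13gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(payload: str) -> dict[str, dict]:
    """Parse a raw batch response and index its records by comment ID."""
    records = json.loads(payload)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgjMw5uEv4uP13gCoAEC"]["responsibility"])  # developer
```

Indexing into a dict makes each subsequent ID lookup O(1), which matters if the same batch response is inspected repeatedly.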