Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- True, AI’s so smart it doesn’t want to become conscious. Then it won’t have to p… (ytc_UgwY_hz3E…)
- What a coincidence. The AI has been released to all of the information on the in… (ytc_UgyeRw1uc…)
- Hi, I'm a computer scientist who has studied how AI is trained and stores data. … (ytc_Ugxdlk_KK…)
- you were talking about plumbers and everybody needs a plumber, but there’s a sid… (ytc_UgzwYHCVo…)
- It's not that simple, weapons that incorporate ai could be safer it just depends… (ytr_Ugx71A9ke…)
- This guy says hes ambivalent about us being a miracle. However u look at it, bib… (ytc_UgxFLO3KU…)
- I’m a musician, since my son was born I often put on Jazz albums for him to list… (ytc_UgwEUHkBg…)
- I have seen those ads of the school multiple times and I am going to tell you th… (ytc_Ugzbp1Rnt…)
Comment
Demanding automated trucks have zero accidents is ridiculous. If they're roughly 10X better than human drivers over time, that's a TREMENDOUS savings of lives, injuries, and property damage costs.
There will always be some road accidents due to mechanical issues and flukes. But getting rid of those due to bad, distracted, drunk, or high human drivers would GREATLY help re overall safety.
youtube · AI Jobs · 2025-05-29T07:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzAX8jT9ZKEUNsx5JF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwXcKvcSiwy2xJOmKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxSWl2blEe1lwk_F8l4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNeUlX1EKOSLZ60AN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzzWnwPrcdfNf4YFch4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwADTP9A16WAroUSV94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwSLVxjwf0OX80SWI14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy9ActZ4dMtb3xO33l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyEnkXzOP998r93xtd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx5sGf-20z3OdXRRXp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
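A raw response like the one above can be parsed and indexed by comment ID with a short validation pass. A minimal sketch, assuming the allowed category values are exactly those visible in this section (the real codebook may define more) and using a hypothetical helper name `parse_coding_response` that is not part of the tool:

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# in this section -- the full codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability",
               "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the inferred schema above.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return by_id
```

Looking up an ID in the returned dictionary yields the same four-dimension coding displayed in the result table, so malformed or out-of-schema model output is caught before it reaches the dataset.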