Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I've start to use Dream Studio a couple of days ago using my own images (photogr… (ytc_UgzxXaKTJ…)
- No it takes time and patience to develop your own unique art style. Anyone can d… (ytc_Ugy_EFYp_…)
- some people can only afford a free ai therapist 🤷♂️ any therapy is better than … (ytc_UgzUzW-do…)
- It’s not bc I want to protect the jobs of drivers — though I do worry about eve… (ytc_UgwywFrDW…)
- Corporate greed is what's dictating the continued advancement of open AI. So if … (ytc_Ugx5sPaaf…)
- "and should be planned for and managed with commensurate care and resources" for… (ytr_Ugws9LG6o…)
- Anthropic did the right thing by barring the Pentagon from using its weapons aga… (ytc_UgyXxJpKs…)
- We are not robotic we have empathy not engineered empathy like dumb LLMs. Humans… (ytr_Ugy-siT7x…)
Comment
But if self driving is controlled by computers, then shouldn't it be programmed to the most safest feature of all? You know, literally following the road rules? Trying to find a way to have zero fatality on the road?
Even if something unexpected happens, shouldn't the computer already what to do best by getting no collision? since self taught computer is on a rise and other feature such as thermal vision or sensors should have been put to self driving
If there is something wrong happens or an accident happens, it could probably from the orders of the owner the car since the only thing abides the rules of robots (or computers for that matter) is humans (or someone hacked the car since hacking cars is already possible in this timeline anyway)
Platform: youtube | Topic: AI Harm Incident | Posted: 2015-12-15T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
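A coding result like the table above can be held as a small typed record. This is a minimal sketch; the class and field names are assumptions based only on the dimension labels shown on this page, not this tool's actual internals:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class CodingResult:
    """One coded comment, mirroring the dimension table above.

    Field names follow this page's labels; the real storage schema
    may differ.
    """
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime


# The exact values from the table above.
result = CodingResult(
    responsibility="developer",
    reasoning="deontological",
    policy="regulate",
    emotion="approval",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
print(result.policy)  # regulate
```

Freezing the dataclass keeps a coded record immutable once it is written, which matches the inspect-only nature of this view.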
Raw LLM Response
```json
[
  {"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj8lU9CWdFWf3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggYKBs14QZPoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi9xnzyNGEqq3gCoAEC","responsibility":"society","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugim4SKNBlRtfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjXjm0R3slUzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghxbQR1FcrERngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgipNWetGSuz7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
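A batch response in this shape is easy to validate before accepting the codes. The sketch below checks each record against the category values visible on this page; the real codebook likely defines more values (e.g. additional responsibility targets), so the allowed sets here are assumptions drawn only from the samples shown:

```python
import json

# Allowed values per dimension, taken from the categories visible in
# this page's samples; the actual codebook may define more.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "society", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "industry_self", "none"},
    "emotion": {"approval", "indifference", "fear", "outrage"},
}


def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records


# One record copied verbatim from the batch above.
raw = ('[{"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
print(validate_batch(raw)[0]["emotion"])  # indifference
```

Failing loudly on an out-of-vocabulary value catches the common failure mode where the model free-texts a category instead of picking from the codebook.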