Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyPbbJEM… — "By 2040 Robots and AI will replace 60% of jobs in the USA - I think when they pe…"
- ytc_Ugxy4uAGg… — "Hey, dude troIIing about ugly people "not having to worry about it" is funny, ev…"
- ytc_UgyS-nT99… — "I want this guy on the management team for AI development. You can tell how pass…"
- ytc_Ugx5pOpaz… — "Nah, an AI artist is like someone describing a food, then a chef makes it and no…"
- ytc_UgzhnEYY9… — "They're also training A.I to play games better than us. So what's left for us a…"
- ytc_UgyMhH32U… — "Using AI to lower overhead expenses, but lowering purchasing power of millions w…"
- ytc_UgzN2WlEL… — "If all of our jobs are taken by ai / robots, people will have no income to buy t…"
- ytc_UgyJX9rIh… — "it struggles with a few things if you're only prompting in Suno.... but if you c…"
Comment
> Many of these thought experiments of 'should the car prioritse the passenger safety or safety of others' are usually moot from a engineering stand point. The software would AVOID getting into these sorts of situations where it can't stop in time in the first place. Assuming all cars on the road are all autonomous, many accidents and sub accidents (i.e cars being tail ended when the suddenly stop to avoid something) would simply not happen as autonomous cars are AWARE 100% OF THE TIME about EVERYTHING 360 degrees around the car. NO human can ever be like that. Thus all autonomous cars would leave a nice bubble between each other where they would have enough time to stop at a certain speeds instead of trying to drive agressively and tailgate each other like some humans do.
Source: youtube · AI Harm Incident · 2017-06-23T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi9KaGK7Pz36HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugg9VdcEV0AJu3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiMecIFIV9nLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugh2ii7t431fIXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggBZZ06kKai4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
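The raw response above is a JSON array of per-comment codings keyed by comment ID, one object per comment with the four dimensions shown in the Coding Result table. A minimal sketch of the lookup this view implies — parse the array once, then index by ID — assuming the function name, the fallback value `"none"`, and the two-row sample input are illustrative, not the tool's actual code:

```python
import json

# The four coded dimensions seen in this dump (the real codebook may define
# more values per dimension than appear here).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of codings) into a dict keyed by comment ID."""
    out = {}
    for row in json.loads(raw):
        # Keep only the coded dimensions; ignore any extra keys the model emits.
        out[row["id"]] = {dim: row.get(dim, "none") for dim in DIMENSIONS}
    return out

# Two entries copied from the raw response above, used as sample input.
raw = '''[
  {"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]'''

codings = index_codings(raw)
print(codings["ytc_UghemGmrqvMpTHgCoAEC"]["policy"])  # → regulate
```

Because the model's IDs come back in the response itself, a lookup like this tolerates the model reordering or dropping comments; missing dimensions fall back to `"none"` rather than raising.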