Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Didn't God make us in his own image? We are making our gods of tomorrow, be car…" (ytc_Ugz6atQs2…)
- "Treat AI as if it were our child. That solves all long-term problems. Beyond t…" (ytc_Ugyz2ZcL7…)
- "So we know that its gonna give problems and gonna take us over in the future (…" (ytc_UgxYt1u8N…)
- "And our power grid is quite redundant, and is inching ever-closer to relying on …" (ytr_UgwvNlhoJ…)
- "I understand the hate artists have with AI, but i think is not that bad for many…" (ytc_UgxB00dJQ…)
- "If Bradley Cooper committed a crime i would be named as the suspect through faci…" (ytc_UgzJXpYNA…)
- "I'm going to say something Picasso said and please don't take offense: Good arti…" (ytc_Ugz2mEsFh…)
- "Thank you for sharing your perspective. In the context of AI, it's fascinating h…" (ytr_UgyPnFDXX…)
Comment
it was not auto-pilots fault, it was the drivers fault. as far as i know, it is still not allowed in the law to be off the driver's seat and sleep.
i.e. in an airplane, pilots and co-pilots are not allowed to leave while on their duty. someone has to keep watch. same with vehicles, just because you own a tesla or any sort of fsd vehicle, doesn't mean you are now allowed to leave your duty as a driver and become a passenger.
auto-pilots still need to have a human intervention in times of situation like this. if the driver has been aware in this, he/she could just stepped on the brakes.
youtube · AI Harm Incident · 2022-09-04T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwkNLEsJJlkcW95_1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxZCzeLWRYK1ZeM1sp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOLC4MeOt0846tv6p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyReFE13Esonf8RuE94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw3lyrLvRf2V9PiYUt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzbKsiufsyPpWRJc0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKWUxBNKb_E0-fvF14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3azlfIiuaIk7mJUV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugwixm2Q69mBUuzA1oR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwYH_sxlT-cR7eM9Bh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
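A coded record like the table above can be recovered from the raw response by parsing the JSON array and checking each dimension against the codebook. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this output, not from the project's actual codebook, and the two sample records are copied from the response above.

```python
import json

# Assumed label sets, inferred from the values observed in the raw
# response above -- the real codebook may allow more values.
ALLOWED = {
    "responsibility": {"company", "user", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "resignation"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
    return records

# Two records copied from the raw response shown above.
raw = '''[
 {"id":"ytc_UgwKWUxBNKb_E0-fvF14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwkNLEsJJlkcW95_1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

coded = parse_coding(raw)
print(len(coded))  # 2
```

Failing fast on an out-of-vocabulary label is what lets a batch coding run distinguish a malformed model response from a legitimate code, instead of silently writing bad values into the results table.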