Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "your friends are making plans for the weekend" Google ai make me a image of lam… (`ytc_UgyRe8hQv…`)
- I am glad the professor is offering to give practical feedback. You should take … (`rdc_nu0k5j3`)
- These big companies don't care for the people they want the money so of course a… (`ytc_UgxpkfURy…`)
- Actually real artists are usually pretty hard on themselves rather than each oth… (`ytc_Ugyuf9XkV…`)
- I'm not really believing all this hype. They haven't even been able to build a s… (`ytr_UgyupbznF…`)
- If it is walking your dog and someone tries to steal the robot what happens?… (`ytc_Ugzao5DfX…`)
- Ah, yes, the new Torment Nexus (tm) 6.0, coming soon from the merger of OpenAI/M… (`ytc_Ugza6nUEu…`)
- The actual answer to "does A.I. think?" If the AI we're referring to are the cur… (`ytc_Ugx-JEluB…`)
Comment
This is probably why, self driving cars need more than cameras, cameras are not enough because they can't see in the dark... Actually if you look close enough right as the lady came in the car's headlights the car noticed something but didn't know what and started moving to the right and hit her. Yes the person behind the wheel did not pay as much attention as it might of needed during the night drive, but after some time of being a "driver" of a self driving vehicle it becomes boring just sitting there and looking out the window all the time, because nothing actually happens, the car drives itself, until now. So, to summarize, the car needs more technology which helps it see in the dark, people behind the wheel need to pay more attention during more dangerous situations like night driving.... In my opinion I think this situation might've been prevented but you would've needed lightning fast reaction from the driver. And people need to be more self aware and not cross the road in the darkest spots available..
youtube | AI Harm Incident | 2018-03-23T16:2… | 1 like
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugxy01VX_8QwXy9_57V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzP2TOIEQNrkwHotDp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwjt5cv4iPRLp6BKrF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyUUkGKDz2ID-KRa1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzSzbckQWJHdkyLy0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkBaLtQi5J43dOVjF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz0hN4KTF0haVONTDJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwvRS-7NO2QZfGHuJ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz2x7umeeGIBkOO7Sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQLYrOrlprqGY7tD94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
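Each record in the raw response assigns one categorical value per dimension. As a minimal sketch of how a downstream consumer might sanity-check such a response, the helper below parses the JSON array and flags any record whose value falls outside a known set. The allowed sets here are an assumption inferred only from the values visible in this sample; the actual codebook may define additional categories.

```python
import json

# Assumed category sets, inferred from the sample output above;
# the real codebook may include values not seen in this batch.
OBSERVED = {
    "responsibility": {"ai_itself", "distributed", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"unclear", "liability", "regulate", "none", "industry_self"},
    "emotion": {"indifference", "outrage", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and return records with unexpected values."""
    problems = []
    for rec in json.loads(raw):
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                problems.append(
                    {"id": rec.get("id"), "dimension": dim, "value": rec.get(dim)}
                )
    return problems
```

A clean batch returns an empty list; a record coded with, say, an unrecognized `responsibility` value comes back as a problem entry with its comment ID, so it can be re-queued for recoding.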