Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
By all means use your obvious skill and talent to create images manually. I don'…
ytc_UgyWtfFQ8…
What if an AI from another planet or dimension came into ours and is now forcing…
ytc_UgzArznNG…
When we go to AI and ask them to do what we want is fkn like a mosquito had some…
ytc_UgyOwhcSE…
Around 4:40 you mention that there would be an economic interest in torturing AI…
ytc_UgiVMj0Ws…
The new trend is to convince people that AI is big and bad. As if humans were an…
ytc_UgymwFq2x…
Not gonna lie I've experimented with CHATGPT and had full conversations too...it…
ytc_UgwLzaKj4…
you realize that if you use AI to get a degree, your degree will have no value a…
ytc_UgyzdfOe1…
half the time they are talking to some chud in india pretending to be ai.…
ytc_UgxZNpHBq…
Comment
This was a perfect storm of a complete failure by all 3 ... the pedestrian was walking right in front of a car at night, that the professional "safety driver" was not paying attention at the exact same time and so illegally relying 100% on the self driving technology which obviously still isnt capable to avoid all collisions, otherwise they would not have been out there even doing this testing... maybe next time they try out this test they should send the Uber CEO out in the dark to stand directly in front of the 40 mph self driving car, instead of trying to perfect it with ordinary citizen jaywalkers used as test dummies in real world conditions ? i think in time they can actually probably make it work by using their lidar and radar etc. because that can sense better in the dark and around corners and obstacles better than any person could with their eyes, but if the consequences in a failure are a bit higher for themselves that can help them to work harder on it too...
youtube
AI Harm Incident
2019-11-13T06:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
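A coded row like the one above can be checked against the label vocabulary before it is stored. A minimal sketch, assuming the four dimensions shown in the table; the value sets below are only the labels observed in the raw responses on this page, not necessarily the full codebook.

```python
# Labels observed in the raw LLM responses on this page; the real codebook may
# contain more values than these.
VOCAB = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def validate(coding: dict) -> list[str]:
    """Return the dimension names whose value falls outside the vocabulary."""
    return [dim for dim, allowed in VOCAB.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above, as a dict.
row = {"responsibility": "distributed", "reasoning": "mixed",
       "policy": "liability", "emotion": "mixed"}
print(validate(row))  # -> [] (all four values are in the observed vocabulary)
```

A non-empty return value flags a row for re-coding rather than silently accepting an out-of-vocabulary label.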
Raw LLM Response
[
{"id":"ytc_UgzIZWKU_29pIsbtnqZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwwv6PTvh9Toy8lGpR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwpFxWKuXLG7WTx4f14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxeYsPHofFY5shuHpV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyqChGPq8PNh3BxWVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw8uPE6BqQ9SuYoPxh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw08XGO72WtieHOMTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxELwttJKt8QfPXgOt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyG776EsfrVrjb8qex4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz1iM-qSU0GQXJxPD14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
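The "look up by comment ID" view above amounts to parsing a batch response like this one and indexing each coding by its `id`. A minimal sketch, assuming the field names shown in the JSON; `RAW_RESPONSE` here holds just one row from the array above, and the helper names are illustrative, not the tool's actual API.

```python
import json

# One row copied from the raw LLM response above (the coding that matches the
# result table: distributed / mixed / liability / mixed).
RAW_RESPONSE = """
[
  {"id": "ytc_UgxeYsPHofFY5shuHpV4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "mixed"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding by comment ID."""
    codings = json.loads(raw)
    by_id = {}
    for coding in codings:
        # Keep only the four schema dimensions; unknown keys are dropped and
        # missing ones default to "unclear".
        by_id[coding["id"]] = {d: coding.get(d, "unclear") for d in DIMENSIONS}
    return by_id

lookup = index_codings(RAW_RESPONSE)
print(lookup["ytc_UgxeYsPHofFY5shuHpV4AaABAg"]["policy"])  # -> liability
```

Because the model returns one JSON array per batch, indexing by `id` is what lets a truncated sample row on this page resolve back to its exact coded output.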