Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "My daughter too but the AI art lacks the emotion, the expressions...they can't s…" (ytr_UgyKjX596…)
- "1:07 why is hayao being dragged into this / also why is spirited away now the rall…" (ytc_UgziToi5I…)
- "sounds dangerous letting AI build itself... If everyone who becomes a plumber, c…" (ytc_UgwSbdIHM…)
- "Are these algorithms machine learning algorithms, like neural networks and whatn…" (ytc_UghETbDDL…)
- "Amazing tips! I use AI tools and post content to fanvue to see what sticks…" (ytc_UgytzjDU8…)
- "lmao this episode made me laugh so hard.... that freaking AI is like having the …" (ytc_Ugwqrlh6r…)
- "You clearly do not know what you are talking about, AI is not at all art, you ar…" (ytr_Ugz4ZLGJ_…)
- "BUT WHEN YALL COME TO AMERICA YALL THINK EVERYBODY ON DRUGS AND LOOK AT YALL N…" (ytc_Ugx1wvufl…)
Comment
I am so surprised by how many people in the comments section here are trying to "outwit" this dilemma situation. "Use the help of the other self-driving cars" or "why would it happen if the car itself is able to measure out a safe distance." Your thinking is too specific. Think in broad terms.
This type of accident can happen anytime and and any place despite the most advanced of technology. What if a random person happened to be riding his skateboard and rolled off onto the road due to an icy trail? Because humans are not without error. To get rid of error, you get rid of humans. So the smartest thing would be to tackle this problem through ethical questioning, not pinpointing flaws in a specific scenario.
Platform: youtube · Tag: AI Harm Incident · Posted: 2017-02-05T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UggPlXqhTyqn-HgCoAEC","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgijP7n1AYDAFHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjSelYS_yNxMXgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghtfnAXloUXangCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughv4M1zM_ZhFHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UggWc282B73l5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugir1uoAgHGQ63gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghiAb5OOQ50H3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghKAohdhKOGKHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjB5UYNyemZAngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
```
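The raw response is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of a parse-and-validate pass is below; the allowed value sets are only those observed in this sample, so the real codebook may permit additional values, and the function name is illustrative, not part of any pipeline shown here.

```python
import json

# Code values observed in this sample; the full codebook may include more.
OBSERVED_VALUES = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and report unexpected code values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED_VALUES.items():
            value = rec.get(dim)
            if value not in allowed:
                # Flag values outside the observed vocabulary for manual review.
                print(f"{rec.get('id')}: unexpected {dim}={value!r}")
    return records
```

A validation step like this is useful because LLM coders occasionally emit labels outside the instructed vocabulary; flagging rather than dropping them keeps the records available for manual review.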