Raw LLM Responses
Inspect the exact model output behind any coded comment: look one up by its comment ID, or pick one of the random samples below.
- "Artificial Intelligence or AI will take over a lot of jobs, done so far by human…" (ytc_UgykAIQlK…)
- "If Ai is going to take everyone’s jobs, who will be able to buy anything?…" (ytc_UgyzAm5ud…)
- "I love the \"you not supposed to let the full self driving, self drive...der\" com…" (ytc_Ugy8UKcC3…)
- "If you use AI on your product, even if it looks fine from a distance, if I pick …" (ytr_UgzdP8d1q…)
- "Economic Systems are religions. AI becomes an apologist instead of a detached ob…" (ytc_Ugxl0nhdf…)
- "the guy is working on AI bias but is unable to see his own. it's kind of ironic.…" (ytc_UgyhE5_yB…)
- "You talk like this because you don't actually understand what LLM's are. Go educ…" (ytr_UgyASDDgh…)
- "Intense episode. It is scary. I am worried about it taking away jobs but maybe…" (ytc_UgwDQBDv1…)
Comment

> I feel this movie both overestimates and underestimates the capabilities of future autonomous cars. The complex analysis of who has a helmet etc. does not seem feasible, let alone analyzing the "worth" of the passengers. If the car was able to detect objects falling of the truck it would probably try to dodge them without hitting other vehicles. Of course, the ethical dilemma is interesting, but in the end I think the autonomous car would make a best effort decision to avoid threats.

youtube · AI Harm Incident · 2017-07-22T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UggJqTTxAgQpuHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggSR5TBSlFvAngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg3Ooi6amKBaHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiIyXvapa3ghngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg6S0mO_XY-BHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugj2e2pqV-vGsHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghPxCznJQ-7DHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggvYPle2Wkb6XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjlhktnJUrllHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggvUq6OXIbKO3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
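The raw response above is one JSON array per batch, with each element coding a single comment on the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed for the by-ID lookup — the function name `parse_codings` is hypothetical, not part of any tool shown here; only the field names come from the response above:

```python
import json

# The four coded dimensions, per the "Coding Result" table above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index the codings by comment ID.

    Raises ValueError if any record is missing an expected field,
    so malformed model output fails loudly instead of silently.
    """
    by_id = {}
    for rec in json.loads(raw):
        missing = REQUIRED - rec.keys()
        if missing:
            raise ValueError(f"coding for {rec.get('id')!r} is missing {missing}")
        # Store everything except the ID itself, keyed by the ID.
        by_id[rec["id"]] = {k: rec[k] for k in REQUIRED - {"id"}}
    return by_id

# Example using the first record from the response above:
raw = ('[{"id":"ytc_UggJqTTxAgQpuHgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codings = parse_codings(raw)
print(codings["ytc_UggJqTTxAgQpuHgCoAEC"]["emotion"])  # → indifference
```

Keying by comment ID makes the "look up by comment ID" view a plain dictionary access once all batches are merged.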