Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Humans were meant for more than just driving a truck. Let the A.I have these job…" (ytc_UgyFgvY9Q…)
- "im pretty sure this guy handed this robot an unloaded gun and add special effect…" (ytc_UgyQgxog1…)
- "What he said about China still doing it is right so we have to just have faith t…" (ytc_Ugz2Y8u--…)
- "In France we are already pioneers of this replacement by AI; here the …" (translated from French) (ytc_UgzLXRZqV…)
- "With A.I we are releasing the demon. Elon Musk. To educate the mind and not the H…" (ytc_UgxkVst3Q…)
- "AI art in general seems to have problems with lighting where it just makes every…" (ytc_Ugwa5YU87…)
- "Facebook algorithms were responsible for influencing our thoughts or takes years…" (ytc_UgxblqLS1…)
- "it terrifies me that insiders, including CEOs and other executive officials, exp…" (ytc_UgxVN8R5d…)
Comment
If at least most of the cars are self-driving, and have the ability to communicate, then why doesn't the car either quickly stop or suddenly stop? All the cars behind our car will stop (up to a point that depends on traffic), and no one gets harmed except maybe for those people without seatbelts behind us. So, the solution is simple: Cars communicating with all the others instantly, which we already do with instant messaging. Why not program it into a car to maximize safety?
youtube
AI Harm Incident
2017-07-18T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UggJqTTxAgQpuHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggSR5TBSlFvAngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg3Ooi6amKBaHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiIyXvapa3ghngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg6S0mO_XY-BHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugj2e2pqV-vGsHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghPxCznJQ-7DHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggvYPle2Wkb6XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjlhktnJUrllHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggvUq6OXIbKO3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
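The raw response is a JSON array with one coding object per comment, carrying the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step, assuming only the JSON shape shown above (the `lookup_coding` helper is hypothetical, not part of the tool):

```python
import json

# Abbreviated raw LLM response: a JSON array of coding objects,
# with the same fields as the full response shown above.
raw_response = """[
  {"id": "ytc_UggJqTTxAgQpuHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggSR5TBSlFvAngCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]"""

def lookup_coding(raw, comment_id):
    """Return the coding dict for one comment ID, or None if it is absent."""
    codings = {entry["id"]: entry for entry in json.loads(raw)}
    return codings.get(comment_id)

result = lookup_coding(raw_response, "ytc_UggSR5TBSlFvAngCoAEC")
print(result["reasoning"], result["emotion"])  # deontological approval
```

Indexing the array into a dict keyed by `id` makes repeated lookups O(1), which matters when inspecting many comments against one batched response.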