Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If all the vehicles in the group were self-driving cars, every one of them would detect the others' actions as a threat, and some sort of avoidance would evolve out of whatever they do. And if they can communicate with each other, then some kind of group effort can lead every one of them to safety.
In the worst case, exactly as described, your vehicle is the only self-driving car. This car would know exactly the distance to and velocity of the rear car, and would know changes in these variables in real time. I think the better solution would be to hit the brakes in such a way that the relative speed to both threats (front and back) at collision is reduced to a minimum at the time of impact.
Source: youtube | Incident type: AI Harm Incident | Posted: 2015-12-09T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
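A coding result like the table above can be retrieved by indexing the raw LLM response by comment ID. A minimal sketch, assuming the response is a JSON array of records shaped like the ones shown on this page (the IDs below are copied from that response):

```python
import json

# Two records copied from the raw LLM response shown on this page.
raw_response = '''[
  {"id": "ytc_UgiRW9mWll7FTHgCoAEC", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UghZ2CeGeDq4y3gCoAEC", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

# Index records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment, or None if it was not coded."""
    return by_id.get(comment_id)

row = lookup("ytc_UghZ2CeGeDq4y3gCoAEC")
```

This is an illustration of the lookup, not the dashboard's actual implementation; a missing ID simply returns `None` rather than raising.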
Raw LLM Response
[{"id":"ytc_Ugj-WH6OpZhDSHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQavChndvc5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghZ2CeGeDq4y3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg8HfmGm2p6hngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UggesFpy1EznlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiRW9mWll7FTHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgifUAfLDoDb23gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg_yjdSah1yH3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjkwzfB0yQ1NngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh_9XnDJVggxngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
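Before ingesting a raw response, each record can be checked against the expected value set for each dimension. A minimal validation sketch, assuming the allowed values are those observed in the responses above (inferred for illustration, not an authoritative codebook):

```python
import json

# Value sets inferred from the responses shown on this page (an assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage",
                "resignation", "mixed", "unclear"},
}

def validate(raw):
    """Parse a JSON array of coded records and split them into (valid, errors)."""
    valid, errors = [], []
    for rec in json.loads(raw):
        bad = [dim for dim in ALLOWED if rec.get(dim) not in ALLOWED[dim]]
        if bad:
            errors.append((rec.get("id"), bad))
        else:
            valid.append(rec)
    return valid, errors

# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgjkwzfB0yQ1NngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
valid, errors = validate(raw)
```

Note that `json.loads` requires the response to be a well-formed JSON array; an unbalanced closing bracket would raise `json.JSONDecodeError` before any records are checked.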