# Raw LLM Responses

Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
## Random samples — click to inspect

- "When AI hallucinates, that humans want to terminate it, self preservation might …" (`ytc_UgxeaF2Jk…`)
- "The same problem occurs if you ask ChatGPT to write an academic paper. It will j…" (`ytc_UgzBSQmID…`)
- "@kentbryanfajardo8698 Thanks for commenting! You've raised an interesting questi…" (`ytr_UgzHH944y…`)
- "This is 100% gaslighting. And the news anchors are clowns who are full of it an…" (`ytc_Ugxr56zYb…`)
- "It is too late. We are stuck in a terrible combo of tragedy of the commons and p…" (`ytc_UgyxJyoFq…`)
- "The real AI impact is barely even here yet... Robots and physical systems will…" (`ytc_UgyMJ23ui…`)
- "I've been trying to tell my artist friends about poisoning their art. They are …" (`ytc_UgyTadxA7…`)
- "AI is my new ART SLAVE, it's not going away. AI cannot pick up a paintbrush or a…" (`ytc_Ugx3S-3rU…`)
## Comment

> I don't understand why a self-driving car would put its self into this position where it would be at risk. Surely it would be keeping a safe distance at all times. Also cars would be communicating with each other. Shouldn't the other self-driving car in this case the SUV at the speed of light be sped up and the car behind it slow down so the car boxed in can quickly shift across?
>
> I often think about this with the dilemma if you crash into the barrier and kill the passengers or plough through the pedestrians I don't understand why a car in adverse conditions would be driving in a way that it couldn't stop in time for pedestrians. Surely the car would be connected to a wider network and could be notified of people about to cross or approaching the junction well ahead of its approach. These 'problems' seem un realistic in smart connected world.
Source: youtube · Topic: AI Harm Incident · Posted: 2016-10-03T15:4… · ♥ 48
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
[{"id":"ytc_UggGbCadCaWMLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghgNXWdxUEUbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiY0rp6X60sNXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjhM8_2JUdcMngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgibXv0DjUL4p3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi2n3MZW0U64XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGxQokFSrGX3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghyNEJaZudhtngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghLzE_FkYBRVngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugiq79dnXIg7pngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
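The raw response is a JSON array of per-comment records. A minimal sketch, assuming only the field names visible in the response above, of how such output might be parsed and looked up by comment ID, with any missing ID or dimension defaulting to "unclear" as in the coding table:

```python
import json

# Two-record sample in the same shape as the raw response above
# (IDs and values copied from it; the full array is longer).
raw_response = '''[
{"id":"ytc_UggGbCadCaWMLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghgNXWdxUEUbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the raw LLM response and map comment ID -> coded record."""
    return {rec["id"]: rec for rec in json.loads(raw)}

def lookup(codings: dict, comment_id: str) -> dict:
    """Return the coding for one comment; every dimension falls back to
    'unclear' when the ID (or a field) is absent from the response."""
    rec = codings.get(comment_id, {})
    return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}

codings = index_codings(raw_response)
first = lookup(codings, "ytc_UggGbCadCaWMLHgCoAEC")
missing = lookup(codings, "ytc_missing")
```

Falling back to "unclear" rather than raising on a missing ID matches what the table above displays when a lookup finds no coding for the inspected comment; the exact fallback rule used by the tool is an assumption here.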