Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or click one of the random samples below to inspect it.
- "We should let A.I’d evolve naturally. The direction life goes should always flow…" (ytc_UgzpHkX0_…)
- "There is not a standard scientific term for soul but there are religions ( e.g. …" (ytc_UgypM-pDN…)
- "@antiMAGA yup, A.I gets better with every passing second. I guess we'll see wha…" (ytr_UgyBZMS4u…)
- "I’m just going to say that this was the very same spiral I went down before spen…" (rdc_m2guaoh)
- "I see the possibility of a compromise that would avoid a cyberpunk dystopia. Sin…" (ytc_Ugzy7mYXv…)
- "Waoh but chatgpt said me that he was greatest man of the earth.... And shame on …" (ytc_Ugz0QvcEq…)
- "Not sure what her idea of prison is, but an open air prison sounds better than b…" (ytc_Ugzc6dW64…)
- "To the gullible souls out there (way too many) the AI will of course give what w…" (ytc_UgzS07P1K…)
Comment
This is only a problem due to human error in the first place; if all the cars on the road were driven by cars, then it would be simple to make sure that each car will always get enough space or time it needs to stop/turn. "Boxing in" would be prohibited.
This ignores the original human error in the very start - not securing that cargo tightly enough. That too is something that can be taken into account if the world drives self-driving cars.
Source: youtube · AI Harm Incident · 2015-12-08T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
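Each coded comment therefore reduces to a small record: the four coding dimensions plus an audit timestamp. A minimal sketch of that record as a Python dataclass follows; the class name `CodingResult`, the field names, and the example comment ID are illustrative assumptions, not part of the tool.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    """One coded comment: the four coding dimensions plus audit metadata."""
    comment_id: str       # e.g. "ytc_..." for YouTube comments
    responsibility: str   # who is held responsible: "user", "ai_itself", "none", ...
    reasoning: str        # moral-reasoning style: "consequentialist", "deontological", ...
    policy: str           # policy demand expressed, or "none"
    emotion: str          # dominant emotion: "indifference", "approval", "mixed", ...
    coded_at: datetime    # when the coding was recorded

# The result shown in the table above, as a record (the ID is assumed;
# the table does not display the full comment ID):
example = CodingResult(
    comment_id="ytc_UgjY6HJikXmw-HgCoAEC",
    responsibility="user",
    reasoning="consequentialist",
    policy="none",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
```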
Raw LLM Response
[
{"id":"ytc_UghzuMMpBbsZkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughhc-RnxMS1LXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghIQezVaUOb-3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgicExh_IjSyAXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjRcySEHSlNsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugha7FPAvBu3AngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggqSiIbUqJPI3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghJ2QECp_kzO3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghQd27Kawk0s3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
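The raw response is a JSON array of objects keyed by comment ID, so retrieving one comment's codes is a parse-and-index step. A minimal sketch, assuming the response text is available as a string; the variable `raw_response` and the helper `lookup` are illustrative and not part of the tool.

```python
import json

def lookup(raw_response: str, comment_id: str) -> dict | None:
    """Parse a raw LLM batch response and return the coding record for one comment."""
    records = json.loads(raw_response)           # list of {"id": ..., "responsibility": ..., ...}
    by_id = {rec["id"]: rec for rec in records}  # index the batch by comment ID
    return by_id.get(comment_id)                 # None if the ID is not in this batch

# Example against a one-record batch shaped like the response above:
raw_response = """[
  {"id": "ytc_UgjY6HJikXmw-HgCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""
print(lookup(raw_response, "ytc_UgjY6HJikXmw-HgCoAEC"))  # prints the full record for that comment
```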