Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@Koinezar we are talking abt us making the art, not a robot. Using ai to make ar…" (ytr_Ugzrdsnm7…)
- "I hope AI bubble will burst and all the pc components prices will be normal agai…" (ytc_UgwayS_wK…)
- "Literally everything about the Intelligence Curse is a result of control of the …" (ytc_Ugzc0lDMs…)
- "Super intelligence is absolutely possible. But not with LLMs. And I'm not real…" (ytc_Ugy9j8o9Q…)
- "Ah yesss, creative jobs are soooo safe, not like AI could generate images or mak…" (ytr_UgzAEKgXo…)
- "I dont call myself an artist, but i like to get pretty images much cheaper and f…" (ytc_UgwqQFWme…)
- "An idea for employment would be to get everyone to be \" supervisors \" of said ai…" (ytc_UgxYbEoC-…)
- "There will be wars like you have never seen before AI will be destroyed and supe…" (ytc_UgzQy4yTf…)
Comment
That's not what is going on here. This is a worst of the worst case scenarios. Everything fails, and the car can only make one last decision.
This is a serious question programmers and self driving car designers have to worry about. Not everything can be accounted for, and not everything will work perfectly. Things fail, and the goal is to minimize such failures. However, there is always a possibility of absolute failure.
Just think about the Apollo missions for a second. They tried to account for every possible scenario, and yet, things still failed and unknown potential errors came into play.
youtube · AI Harm Incident · 2020-05-13T18:1… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgyF6F5mYMV-Pa5y5GZ4AaABAg.8hdOXOKKyqR8kbvmeVyUHC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwkhoJUXCOdqrrdO_F4AaABAg.8ZuDxkqG2qH98bK0NRx2AW","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytr_UgwE0M-nX5hQSlgJMIt4AaABAg.8ZLDyabu4ux9mLu7ZsOo7e","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-Fc9mLtvosPf4W","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-FcA1PIA3250ul","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8NwFFi8Pqfy","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OAIyM-R3lw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OASggjEx2E","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NolwAfwMh6","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NpIBpJMJ3W","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"}
]
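A downstream step has to turn a raw response like the one above into usable records. The sketch below is a minimal, hypothetical parser in Python: it assumes the LLM returns a JSON array of objects with an `id` plus the four coding dimensions shown in the table, and it drops any record whose values fall outside the codebook. The allowed-value sets are inferred from the samples above, not from the real codebook.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Hypothetical: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records with no comment ID to join on
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one valid and one out-of-codebook record (illustrative IDs):
raw = (
    '[{"id":"ytc_example1","responsibility":"developer","reasoning":"mixed",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_example2","responsibility":"martians","reasoning":"mixed",'
    '"policy":"regulate","emotion":"fear"}]'
)
print(parse_coding_response(raw))  # only ytc_example1 survives validation
```

Validating against a closed set like this is one common guard against the model inventing labels; an alternative is to retry the LLM call when any record fails validation.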