Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "The end has already arrived. So handing the robot the gun was the correct decis…" (ytc_UgxBo6o7S…)
- "One of the best values AI provides is that it can exploit holes in your understa…" (ytc_UgyrPEeMH…)
- "And the Big Beautiful bill had the statement that would outlaw regulating AI for…" (ytc_Ugxgv_Ojf…)
- "Ai is what artist don't want. But the normal person doesn't care on the end if a…" (ytr_UgwRGKVYA…)
- "I tried so hard to make an ai believe he was master chief once 😢 It never worked…" (ytc_Ugw3S1bU-…)
- "52:39 A robot vacuum doesn't have to WANT to spread crap all over the floor, it …" (ytc_UgwpXS7IE…)
- "AI can't do anything in depth. If it goes beyond 10k lins of code, it starts to …" (ytc_UgyOIhR3N…)
- "Now this needs to stop. Legal representation needs to be available at every pol…" (ytc_UgwbcVIZK…)
Comment
I know promoting this would be like promoting Tesla, but for our own safety, it would be cool to be able to buy a device that you put on your card and tells a Tesla, "here I am" so in case AI fails to "see" my car is telling here I am, and you are approaching me. That would avoid accidents, at least with those who would be willing to buy such devices. Nowadays the incentive is not too big because not many people have a Tesla, but something universal that can work with all "autonomous" cars would be cool.
youtube · AI Harm Incident · 2022-09-03T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxGJCM6NDC6WB6HJgJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw1PxPp8WhI8S2BySl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugynlz0ESnBG_vMHShR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwfBQ1xDz57aewCYzZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKfHN5op6OBlnVP-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtJbM145M1b0Hv7_14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwaabtYzjBDuK0qNgV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxZWHldVY7KFwd_N6J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwv1VtlscWr037UtDl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFcwFhYlDotQiyX-F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
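Model output in this shape is easy to sanity-check before it reaches the coding table. The sketch below is a minimal validator, assuming the dimension names and value sets observed in this batch (the full codebook may define additional values); the `CODEBOOK` dict and `validate` helper are illustrative, not part of the tool itself.

```python
import json

# Allowed values per dimension, inferred from this sample batch (assumption:
# the real codebook may be larger).
CODEBOOK = {
    "responsibility": {"company", "ai_itself", "distributed", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "liability", "none"},
    "emotion": {"approval", "outrage", "resignation", "indifference", "fear"},
}

def validate(records):
    """Return (comment_id, dimension, bad_value) for every out-of-codebook
    or missing value; an empty list means the batch parsed cleanly."""
    problems = []
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append((rec.get("id"), dim, value))
    return problems

raw = ('[{"id":"ytc_UgxGJCM6NDC6WB6HJgJ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(validate(json.loads(raw)))  # [] — every dimension is in-codebook
```

A check like this catches the common LLM failure modes here: a hallucinated category label, a dropped field, or a value copied from the wrong dimension, each of which would otherwise flow silently into the coded dataset.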