Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples, listed by comment ID with a text preview:
- `ytc_UgyRG_Ysm…` — "If you work as an artist in industry (film, games etc.), you might be better of …"
- `ytr_Ugw3TnWgm…` — "Shit will become "I thought we had left the "Ai art is not real art" argument ba…"
- `ytc_Ugzm3Htxf…` — "It literally has to be a Tesl game simulator, and say that it helps driving for…"
- `rdc_jhdnmpt` — "I should be asking to a Swift (iOS programming language) specialist or learn by …"
- `ytc_Ugx2souTp…` — "13:58 A similar blow up happened in the writing community in 2023. Kim Palacios …"
- `ytc_UgxbG0dCF…` — "I studied web design and development but soon after my studies all the html, cs5…"
- `ytc_UgzGuTuL4…` — "I don't see why they can't just approach it humbly and without pretension. You c…"
- `ytr_Ugx1VRj4Z…` — "@RentAMan-l1bof course but this is why these huge corporations are fighting Ai b…"
Comment
Should be automatic £1 billion dollar compensation for the nearest and dearest of anyone killed by AI, plus automatic life sentence for any driver that allows their car AI to kill someone, plus the road laws should change so that is is automatically the fault of the following vehicle's driver if they hit anything whilst travelling forward. Tesla should also pump millions into free motorcyclist education schemes.
youtube · AI Harm Incident · 2022-09-04T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyoccfQ828_KCVGAmJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyqumgtzbF2qq-CjNR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwokRuJt991aa0Hrs54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfuPPCBHNX8x42XzJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwDD6K8Jfjo6cKnK2N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwv7AAmmhE2B_cwOpp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyn4CMXhWxCTgtDxux4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxNr3v69V9BOUM4S9h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzncvP7W909pdLQbNZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw4WwymQMDmlQCTmkR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
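The raw response is a JSON array with one record per coded comment. A minimal sketch of how such a batch might be parsed and validated before rendering a per-comment coding table: the `SCHEMA` value sets below are inferred only from the values visible in this sample, not from the tool's actual coding scheme, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (an assumption; the real coding scheme may include more values).
SCHEMA = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"approval", "outrage", "indifference", "mixed",
                "resignation", "fear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if cid is None:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# One record from the raw response above, used as a small example.
raw = ('[{"id":"ytc_Ugwv7AAmmhE2B_cwOpp4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = validate_batch(raw)
print(coded["ytc_Ugwv7AAmmhE2B_cwOpp4AaABAg"]["policy"])  # liability
```

Rejecting off-schema values at parse time catches the common failure mode where the model invents a label outside the codebook, rather than silently storing it.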