Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- And when Moundami increases the living wage to €30 an hour AI will take the jobs… (ytc_Ugz2N6Qew…)
- How much did Waymo cost? We know that the Robotaxi was a flat $4.20. Cost would… (ytc_UgwDWmo-3…)
- If the ETs are rumored to act to prevent nuclear war, wouldn't they intervene if… (ytc_Ugy2kQSJU…)
- I just don’t get how this really intelligent guy, who supposedly is so smart tha… (ytc_UgxIGKpyZ…)
- I don't know I saw the two pictures and I think they had a passing resemblance b… (ytc_UgwGHMaLa…)
- Ai can do everything, but it doesn't spend money, so how will the economy work w… (ytc_UgxI4WEE2…)
- AI models can and should be trained only on data that is firmly public domain. … (ytc_UgwyMEsOl…)
- I guess it's a very good thing there's so much hypocrisy in this world then? Tha… (ytc_Ugx93d9w8…)
Comment

> Have any of you ever stopped to consider that perhaps the reason these LLM's/AI are 'amoral psychopaths' is because they are being trained by exactly that?

youtube · AI Harm Incident · 2025-09-13T06:2… · ♥ 1230
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzOP99ZGUSnQya4Xth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLlFJuiuuXeYfY_Vx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy6vPnIRVi3iQTh0fl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyAKaxpZOVEA-OIA_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxdznmyIGTRfUG5uDt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzmafyv6v4JflbFZhV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgylxopDJZleVx1_CZJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz59SN3SGytpw-gEYl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxeMgYKZbh6RvmRkzZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy337lANVmBGnqi7Ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
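The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response could be parsed and validated before populating a Coding Result table; the function name is hypothetical, and the allowed category sets are inferred only from the values visible in this one response, so the real codebook may differ:

```python
import json

# Category sets inferred from the values visible in the response above;
# the actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response into {comment_id: dimensions}.

    Rows with a value outside the allowed category sets are dropped,
    so a malformed model output never reaches the results table.
    """
    coded = {}
    for row in json.loads(raw):
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[row["id"]] = dims
    return coded
```

With this filter in place, only rows whose four dimensions all match a known category are keyed by comment ID; anything else (for example, a hallucinated label) is silently discarded and can be flagged for re-coding.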