Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
watching his after my professor used AI to grade my paper on the dangers of gene…
ytc_UgzPbe-CS…
And who you are going to blame if AI will make a mistake on you?…
ytr_UgwCZNs7F…
It doesn't cost much to show a bit of kindness. Machine or human, it seems right…
ytc_UgzWkyW41…
Cemetery business is where to go. A lot of emotions to deal with robots I don't …
ytc_UgzHFq0At…
You used AI to tell us that AI will destroy us all.....Why would you do that.…
ytc_UgyaL_9AL…
AI data centers consume a significant amount of electricity. AI requires substa…
ytc_UgzvSX5_8…
EU should have already done a direct military intervention on in defense of Ukra…
rdc_mcq5y5s
We got to see Bernie back in the day when Ben & Jerry were still making ice crea…
ytc_UgwaBo7SB…
Comment
This Sam Altman guy and those like him who are looking to evolve their technology for a bigger payday should be held responsible for this AI chat crap that is being prioritized before actual life. These parents and friends are not able to know all the secrets of what's going on with every detail of the ways things are progressing in technology while regressing in real life. Accountability and better guidelines have to be initiated, or this will be completely out of control and out of anyone's hands.
youtube
AI Harm Incident
2025-08-27T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzw--WFQw5FAbd2S-h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx_WTSCjlFZP3ljxdt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwNbo7WT-ZIU6Jgp1t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwBTjFtcZQTfCTzzjl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw9h4FK8QTSZeLbbp14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyw6c7zOlG0J-1Ltj54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3Y5NyotgtOa88O9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx1jXCC5R_2B8psCrZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxKdxWg2d1fpQgchdF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzgvAqcj5V13QgKf054AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
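The raw response above is a JSON array, one object per coded comment, with four coding dimensions plus the comment ID. A minimal sketch of how such a response could be parsed and checked against the codebook follows; the allowed label sets here are inferred only from the values visible on this page (the real coding scheme may include other labels), and the two-row payload is an abbreviated stand-in for the full response.

```python
import json

# Abbreviated stand-in for the raw LLM response shown above (two rows only).
raw = '''[
 {"id":"ytc_Ugzw--WFQw5FAbd2S-h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxKdxWg2d1fpQgchdF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]'''

# Assumed codebook, inferred from labels visible on this page; not the
# authoritative coding scheme.
CODEBOOK = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "fear", "approval", "mixed"},
}

def validate_codes(payload: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels are all known."""
    rows = json.loads(payload)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items())
    ]

print(len(validate_codes(raw)))  # prints 2: both sample rows use known labels
```

Validating every row before storing it is one way to catch the occasional malformed or off-codebook label in model output rather than silently writing it to the results table.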