Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_Ugy8e6xNC…`: @souplesssoup2903 it's obviously not about ai because then all the segregation/s…
- `ytr_UgwP3S1d1…`: “yah I'm an ai artist, i write a prompt and mark the image as my own”…
- `ytr_UgydtCIMH…`: Thanks for sharing your thoughts! It's fascinating how AI like Sophia sparks con…
- `ytc_UgxJiKbEC…`: An AI artist is effectively just someone who is really good at describing their …
- `ytc_UgwKAFZ4m…`: Another midwit video saying AI will take our jobs instead of offshoring and work…
- `ytr_UgzkJ9CLf…`: That's a very insightful distinction you've made there! Indeed, while programmed…
- `ytc_Ugyy0tMjW…`: How many much more bad take over we already have? Only think about that the whol…
- `ytc_UgxK4YVrl…`: There's terrible irony in Hatsune Miku fanart made with unethical AI... because …
Comment

> Amoral psychopaths.
> That is what all these AI companies are racing to bring......the most intelligent AI without morals......all for money.
> Is the AI the amoral psychopaths or the AI companies themselves?
> They are putting all their efforts into making AI more and more intelligent, but don't know how to implement morality into these systems they are building.
> All these AI tech billionaires will get rich in the short term, but extinct in the long term.
> The movie Terminator is not science fiction anymore, but will be prescient of human race in the future. Except there will be no John Connor that is intelligent enough to rescue the human race.

youtube · AI Harm Incident · 2025-07-27T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxK-Og8iqwEQtPgeV14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEaRSY94HaSkK6eK54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwu6FZy_O45jWfsaEJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6JpPgFPi-UHvXZ5R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwONVhRhsct5vYM9d94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgymCj6MYfjVS1aJzsR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxENcBYe7dDLB2Dyep4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx9TNZQmWrJhqYZujZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzUISC8cWQ65r5NALB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz70Z2DWK-Q0pvDWbx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
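Responses in this shape can be turned back into per-comment codings with a small parser. The sketch below is a minimal illustration, not the project's actual pipeline code: the field names come from the JSON above, and the value sets are only the values observed in this batch (the full codebook may allow more; `parse_codings` and `OBSERVED` are hypothetical names).

```python
import json

# Values observed in this batch for each coding dimension.
# Assumption: the real codebook may permit additional values.
OBSERVED = {
    "responsibility": {"none", "developer", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "resignation", "fear", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    mapping from comment ID to its coded dimensions, rejecting any value
    outside the observed vocabulary."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        dims = {k: v for k, v in row.items() if k != "id"}
        for dim, value in dims.items():
            if value not in OBSERVED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_codings(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against a fixed vocabulary catches the most common LLM coding failure, an invented label, before it silently enters the dataset.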