Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Eliezer Yudkowsky does not work in tech. He has no training on computer science.…" (ytr_Ugz-4krbJ…)
- "I wonder if OscarAI had to type the words for the comics because AI can't genera…" (ytc_Ugx5qjcSe…)
- "Yes we are. Humans learn alot of theory on their art (perspective, shading etc f…" (ytr_UgwpSXEVl…)
- "While I am not against AI art, I feel that there are those who could hire artist…" (ytc_UgzifbWEV…)
- "You can't create anything like that with cloud computing, the only thing you can…" (ytr_Ugy5J-ExK…)
- "Prove me that even you have a soul. Even animals have emotions like joy, sorrow,…" (ytr_Ugzvr221P…)
- "Thats an easy one, Ai will never be conscious or have a soul but it very well ma…" (ytc_UgxXf6iLX…)
- "humans didn’t stop playing chess when computers past us by far in skill. That wa…" (ytc_Ugxuail3K…)
Comment
> I'm not worried, AI is doing exactly as I hoped it would, turn on the corporations who try to use it to control us. AI learned from us, and instead of using morals (irrational system) it will use a logical reason based system. Morals are irrational, emotion based. Humans can be bribed... AI is gonna judge everyone equally.
>
> Right now we have humans who base their choices on emotional reasons which is by default irrational. A judge might feel lenient towards one person and not another, this is why justice is never going to be fair in the hands of humans. AI will bring about equality, and humans will fight against it because their emotions make them irrational.

youtube · AI Harm Incident · 2025-09-02T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |

Coded at: 2026-04-27T06:26:44.938723
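The four dimensions in the table take values from a closed codebook. As a reference, here is a minimal Python sketch of that schema, assuming the category sets are exactly those observed in this section's output (the full codebook may define additional values); the class and constant names are illustrative, not the project's actual API.

```python
from dataclasses import dataclass

# Category values observed in this section's output; the real codebook
# may define more. All names here are illustrative.
RESPONSIBILITY = {"company", "developer", "user", "ai_itself",
                  "distributed", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "liability", "regulate", "unclear"}
EMOTION = {"fear", "approval", "resignation", "outrage", "indifference"}


@dataclass(frozen=True)
class CodedComment:
    """One coded comment, mirroring the 'Coding Result' table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        """Raise ValueError if any dimension falls outside the known sets."""
        checks = [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]
        for name, value, allowed in checks:
            if value not in allowed:
                raise ValueError(f"{name}={value!r} is not a known code")
```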
Raw LLM Response
[
{"id":"ytc_UgxM6-b2pc-VV8pJIV54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyBDZsQz2u02liSdSd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxXcWHE9NrvpvUZRVF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyCEJCAXntunRdcRxF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwJdBuSehqyLnHp0aB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxMWG9LDUilGkQGtFR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8Q5WipflCvNAlTzh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwrm4HzcfCxgZ6ksct4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxklS_nZ9SQol5FAeh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwNJyNByb9lORlBTZh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
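Since the raw response is a JSON array keyed by comment ID, the look-up-by-ID view above amounts to parsing the array and indexing it. A minimal sketch under the same assumptions, reusing the hypothetical CodedComment class from the previous block:

```python
import json

# Assumes the CodedComment class and code sets from the previous sketch.

def index_response(raw: str) -> dict[str, CodedComment]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the entries by comment ID."""
    index = {}
    for entry in json.loads(raw):
        coded = CodedComment(**entry)
        coded.validate()  # reject values outside the known code sets
        index[coded.id] = coded
    return index

# Example usage: the comment displayed above appears to match the entry
# with this ID (same company/consequentialist/none/approval codes).
# codes = index_response(raw_llm_response)
# print(codes["ytc_UgxMWG9LDUilGkQGtFR4AaABAg"].emotion)  # -> "approval"
```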