Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We should poison AI like it unintentionally happened with the studio Ghibli AI t…
ytc_Ugz9jhCs0…
Humble opinion: Idk where all this is going but i believe that AI art and Human …
ytc_UgwAe_Kkn…
We are going to need blockchain for news reports now! And we need to make the pe…
ytc_Ugx-WeynU…
I can't help but think even the reporter and speakers about it are A.I. too... l…
ytc_Ugyl5bmH7…
You’re an example of what AI is creating lol literally blind to what’s right in …
ytr_Ugx_zzS4G…
I BELIEVE THE DAY ANY AI COMPANY DEVELOPS A SELF-PRODUCING BOT PANDORA'S BOX WIL…
ytc_Ugze1cAL1…
Teachers are probably gonna be safe for a while though, ai is not great for teac…
ytc_UgzwOPXix…
The output of a chatbot depends on its training data, yes? So if it's been train…
rdc_kp05xbx
Comment
[AgeofUltron]
after 30-60years - beginning of the war between people who can control AI And normal peoples
[infinitywar]
after 60-100years - During of the war between AI with (Ro) And who can control AI (peoples)
[Endgame]
after 100-150years - ?
youtube
AI Jobs
2025-06-02T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxd80g821VtGk9rOiF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypxXTocoXWRy8RbA54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7JAKOUUhxWRG9Am54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwXS50Ybz9qNBfI0Yx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZ3dC8xVAWiyI8hLF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyQsZ8m3ubEScrrJ_p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxyc06jIwMNWT6ITVd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzc2RVO0f9dYtH8szB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8HCXU7Ucr36oHb_h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzA517xDVMGGiZv4uN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
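A raw response like the one above is a JSON array of per-comment codes, one object per comment ID, with one value for each coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked, assuming the allowed values are limited to those visible in the samples above (the full controlled vocabulary is an assumption, as is the `validate_batch` helper itself):

```python
import json

# Allowed values per dimension, inferred only from the sample output
# shown above; the coder's actual vocabulary may be larger.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject out-of-vocabulary codes."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}"
                )
    return rows

# Hypothetical one-row batch in the same shape as the response above.
raw = (
    '[{"id":"ytc_example","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
rows = validate_batch(raw)
print(len(rows))  # → 1
```

A check like this catches the common failure mode where the model emits a value outside the coding scheme, so bad rows can be re-queued instead of silently stored.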