Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Wish you could be a real person, darling. Because real ones are far more dangero… (`ytc_UgwVDG2Y2…`)
- "AI art has a soul" the "man" said to the most soulless ai video ive ever seen… (`ytc_Ugz-85j4P…`)
- >If governments were willing to take a stand and say, no, no more lockdowns, … (`rdc_hm97dqo`)
- B.S.B.S.B.S.B.S.B.S.B.S.B.S.B.S.B.S...... AI is dangerous because it will expose… (`ytc_Ugy_gtulU…`)
- Until someone pours water over the computers and no one fixes ai when they break… (`ytc_UgxJWySX5…`)
- Well. I believe that at the current pace it will happen. If we don't get to see … (`ytc_UgxGXeqqW…`)
- They obviously aren't going to arrest him. Would be seen as an act of war. They … (`rdc_jrzgiw0`)
- I am surprising all the time how anthropocentric many people are, so he thought … (`ytc_UgyuXRJov…`)
Comment

> UBI is heavily promoted by those promoting AI. It is the easiest way to deal with the 'redundant human problem' by giving them enough money to stay at home playing video games, eating mush and accumulating social media likes. Allowing for AI phase one to take hold. However it won't take long before you have to say nice things online, and IRL, about Sam Altman if you want to keep those UBI benefits coming.

youtube · Viral AI Reaction · 2025-11-24T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwRPPT3Yt14H-9QN294AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8P2DdeWLFa-RnYft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyJNogUBXC79PHPQLJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpMmn5Q0nTVzEC-yR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwxDgWZuOrVeLtcmg94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwYU39ZLo_kYX63PwB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1I0fCiykJcHvYAiB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyyJUGjcjh9q1oexLt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOXqRpiQ1fcnqzk714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw5AVFpkZEpLfuMpjF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
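Before a raw response like the one above is accepted into the coded dataset, each row can be checked against the four dimensions shown in the Coding Result table. The sketch below is a minimal validator; the allowed value sets are inferred from the coded samples on this page and are an assumption, not a documented schema.

```python
import json

# Allowed values per dimension, inferred from the coded rows above.
# ASSUMPTION: the real coding scheme may include values not seen here.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate_batch(raw: str) -> list[str]:
    """Return a list of problems found in a raw LLM coding response.

    An empty list means every row parsed and every dimension value
    was in the expected set.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    for i, row in enumerate(rows):
        if "id" not in row:
            problems.append(f"row {i}: missing id")
            continue
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"{row['id']}: bad {dim}={value!r}")
    return problems
```

Rows that fail validation can then be looked up by their comment ID on this page to inspect the exact model output that produced them.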