Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxPtCTcg… — "people gotta understand llms are just predicting the next word and there's no in…"
- ytr_Ugx8Jk8lY… — "@AlexW1495AI prompters are not their peers. Their peers are other artist who ha…"
- ytc_UgyA98s2N… — "Meh, current AI is just the next conveyor-belt assembly system. Anything assumed…"
- ytc_UgyK9Xk4X… — "Hi Steven, I like your shows. I want to point out on one thing, you know how you…"
- ytr_UgzE4wHrd… — "Thing is those in Kitboga's videos were actual scammers. Difference is these guy…"
- ytc_Ugz2UrZcf… — "To any 1 who wants to rrplace creative people for ai go watch the last disney mo…"
- ytc_Ugw_wdUtR… — "Bro SAM IS AI THATS THE ULTIMATE TEST NOW I WILL BE ASCENDED AND getsomemoolapls…"
- ytc_Ugw4n16XU… — "You literaly instructed the prompt to remove morals, and it did. It responds you…"
Comment
This video starts out interesting enough, but then becomes a dystopian wet dream with HUGE leaps of flawed logic. AI exists right now to make our work with computers faster and easier. However, AI is not some sort of human replacement until robotics technology catches up to it. Sure, a century from now we'll probably see some sort of Star Wars/Star Trek future with smart robots handling all of the manual labor. But in the mean time we still need humans to interact with the physical world. Right now, only high-tech factories can eliminate most human labor. AI in the white-collar workplace is taking jobs, but those are jobs that require us to use a computer as our primary work interface.
youtube · Viral AI Reaction · 2026-01-09T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyTJEJ65W16zS0FAlx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwY47nhNsz9kB2HUHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwy955sTdjM6kfXlyh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxbFfBkE5seRtJONAd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgySS7Kvy4tafHr2hox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyEvj6Ltk-cgyR2oR54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkFoz60iPQ7YBhNlt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwj673iliKU-NrBy714AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw8TYDSvHW-pvDK9-14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwQYjKDwAyphPdPPfN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
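A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the per-dimension value sets are only those observed in this sample output (the actual codebook may define more values), and the function name `parse_response` is an assumption, not part of any pipeline shown here.

```python
import json

# Dimension values observed in the sample response above.
# Assumption: the real codebook may allow additional values.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "company", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"fear", "resignation", "outrage", "approval",
                "indifference", "mixed"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse one raw LLM coding response (a JSON array of row objects)
    and flag any dimension value outside the observed sets."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_X","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability",'
       '"emotion":"fear"}]')
rows = parse_response(raw)
print(rows[0]["policy"])  # liability
```

Validating against a closed value set catches the most common failure mode of LLM coders, which is inventing a label that is not in the codebook.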