Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Preview | Comment ID |
|---|---|
| Big tech is always pretending that it's plan is the way of the future and inevit… | ytc_UgyBJvaTW… |
| Motorist getting hit as a cash cow under the guise of “safety” and saving the pl… | ytc_Ugx1RG2bK… |
| "It's the systemic processes that are protecting business interests over human c… | ytc_UgyzidVKK… |
| The education industrial complex will one day, probably not too long from now, b… | ytc_UgyJE7IdK… |
| What's critically missing is Issac Asimov's Three Laws for Robots: "The first… | ytc_UgzB9ylG0… |
| We understand your concern. On our AITube channel for subscribers, we conduct li… | ytr_Ugz1RMe3Y… |
| When AI takes billions of Jobs it won't be funny you cannot unplug technology 😔… | ytc_Ugw7qjTXY… |
| I am a chief lock operator my machines are 200 years old this is one of those jo… | ytc_UgymetJJK… |
Comment

> I believe if the powers that be will not regulate AI slop then we real human commentors, real human creators and storytellers and script writers need to designate a watermark or logo like "H.I." that we real humans can see and know any media with it was made and and created by a real Human and nothing is AI generated and can lead to legal actions against any AI generated material using the designation falsely. If they aren't going to stop it because they plan to use it then we need a way to not only protect ourselves against potentially harmful and or dangerous generated slop but a way to combat it also. I might be wrong but it's just my opinion

| Platform | Video | Posted |
|---|---|---|
| youtube | Viral AI Reaction | 2026-02-25T04:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
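Before coded records like the one above are aggregated, each dimension value can be checked against the codebook's category sets. A minimal validation sketch — the allowed sets below are assumptions inferred from the values visible on this page, not the authoritative codebook:

```python
# Category sets per coding dimension. These are ASSUMPTIONS inferred
# from the sample responses on this page; the real codebook may
# define additional categories.
ALLOWED = {
    "responsibility": {"government", "company", "elite", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of validation errors for one coded record."""
    errors = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The record shown in the Coding Result table above passes cleanly:
record = {
    "responsibility": "government",
    "reasoning": "deontological",
    "policy": "regulate",
    "emotion": "outrage",
}
print(validate(record))  # → []
```

A record with an out-of-set value (or a missing dimension) comes back with one error line per offending dimension, which makes malformed model output easy to flag before it reaches analysis.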
Raw LLM Response
```json
[
  {"id":"ytc_UgwldVHBmtnHDG8yRwx4AaABAg","responsibility":"elite","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgydsEUa8M8Y7Gbnnh94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxXorIDOUi3IFMt28N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJ6MV4d3AT5eav1o54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwApkZHLjrG8dQA1vB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBMaHttZQLTEZ7QX94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwt51f7wdrhm2JR5254AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwwdU9aslhEUfPp6u54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwdyr5TnBsgkBzG5yV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzxmr4zyQ5TjwI3n854AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
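Because the raw response is a JSON array in which every record carries its comment ID, the "Look up by comment ID" feature can be backed by a simple index built straight from the parsed response. A minimal sketch of one plausible implementation:

```python
import json

def index_by_id(raw_response: str) -> dict[str, dict]:
    """Parse a raw LLM batch response (a JSON array of coded records)
    and index the records by comment ID for O(1) lookup."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Usage, with a two-record excerpt of the response above:
raw = '''[
  {"id":"ytc_UgwwdU9aslhEUfPp6u54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzxmr4zyQ5TjwI3n854AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}
]'''
coded = index_by_id(raw)
print(coded["ytc_UgwwdU9aslhEUfPp6u54AaABAg"]["policy"])  # → regulate
```

In a real pipeline the parse step would also want a guard for truncated or non-JSON model output (a `json.JSONDecodeError` handler), since LLM batch responses occasionally arrive malformed.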