Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Comment

> I think the only ting AI should be used for is asking it things, like:
>
> - Can I put a Lexus LFA engine into a second generation MX-5?
> - What amplifier would you recommand for *insert speaker maker and modell*?
>
> and simular things.
> Generating art from stolen metarial is not right, most AI models should be illegal and their makers should be sued, because they use stolen data.
Source: youtube · Video: Viral AI Reaction · 2025-02-25T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
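The coded dimensions above follow a fixed schema. As a minimal sketch, here is one way to validate a coded record against the value sets that appear on this page; the allowed values are inferred from the displayed responses, not from the project's actual codebook, so treat them as assumptions.

```python
# Allowed values per dimension, inferred from the records shown on this page.
# The real codebook may define more (or different) categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "resignation"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, record.get(dim)) for dim in ALLOWED
            if record.get(dim) not in ALLOWED[dim]]

# The record from the Coding Result table above.
coded = {"responsibility": "developer", "reasoning": "deontological",
         "policy": "liability", "emotion": "outrage"}
print(validate(coded))  # -> []
```

An empty list means every dimension holds a recognized value; anything else pinpoints which dimension drifted outside the schema.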
Raw LLM Response
```json
[
  {"id":"ytc_UgxnzG6lDp3BUirXzcx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxmONulY1RuoBIJAOx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugy-Wlu1SZzVFAHNwl94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzHpJ7CbgHws8teQYB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwyFx5EKGSO7PwwbAF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzQTqGle69v_Yrm8Bx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxWSMvChOIgDMSArlt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxpCZiF7J9775bJWuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxuVQ8o2YVTxyQDoHJ4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwRV05xaB1MGbvuehV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
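The "look up by comment ID" step above can be sketched as parsing a batch response like this one and indexing each coded record by its `id`. The IDs in this sketch are hypothetical stand-ins, not the real identifiers from the page.

```python
import json

# A batch LLM response in the same shape as the raw response above,
# with made-up comment IDs for illustration.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch response and index each coded record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_example1"]["emotion"])  # -> outrage
```

Indexing once turns every subsequent lookup into a constant-time dictionary access, which is what makes inspecting an arbitrary coded comment from a large batch cheap.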