Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Claude is shit compared to modded Opencode + gpt 5.4 xhigh + gemini with an orch…
ytr_UgwkJCY0x…
“I’m an artist myself” bro u literally just used ai to make photos and got a sei…
ytc_UgwgGZfxO…
You always have a real good cake though that’s why I watch your channel because …
ytc_UgxZFMo60…
I personally dont have an issue with AI art in of itself, my issue lies in the f…
ytc_UgxnHtDtR…
not sure why this entire channel isn't A.I. There's not that much original input…
ytc_Ugz2SvHKv…
Was all about tge copyright things if open ai did not do it others will , so wit…
ytc_Ugws3MXg0…
This is an easy question to answer. If the work that is being used to train th…
ytc_UgwG6yqXp…
@group555_ dude, people's art is being stolen. Feeding a robot a prompt after s…
ytr_UgzlMZwAg…
Comment
In our relentless pursuit of AGI, we’ve started overlooking the simpler, unresolved problems that still affect people every day. What's more concerning is how AI is beginning to erode uniquely human talents—like photography, design, and other creative expressions that make us who we are.
A decade ago, social media reshaped our lives, often not for the better. AI, if unchecked, could become an even more disruptive force—deeper, subtler, and harder to reverse. As a society, we need to pause, reflect, and ensure that in building machines that think, we don’t forget what it means to be human.
youtube
2025-06-15T15:2…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxLE_54VV2aW9aiMQB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzCz2aN3flY0qy74DF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyyvVQt2SInrDetBzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwSmxNAwIcz3nwBLM94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwGn6kIrO4zEiMk7ah4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuxJakIBYDIPnup6F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxql3FlrgOvovW82_Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgydHiBRYqsbbarVC514AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwsvmlDCBR8XM3hBnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxScnQJKDlrXTCR8Ql4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
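The raw response is a JSON array of per-comment codings, one object per comment with an `id` plus the four dimensions shown in the table above. A minimal sketch of how such a payload could be parsed and sanity-checked (Python; the `parse_codings` helper and the embedded two-record sample are illustrative, not part of the actual pipeline):

```python
import json
from collections import Counter

# Two records copied from the raw response above, as a stand-in payload.
raw = '''[
 {"id":"ytc_UgxuxJakIBYDIPnup6F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugxql3FlrgOvovW82_Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]'''

# The four coding dimensions seen in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(payload: str) -> list[dict]:
    """Parse the model output and verify every record carries an id
    and all four dimensions; raise on incomplete codings."""
    records = json.loads(payload)
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing or "id" not in rec:
            raise ValueError(f"incomplete coding: {rec}")
    return records

codings = parse_codings(raw)
# Tally one dimension across the batch, e.g. emotion.
emotion_counts = Counter(rec["emotion"] for rec in codings)
```

A check like this catches the common LLM failure modes for structured output (missing keys, non-JSON wrappers) before the codings reach the database.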