# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples
- “We the people don’t buy or use AI based Companies how do the Companies survive?…” (ytc_UgyaoeJuH…)
- “A nuclear war, triggered by AI would be a suicide move; the EMPs would shut it d…” (ytc_UgxZ9ngCu…)
- “AI is crap. it lies and doesnot do anything you ask of it. humans do the job bet…” (ytc_UgwZ2u5-_…)
- “AI will never have empathy, consciousness of being, belief.. wisdom is not the p…” (ytc_UgwLg7CDB…)
- “Gigo will save you. As AI learning is fed with AI slop, the garbage out will be…” (ytc_UgzO_uyry…)
- “I think his perspective is wrong. The economics isn't easy, and what to do with …” (ytc_UgyG7O-Z3…)
- “Lol it's backwards. Billionaires want people to think AI is bad, because it take…” (ytc_UgzLxkb7p…)
- “Yea dude i see like a lot of artist and stuff being scared of ai art and all, bu…” (ytc_UgwxjuSNN…)
## Comment

> What he's saying reminds me of the 1990s hype about how the Internet was going to change the world by brining people together. I bought into that optimism about the Internet when I was in college. While it's done _some_ of that good stuff it's also isolated people and and divided us into algorithmic echo chambers. Technological revolutions, like political ones, may improve things in the long run but are rarely bloodless. These days, I feel like I'm bracing myself for the aftershocks of AI with my fingers crossed for more good than bad. 🤞

Source: youtube · 2025-06-10T20:1… · ♥ 74
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[
{"id":"ytc_Ugzx34qIsx_O-30oqWJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgziKQKelepEkTipxIx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxNK6SbiEg-2N79dI54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwQRkwhsCMCsAhe4bF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRuJJYTzMfD8qNPU14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVFdd1pbrGY2Ep6AZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyszb3YBUTp7PEE5Xt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwl5kCKCL8IzSiFtqV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjKel2owfrAPHLKxx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw4kPAFt4gZ8OtgnAF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```
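A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below validates each record against the four coding dimensions; the allowed category values are only those observed in responses like the one above, so the real codebook may permit more (an assumption).

```python
import json

# Category values observed in responses like the one above; the actual
# codebook may allow additional values (assumption).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate(records):
    """Return a list of (comment id, problem) pairs; empty means clean."""
    problems = []
    for rec in records:
        if "id" not in rec:
            problems.append((None, "missing id"))
            continue
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append((rec["id"], f"{dim}={value!r}"))
    return problems

# First record from the response above; a valid record yields no problems.
raw = ('[{"id":"ytc_Ugzx34qIsx_O-30oqWJ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(validate(json.loads(raw)))  # → []
```

Failing the whole batch on an unknown value (rather than silently coercing it to "unclear") keeps coding errors visible during inspection.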