Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below.
Random samples — click to inspect
- We want the ai to make the dishes and clean the dirt so you can use your free ti… (ytc_Ugy5ubjni…)
- Meta dropped the ball hard on AI development. Going to be brutal with incomes st… (rdc_oadlkfm)
- AI can't take my job, because AI can't get on a fucking airplane to go replace t… (ytc_Ugxf327BS…)
- Did I hear correctly that US tech guy talking about *needing* to work 60 hours a… (ytc_Ugx1ilvvr…)
- In "Person Of Interest" 2011-2016, Harold Finch (the creator of the AI) had an a… (rdc_l5u2uxo)
- The AI turning off a second before impact to avoid law suits is the most evil th… (ytc_UgyaIkhtf…)
- I'm glad you were able to make a bunch of money off of this video, but it's a li… (ytc_UgxCVgeS0…)
- I agree that when it comes down to it, patients will always prefer a doctor look… (ytc_Ugwzb85yb…)
Comment
> the problem is when he says the smartest people are building it. that is not true. tech geeks are not smart. they are talented and there is a huge difference between that and generally smart. talent has no transferability meaning you can be amazing at one thing and utterly stupid at everything else. tech geeks have zero social intelligence or common sense. giving them the wheel to our future is suicide. ai cannot ever be regulated because human bad actors cannot be regulated. any safeguard will be overridden by a strongman somewhere in the world.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-11-30T03:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
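For reference, here is a minimal sketch of one coded record as a typed structure, mirroring the dimensions in the table above. The dataclass, its field comments, and the value sets are assumptions for illustration: the sets list only the values that appear in this section, not necessarily the tool's full codebook.

```python
from dataclasses import dataclass

# Dimension values observed in this section; the real codebook may be larger.
RESPONSIBILITY = {"developer", "company", "user", "ai_itself", "unclear"}
REASONING = {"virtue", "consequentialist", "deontological", "unclear"}
POLICY = {"ban", "none", "unclear"}
EMOTION = {"outrage", "fear", "approval", "indifference", "resignation", "mixed"}

@dataclass
class CodingResult:
    """One coded comment, matching the Coding Result table above."""
    id: str              # comment ID, e.g. a ytc_… or rdc_… identifier
    responsibility: str  # who the commenter holds responsible
    reasoning: str       # style of moral reasoning
    policy: str          # policy preference expressed
    emotion: str         # dominant emotion
    coded_at: str        # ISO-8601 timestamp of the coding run

    def validate(self) -> bool:
        """Check each dimension against the value sets observed here."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```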
Raw LLM Response
[
{"id":"ytc_UgwxrHz-h9yQ1MKYuah4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxS_ckLQfsN5n_fqsd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1FyNGZKNEikaplD14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx7J8Jcgfz3h9bTNUZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwaRMioHUytvYtYr0B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwuKk3tD3sVgVilYkR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzp-WddNkwJwVMcppN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwy5d7P0unCIzZW25h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz7GnhZpJmsWtuFT4Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgylnmuKo0RDH3hE1XF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
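The lookup-by-ID view above can be reproduced offline from a saved raw response. Below is a minimal sketch, assuming the response is stored exactly as the JSON array shown here; the file name `raw_responses.json` and the helper functions are illustrative, not part of the tool.

```python
import json

def load_raw_response(path: str) -> list[dict]:
    """Load a raw LLM response saved as a JSON array of coded records."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def lookup_by_comment_id(records: list[dict], comment_id: str) -> dict | None:
    """Return the coded record for one comment ID, or None if it is absent."""
    return next((r for r in records if r.get("id") == comment_id), None)

# Example: find the coding for the YouTube comment shown in the detail view above.
records = load_raw_response("raw_responses.json")  # hypothetical file name
coded = lookup_by_comment_id(records, "ytc_Ugw1FyNGZKNEikaplD14AaABAg")
if coded is not None:
    print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
```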