Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This is pure hype. The real threat is our decision makers, both in politics and …" (ytc_Ugz8ublXh…)
- "So they trained it for 10 years on a data source that was already biased... &am…" (rdc_e7koo9j)
- "The kid was depressed WAY BEFORE OpenAI was invented and the parents didn't noti…" (ytc_Ugxe9waod…)
- "I'm trying to learn art although I may not be a 16 yol baby but 21 even then it …" (ytc_Ugy4KhzkH…)
- "@MPRF12345 Their accident rate is only lower cuz there are less of them on the r…" (ytr_UgxM0Emi5…)
- "To all artist out there spam "ai is not art" when you see ai art edit: when you …" (ytc_UgzJsUtvU…)
- "So if they started destroying us for being careless and destructive won't it rea…" (ytc_Ugxuch2UW…)
- "the fact that sam altman himself said that people will lose their jobs because o…" (ytc_Ugx3XKs2K…)
Comment
0:22 You're getting the timeframe wrong: The first AI was developed by Alan Turing at about 1948. So it's not about three years, but rather about 77 years.
And lets not forget: None of the companies that develop this kind of software is even close to earning money, they are burning money as fast as the Lehman Brothers back in 2008. And there is a reason why you can't buy 24/7 AI support but just some tokens every six or eight hours: The data centers need so much energy that the companies need to build power plants that produce as much energy as the whole planet is consuming right now. And in less than half a year the US economy will crash even worse than it did on black thursday october 24th, 1929.
Source: youtube · 2025-11-10T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz7RcyA-idERFua6Bt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxojvRJnhSbrgVYFWN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLy3F_sR0Nj0fJXXF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyermiJrD2oXy3mhJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyF1vG96vGFPmYUS-J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxiIW7HO64Dn2_ys3p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxk_qGc8ONk9KslBbh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSWF0-3WXQbQO5rt54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvvlaPF2tl7RlVYoB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_K9I7nFpadweYV4t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
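The raw response above is a plain JSON array, one object per coded comment, so a lookup by comment ID reduces to parsing the array and indexing it by the `id` field. A minimal sketch follows; the inline `raw_response` string and the variable names are illustrative assumptions, not the tool's actual storage or API:

```python
import json

# Hypothetical: the raw LLM response embedded as a string. The real tool
# presumably loads this from its database or a log file.
raw_response = """
[
  {"id": "ytc_UgzLy3F_sR0Nj0fJXXF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxiIW7HO64Dn2_ys3p4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Parse the array, then build a dict keyed by comment ID for O(1) lookup.
rows = json.loads(raw_response)
by_id = {row["id"]: row for row in rows}

record = by_id["ytc_UgzLy3F_sR0Nj0fJXXF4AaABAg"]
print(record["responsibility"], record["emotion"])  # company indifference
```

If a batch response ever omits an ID that was requested, `by_id.get(comment_id)` returning `None` makes the gap explicit rather than raising mid-lookup.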