Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
4:42 “Years go by and humanity is happy with their new AI leaders. There are cures for most diseases, an end to poverty, unprecedented global stability.”
Ugh, that and the preceding part about the “generous universal income payments” reveals the assumptions of the AI2027 creators. (The AI advocates like Sam Altman and Ilya Sutskever make the same assumptions.) We could have all that _without_ AGI—those are not “intelligence” issues—we _know_ how to do all that now and could (although, granted, conceivably, AGI might lead to more cures for more diseases). Those are _power_ issues. There’s no reason to think that, faced with policies that would lead to such outcomes, the top 1% would surrender their power to these new “AI leaders.”
Platform: youtube · Channel: AI Governance · Posted: 2025-08-03T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxANh6aOW9gbERKf794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_pTbj3_h4_Tu5TfB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy4AQHjF4xiGUVhTEx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxCM_SjgORa7-3U9HV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJOwoxS3XgIXWEzNZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzevtmad0y9yXOKSbt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzU7Mh029Z_6vwHFQp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwbkV3qwMVdRpq2JsF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy_5bgT6ehnGNO-bh54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6yh8L4n8gxZGJ4_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
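The raw response is a JSON array of per-comment coding objects, each carrying an `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a payload could be parsed and looked up by comment ID — the `index_codings` helper is illustrative, not part of this tool; only the array shape and the sample ID come from the response above:

```python
import json

# One entry from the raw LLM response shown above, used as sample input.
raw_response = """
[
  {"id": "ytc_UgzU7Mh029Z_6vwHFQp4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]
"""

# Every coding object is expected to carry these keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse the model output and index the coding objects by comment ID.

    Raises ValueError if any object is missing a required key, so a
    malformed response fails loudly instead of producing partial codings.
    """
    by_id = {}
    for row in json.loads(raw):
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"coding {row.get('id')!r} missing keys: {missing}")
        by_id[row["id"]] = row
    return by_id


codings = index_codings(raw_response)
print(codings["ytc_UgzU7Mh029Z_6vwHFQp4AaABAg"]["emotion"])  # outrage
```

For the sample entry, the lookup returns the same values shown in the Coding Result table (responsibility: developer, emotion: outrage).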