Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Corporations will do everything it takes to promote AI for the sake of adding mo…" (ytc_Ugw0ddltO…)
- "Love the guys doing a full-on mindbreak on Squidward and causing him to fuck Spo…" (ytc_UgwkUBa4B…)
- "20% unemployment - I wonder who will have money to buy all the AI-generated crap…" (ytc_Ugx8dq8Xb…)
- "Altman wants regulation now, because it will stifle his competition. In order fo…" (rdc_jkhi8h1)
- "Lets assumed ai chip is not embedded in the human brain in 100 years from now. A…" (ytc_UgyYzYiRH…)
- "Exactly, when you put it on the internet, people can share it, and so you defini…" (ytc_UgzKIe4W0…)
- "Agreed, most of the expression in this clip comes from the eyes, basically no mo…" (rdc_nepfcnl)
- "I feel a glimmer of hope that, in the ai vs humanity 'war' (if one could call it…" (ytc_UgxyYPg9G…)
Comment
AI expert Professor Stuart Russell warns that AGI could arrive by 2030, posing extinction-level risks. Despite knowing the dangers, tech CEOs continue the AI race driven by economic incentives. Russell argues for strict regulation and safety measures before developing superintelligent systems that could replace humanity.
youtube · AI Governance · 2025-12-04T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwREN4FyXG7tN2FwFd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1KSkc_CBCF8vkm_R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBm_4sMJemVzr7VF14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwvZ7_pmmm43evBynF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWAKDdP4sAqpkSl-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwD54SpD1NMdfuSlEt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyUKU3Q7ZZTTm8jF454AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxprB8UiCWnAo8ULDt4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwNqjG86yxxs0pXUId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz098x72_JnwYp9xex4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
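A raw batch response like the one above can be parsed and sanity-checked before it is written back to the coding table. The sketch below is a minimal, hypothetical example: the allowed value sets are assumptions inferred only from the values visible in this sample, not the pipeline's actual codebook, and `parse_batch` is an illustrative helper name.

```python
import json

# Assumed value sets, inferred from the values seen in the sample response
# above; the real codebook may allow more values per dimension.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only records whose
    dimension values all fall in the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: one well-formed record passes validation.
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_batch(raw))
```

Dropping invalid records (rather than raising) keeps one malformed line from failing the whole batch; a stricter pipeline might instead log the offending `id` for re-coding.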