Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
What would be the perceived BENEFIT to the machines to eliminate humans? How wou…
ytc_Ugym6w84g…
Take my money!!! Where do I sign up. Give me the first humanoid robot girlfriend…
ytc_Ugw26wEC9…
Those implications rather unnerve me being in an unfettered AI system with no bu…
ytr_Ugyd9XXEZ…
When you elect a lazy narcissist who is profoundly ignorant of history, you'll h…
rdc_e2weduh
I get the feeling they are making more of this than it is. Not the danger, it's …
ytc_UgwaTYfDt…
Stop uploading the same video every week. This channel is AI slop. It’s like yo…
ytc_Ugwycza_B…
The problem is that the ai is not applying any logic or would be smart, but just…
ytc_UgxoZXGDI…
Using AI to optimize your resume for each role is a smart move, especially in a …
rdc_odblssx
Comment
A new paper gives a formal proof that AGI is NOT computable! Meaning neither LLMs nor any other agloritm can achieve AGI. The paper called "On the computability of artificial general intelligence" is on arXiv
youtube
2025-12-11T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz_LJ03pNgjiU6vlRF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMzZsJvJtSMz7TH254AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQA1fP8Bd3l2BgDBR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxLFOMJLLgemi1WaXJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6W-p7Q4LjkJXD1EB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoXtUs7_GfHor2z054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUogT43Tyn59QHGWV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgypoiMGTAmzOL08ed54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyk-cyt0p5K_7xJ9El4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyBfw8NmWELtt1SSTp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
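A raw response like the one above can be parsed and sanity-checked before being stored as a coding result. The sketch below is a minimal example, not the tool's actual validation logic; the allowed value sets are inferred only from the records visible here, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# The actual codebook may include more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Example using the first record from the response above.
raw = (
    '[{"id":"ytc_Ugz_LJ03pNgjiU6vlRF4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
records = parse_raw_response(raw)
print(records[0]["emotion"])  # → indifference
```

A record with a value outside the inferred sets raises `ValueError`, which lets malformed model output be flagged before it reaches the coding table.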