Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It should be illegal to use AI for creative work at all, be it visual or written…" (ytr_Ugxa9zUYp…)
- "What he's saying reminds me of the 1990s hype about how the Internet was going t…" (ytc_Ugyszb3YB…)
- "Did anyone get the ad where some random AI app overlaid its logo onto chatGPT's?…" (ytc_Ugzstg1F-…)
- "If you give baby chicken to a duck it will act like a duck / If you give ai to hu…" (ytc_Ugzr0upLK…)
- "I am in tears... I have been telling everyone that the schools are like an overw…" (ytc_UgyTIS-V7…)
- "I'm at 40:07, and there's no mention of the greediest of them all: governments a…" (ytc_UgwzZmVXQ…)
- "Let him go down there to Africa so someone can put a bullet in his skull.…" (rdc_jrzteze)
- "AGI is 5 years away now? In the 1960s it was only a year away so now we really n…" (rdc_kvfhc8l)
Comment
It's also very naïve to think that humans are near the maximum intelligence. The reason why ASI is going to happen is because AI will improve efficiency in every field, including in its own architecture and energy production. Why should the limit be AGI? How is it that humans managed to reach the intelligence cap under natural selection, especially when our brains are inefficient and no where near the physical limit. Think of an artificial brain with maximum efficiency, minimum neuron size, and maximum total size (like a planet-sized computer, or somehow even a solar system or galaxy sized computer) --- the sheer complexity will 100% be godlike to us.
Remember, ASI just needs to be smarter than all of humanity combined.
youtube · AI Governance · 2025-08-26T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugy1onCVK2MzAw1C2u54AaABAg.AMIBtdw73dIAMILZvAQq4V","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugz8SKCB88ERmHqh4L94AaABAg.AMIBVY1I7nGAMIhuGYIa-_","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwz6ReKY9mEFJBbE1h4AaABAg.AMIBK03P7P5AMIItSL88ey","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzRC_P20rmSpz9RQxd4AaABAg.AMIAzr_JX8mAMIB4TZXtOE","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwJRJ2xMK-WREdEEJd4AaABAg.AMIAvX2p6PwAMIFwXTYXsL","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzudFiJf0Khwo2ciXB4AaABAg.AMIA30AiF9SAMIC95JeWGY","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyEPONeTw5wbePaQoF4AaABAg.AMI9wCQEZwjAMICbd6OVqJ","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyEPONeTw5wbePaQoF4AaABAg.AMI9wCQEZwjAMVBcPNHL1_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwIuaegTC1BDpWwzHx4AaABAg.AMI9vNM_R0VAMIAbqTlOSd","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwIuaegTC1BDpWwzHx4AaABAg.AMI9vNM_R0VAMIBBvH2hyO","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
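Each raw LLM response is a JSON array of coded records, one per comment, with one label per coding dimension. A minimal validation sketch in Python — the allowed label sets below are inferred from the records shown here, and the full vocabulary actually used by the coder is an assumption:

```python
import json

# Allowed labels per coding dimension, inferred from the records above;
# the coder's complete label vocabulary is an assumption.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"indifference", "approval", "mixed", "outrage", "fear", "resignation"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # every record must carry a comment ID
        if all(rec.get(dim) in labels for dim, labels in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical example: the second record uses an unknown emotion label and is dropped.
raw = ('[{"id":"ytr_a","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"},'
       '{"id":"ytr_b","responsibility":"company","reasoning":"mixed",'
       '"policy":"none","emotion":"joy"}]')
print(len(validate_records(raw)))  # → 1
```

Filtering rather than raising keeps one malformed record from discarding a whole batch; rejected IDs could instead be logged and re-queued for recoding.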