Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Better motor control, color distinction, a brain that can more vividly picture a…" (ytc_UgzGcgYnc…)
- "The build forward will happen - humans without God will get what they deserve - …" (ytc_UgyjEEAJi…)
- "@OhPetrichor I'm not saying that there will be no trucks. I'm saying that a pro…" (ytr_Ugzdv4vb5…)
- "*WARNING... The presence of DOUBLE LOGIC in the human GENOME has Crossed over in…" (ytc_UgxjBHVS2…)
- "In case any ai is watching this I always said please and thank you to AI chats…" (ytc_UgxKb6C9H…)
- ""I want everyday people to think like me, and I want to shove AI down their thro…" (ytc_Ugx0yD3Lq…)
- "I have a question. If it's an "automated" truck, why is there a seat? Obviously …" (ytc_UgwV-z6W0…)
- "As someone who goons a minimum of four times a day I can assure you AI hentai ma…" (ytc_UgwiUIyip…)
Comment
I have a stupid question for Professor Geoffrey Hinton that I think he'll never answer, or even know, so I want everyone who reads my comment to answer it...
What if we manage to create AI the right way, one that we can control, one that will never kill us, and that is even more efficient in some way than the AI we're trying to make now? Or what if the AI we're making now consumes so much energy that it becomes unfeasible, and the right AI manages to be cheaper somehow?
Man, it's so crazy that I don't think even fiction could have imagined the impacts of this.
youtube · AI Governance · 2025-12-20T17:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyOjYVkVnzsXN7CSIV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"mixed"},
{"id":"ytc_UgykVs8jmBMjY1GF0I54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyZaw-FeRFKrZAEuMN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwdSIISDGodFVlrDwF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwJTqY6bk3_J2xjx1N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx-B69dGULEN11fKCZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyFeDAQdJmrp-xHTZB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzSgnebTMty601z-sx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxn0BixPAEJdtASNt14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFLXhx2HgkNiILgWB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
```
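A raw response like the one above is a JSON array of per-comment code records. Below is a minimal sketch of how such a response could be parsed and validated before the codes are stored. The allowed-value sets and the `validate_codes` helper are assumptions inferred from the sample output shown here, not a confirmed codebook or part of this tool.

```python
import json

# ASSUMPTION: allowed values per dimension are inferred from the sample
# response above; a real pipeline would load them from its codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with a missing
    or out-of-codebook value, so bad codes never reach storage."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the samples start with ytc_ (top-level) or ytr_ (reply).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwJTqY6bk3_J2xjx1N4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"approval"}]')
codes = validate_codes(raw)
print(codes[0]["emotion"])  # approval
```

A record that fails validation raises immediately, which makes a bad LLM batch easy to spot and re-run rather than silently coding it as `unclear`.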