Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugzk4Z27Q…`: "Ai will def think they are better. They dont have to die or shit. They will see …"
- `ytr_Ughhok6s5…`: "H. R. to end humanity apparently because this guy wants his robots to do everyth…"
- `ytr_UgxcpbzfO…`: "@AAAngellStudiosOfficialCHANNEL that's EXACTLY what I mean! AI art is brilliant…"
- `ytc_UgwwEORkA…`: "I mean it works. But also, there are people who are working to break nightshade…"
- `ytc_UgznVTn57…`: "It is only now that the consumer hardware is powerful enough to run and train mo…"
- `ytc_Ugx3VdZyz…`: "Ah yes… AI can barely hold a conversation but can diagnose what some of the “sma…"
- `ytc_UgxkAGFI1…`: "I want to note that there are home grown LLMs right now. Most are based on Faceb…"
- `ytc_UgxzRliab…`: "people die frequently on the roads. it is YOUR responsibility to drive the car u…"
Comment (at 1:16:50):

> yeah, YOU can't build AI that can escape blah blah.... bu WE can.
> c'mon man, they probably full throttle on the AGI thing.
> and the analogy to the "running to the cliff" to my opinion is not correct here.... cuz from one point of view it is a cliff, but from another this is the holy grail, and.... only one can hold the holly grail.
> AGI is not a tool and not a technology, it is a creature....
> yes you can stop the development but is seems that the cosmos (or the simulation) follows very strict rule: "survive/be" which is equivalent to "multiply" , and the by product of that is "need to know more".
> like the gorillas didn't have a choice for humans not to evolve, we too don't have a choice.... we can't stop it, even if you "press that button" it would be too local, higher intelligence is coming.

youtube · AI Governance · 2025-12-07T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwvTSRD5trZevfIYnB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzQutbG2tNNGIK3sHF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz59csBDz9waO9xi4l4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy1awct-k9UerWIvdd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzMGKwHmKTWLYY2cZt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxhEd0gncx35A3RDw54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwNpigjrWr-iFxl4wJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzIoICz2ds5OYh4MKp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxuRG2TM7bgEB2_wu14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwyTZZrvhLxnWRxq7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
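The raw response is a JSON array of per-comment codes, and the tool supports looking up a code by comment ID. A minimal sketch of that lookup, assuming only what the dump shows (fields `id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `index_codes` and the trimmed two-entry payload are illustrative, not the pipeline's actual code:

```python
import json

# Abbreviated model output in the same shape as the dump above;
# the real response contains ten entries per batch.
raw_response = """
[
  {"id": "ytc_UgwvTSRD5trZevfIYnB4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxuRG2TM7bgEB2_wu14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
"""

# Fields every entry is expected to carry, per the dump.
FIELDS = ("id", "responsibility", "reasoning", "policy", "emotion")

def index_codes(text):
    """Parse the model's JSON array and index the entries by comment ID,
    dropping any entry that is missing an expected field."""
    entries = json.loads(text)
    return {
        e["id"]: e
        for e in entries
        if all(field in e for field in FIELDS)
    }

codes = index_codes(raw_response)
print(codes["ytc_UgxuRG2TM7bgEB2_wu14AaABAg"]["responsibility"])  # developer
```

Indexing by ID makes the "Look up by comment ID" view a single dictionary access instead of a scan over every batch.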