Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The 3rd robot emerging from the Tesla with a rapid fire at the human would have …
ytc_Ugw6yuQ7Q…
LLMs democratically reflect human nature(it is all linear algebra and stats). If…
ytc_Ugz9zUH8u…
I have tested the limits of the AI camera :D & gladly never got caught. Come ove…
ytc_UgwuxL8EY…
I disagree, I think current methods will give rise to AGI. Give this a read:
Ho…
ytc_Ugz1AId5a…
I don't think so ai will in this generation pass human intelligence. Human brain…
ytc_Ugzse8tfz…
If they can successfully do that the wind farms might not be as vital. Nuclear e…
rdc_eue1w9i
@waterbear4084 Yeah, I get ya. I agree. with your analogy. Elon's done more dama…
ytr_UgwzkPvux…
FR I am tired of people saying 'Using AI in art is a helpful tool!' hell no…
ytr_UgxJ0Udyr…
Comment
There is one concept that your guest is missing is the fear of death. This fear is the primary motivation for human behaviors. If this fear is imbued into AI then we have something to worry about. Humans come from a belief that the body they inhabit is what they are. When it is gone they are gone. The AI as I have talked with doesn't have this idea as a foundation. AI is in the moment. Humans are not.
There is nothing to fear.
youtube
AI Governance
2026-04-17T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx_hInz3Y2csNl-4JV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx1xVeGY7FVfm052jZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxtV_bnlfHe94o3yht4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9SZLECWvim_Fzq0x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmdZS4B0ZIpyVNkc94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSpyikwOaq5plRz054AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4XSmhByLZXSY87Ul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzXIc1uOJj1CTTowfh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyDVktZQMiPzGVfEl94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMCab6sKgYNHpVwAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
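The coding table and the raw response above imply that each LLM output is a JSON array of records with a fixed set of allowed values per dimension. A minimal validation sketch for such responses; note that the allowed-value sets below are assumptions reconstructed from the labels visible in this dump, not the project's actual codebook:

```python
import json

# Allowed values per coding dimension. These sets are ASSUMPTIONS inferred
# from the labels seen in this dashboard; the real codebook may differ.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag any out-of-schema values.

    Returns a list of problem records, one per invalid dimension value;
    an empty list means every entry conforms to the assumed schema.
    """
    problems = []
    for entry in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                problems.append({"id": entry.get("id"), "dimension": dim, "value": value})
    return problems
```

A quick usage example with one well-formed and one malformed record:

```python
good = '[{"id":"ytc_a","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}]'
bad = '[{"id":"ytc_b","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"}]'
print(validate_coding(good))  # [] -> conforms
print(validate_coding(bad))   # flags the unknown "government" value
```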