Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
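The same lookup is easy to reproduce outside this page if the codings are exported as a JSON array like the raw response at the bottom. A minimal sketch, assuming a hypothetical export file `raw_llm_responses.json` in that format:

```python
import json

def lookup_coding(path: str, comment_id: str) -> dict | None:
    """Return the coded dimensions for one comment ID, or None if it is absent."""
    with open(path, encoding="utf-8") as f:
        rows = json.load(f)  # a JSON array of objects, as in the raw response below
    return next((row for row in rows if row["id"] == comment_id), None)

# Usage with a full ID taken from the raw response below; the file name is hypothetical.
print(lookup_coding("raw_llm_responses.json", "ytc_UgyYTvhIQwTmDWsni594AaABAg"))
```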
Random samples — click to inspect
Lol, if a robot demands rights we can just shut it down. Sadly we can't do that…
ytc_Ugi7-SOHE…
Ilya Sutskever didn't just leave because he knew AI was dangerous. He left to st…
ytc_UgyIn5uBo…
The only thing AI is good for is brain-dead shitposts and you can’t change my mi…
ytc_Ugy-W1hAJ…
it was, now they've been having great earnings reports so Wall St loves them aga…
rdc_o5q9x0a
People on here thinking that this is real are stupid. AI manipulated. People …
ytc_Ugwc49nb-…
AI is a joke, you're going to be sorry for trying to change the world. No one ha…
ytc_Ugyxmvlm3…
'I have a vest on. If I had no arms, it would be a jacket.'…
ytc_Ugwqzk6HX…
It's called war... population reduction, no need for the slaves once automated…
ytc_UgxAzpEKJ…
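Two ID prefixes appear in the samples above: `ytc_` on the YouTube comments and `rdc_` on what looks like a Reddit comment. That mapping is inferred from the samples, not documented here; under that assumption, splitting an ID into platform and native key is a one-liner:

```python
# Prefix-to-platform mapping inferred from the sample IDs above; treat it as an assumption.
PLATFORM_PREFIXES = {"ytc": "youtube", "rdc": "reddit"}

def split_comment_id(comment_id: str) -> tuple[str, str]:
    """Split e.g. 'rdc_o5q9x0a' into ('reddit', 'o5q9x0a')."""
    prefix, _, native_id = comment_id.partition("_")
    return PLATFORM_PREFIXES.get(prefix, "unknown"), native_id
```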
Comment
Agree with a lot of what Geoff says but disagree re his emotion analogy: emotion is more than just reasoning and the consequential physiological response. So for example, you might want to buy something you cannot really afford. Your reasoning system says you can't afford it, that buying it is a bad idea. But if you really want it, you might buy it anyway even though by reasoning you shouldn't, and to hell with the consequences. AI does not work like this; in Geoff's analogy of a small robot "getting scared and running away"... I do not believe an AI would ever think "I should definitely run away, but screw it, this time I'm just gonna go for it because I might get lucky and win!". Which is the essence of being human: sometimes our emotions override our reasoning centre, for better or for worse.
youtube · AI Governance · 2025-06-17T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
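The four dimensions form a small categorical codebook. A sketch of validating one coded row, with allowed value sets inferred only from the codings visible on this page (the real codebook may define more categories):

```python
# Allowed values per dimension, inferred from the codings visible on this page;
# the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "government", "company", "distributed"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row; an empty list means valid."""
    problems = []
    for dim, allowed in SCHEMA.items():
        if row.get(dim) not in allowed:
            problems.append(f"{dim}={row.get(dim)!r} not in {sorted(allowed)}")
    return problems
```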
Raw LLM Response
[
{"id":"ytc_Ugz3d-s7IfN3vK_KzFh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzpMIiqpP3Cmc1LTXF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7JuRaydqYjf1cQ4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx1Xaz-M0RjnGjaGeJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwqpI7ejCBfr2TTfCx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYTvhIQwTmDWsni594AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxUweBofnfNTUpMHLV4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugznhjc2XtWrEvKOE-94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz1e29XYSdLhi5zUz94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7xK-mVXIHCj-cokx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
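The model returns one JSON object per comment in the batch, each carrying an "id" plus the four coding dimensions. A defensive parser for that format might look like the sketch below; the fence-stripping guards against a model wrapping its output in a markdown code block, a common failure mode that does not appear in the response above:

```python
import json
import re

def parse_batch_response(raw: str) -> dict[str, dict]:
    """Parse a raw batch response into {comment_id: coding dict}.

    Assumes the format shown above: a JSON array of objects, each with an
    "id" field plus the coding dimensions.
    """
    text = raw.strip()
    # Strip a ``` or ```json fence if the model added one (assumed failure mode).
    fenced = re.match(r"^```(?:json)?\s*(.*?)\s*```$", text, re.DOTALL)
    if fenced:
        text = fenced.group(1)
    rows = json.loads(text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}
```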