Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Can anyone tell me one benefit? I see none and these "geniuses" appear utterly c…" (ytc_Ugw-bRznb…)
- "@lukeursic I finally gave in and decided to respond to you incompetent fucks. C…" (ytr_Ugz5i2qRw…)
- "no one fantasizes movies, AI still has long way to go for the better of humanity…" (ytc_Ugy9II3t4…)
- "I as a game artist for years but pretty much stopped when AI started saturating …" (ytc_Ugy_ypS2V…)
- "Shut up. Not attacking anyone, just gotta get that delivery done to be a good r…" (ytc_UgxnD4Fvj…)
- "I never understood this. My best guess is that programmers are on the forefront…" (rdc_nmboeqw)
- "This is NB2. I don't use NB Pro. The reason being that NB2 runs on Gemini flash…" (rdc_oi29kdn)
- "Would be nice if AI could figure out how to get Elon to STFU and share 1% of his…" (ytc_Ugz26P0YN…)
Comment

> The fact that this guy from Google called Musk a "specie-ist" (if that's the spelling) for raising concerns about AI, is massively revealing. Evidently the people at the top of Google are relaxed - at best - about the notion of humans being manipulated and harmed by the monster they are creating.

youtube · AI Governance · 2023-04-18T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxd7W921BfAiqqn_X54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzTfFQZ5y42fCy5y8R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzk6oWxOoFX6nEHaHN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2R_WZqhidFaV8rS14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy52cI15FZ47jbqQNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzyUFh8ooKQT3mrTi14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxNAbd8K9PLBM9GKu14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy7tU1u8EOQ0ERt7iB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyFx6fMRiynIAwEXLF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxMIqpve1Y6NBpVT_B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
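A raw response like the one above can be parsed and sanity-checked before the per-comment coding results are stored. The sketch below is a minimal, hypothetical example: the `SEEN` sets contain only the dimension values that appear in this sample response, not necessarily the tool's full codebook, and the function name is illustrative.

```python
import json

# Dimension values observed in the sample response above; the real
# codebook may define additional categories.
SEEN = {
    "responsibility": {"company", "user", "none", "distributed", "government", "ai_itself"},
    "reasoning": {"consequentialist", "virtue", "mixed", "deontological"},
    "policy": {"regulate", "none", "liability"},
    "emotion": {"outrage", "resignation", "approval", "fear", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    reporting any dimension value not in the observed value sets."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SEEN.items():
            if row.get(dim) not in allowed:
                print(f"{row.get('id', '?')}: unexpected {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
rows = parse_raw_response(raw)
print(rows[0]["policy"])  # → regulate
```

Checking values on ingestion catches the occasional response where the model drifts outside the prompt's label set, which would otherwise surface later as an unmapped category in the dimension tables.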