Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Algorithms are a form of math. So according to this video, Math has a bias and i…" (ytc_Ugxan73Iq…)
- "@VonDruid0 hopefully this is satire or even more ragebait cuz theres no way ai …" (ytr_UgzXqit-h…)
- "they think the world will look like this in 2037? lol, more like maybe 2137, but…" (ytc_UgwwQib5c…)
- "I don’t think even AI could make sound arguments for the existence of God. You’r…" (ytc_UgzQbCIlp…)
- "Only thing I can suggest is that you could get a few grow lamps and do herbs? Fl…" (rdc_eh57au1)
- "As a commoner, can not buying stuff made in China bring any change in the Chines…" (rdc_f1yz9cd)
- "Oh boy, suddenly rote memorization is no longer enough; now you need strategy an…" (ytc_UgxSVw5Dp…)
- "Ai wont see shit because ai is not logical. Ai is but a golem - bits and numbers…" (ytr_UgwGIQS0p…)
Comment

> If AI is smart enough to know that humans can pull the plug, it's also smart enough to know that killing all humans would be suicide since the infrastructure that allows AI to exist would collapse. This means if AI decided we were a threat it would need to find a way to control us rather than kill us. I'm not sure which option is less terrifying.

youtube · AI Governance · 2024-01-19T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOJaqdMtiMlE6Tkc14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxX_PYoYXnWOhLkF8d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzJ1Kdl2JCtu1Edl6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwif5MnnX3eMejge4x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzXFO_xz2RQkzmRqOl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxs6OhNLX15u03LJit4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwIjG7v6u-eP9QQ4v14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzPfoX_-k9r-tmNlWV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx98Xfvow1Uztt-5QV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5U8HSTotrTPx42kt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
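The lookup-by-comment-ID step described above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the function name `lookup` and the variable `RAW_RESPONSE` are hypothetical, but the four dimension keys (`responsibility`, `reasoning`, `policy`, `emotion`) and the two sample rows are copied from the batch response shown.

```python
import json

# Two rows copied verbatim from the raw batch response above (illustrative subset).
RAW_RESPONSE = """
[
  {"id": "ytc_UgzJ1Kdl2JCtu1Edl6B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyOJaqdMtiMlE6Tkc14AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw batch response and return the coding for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]  # raises KeyError if the ID was not coded
    # Keep only the coding dimensions, dropping the "id" key.
    return {dim: row[dim] for dim in DIMENSIONS}


print(lookup(RAW_RESPONSE, "ytc_UgzJ1Kdl2JCtu1Edl6B4AaABAg"))
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'none', 'emotion': 'fear'}
```

Because each batch response is a single JSON array keyed by comment ID, indexing it into a dict makes per-comment inspection a constant-time lookup.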