Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
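The lookup-by-ID behavior can be sketched as follows. This is a hypothetical helper, not the tool's actual code: it assumes coded records are kept in a dict keyed by full comment ID, and it tolerates the ellipsis-truncated IDs shown in the sample list by falling back to an unambiguous prefix match.

```python
# Hypothetical sketch of "look up by comment ID": resolve an exact ID or an
# unambiguous truncated one (the sample list shortens IDs with an ellipsis).
def lookup(index: dict, query: str) -> dict:
    """Return the coded record for an exact or unique-prefix comment ID."""
    q = query.rstrip("…")  # tolerate truncated display IDs
    if q in index:
        return index[q]
    matches = [cid for cid in index if cid.startswith(q)]
    if len(matches) != 1:
        raise KeyError(f"{query!r} matched {len(matches)} comment IDs")
    return index[matches[0]]

# Two records with IDs taken from the raw response shown on this page;
# the dict values are abbreviated for illustration.
index = {
    "ytc_UgyWmnX02I_kpBLFk0N4AaABAg": {"policy": "regulate", "emotion": "fear"},
    "ytc_UgwJTkYP_ZpT637843F4AaABAg": {"policy": "liability", "emotion": "outrage"},
}
print(lookup(index, "ytc_UgyWmn…")["policy"])  # regulate
```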
Random samples — click to inspect

- "Something that I've only noticed while on psychedelics, AI art has no emotion in…" (ytc_Ugx7tGqZd…)
- "I'm a unity developer and it always cracked me up how people believed that simpl…" (ytc_UgxhEWd65…)
- "The whole "accessibility" thing I side-eye on multiple levels, not the least of …" (ytc_UgwfAu4z6…)
- "I had my professor tell me to just generate music for my film. I declined becaus…" (ytc_Ugxb-NdPL…)
- "anyone else feel like he went from somewhat informative stuff to now shilling fo…" (ytc_UgwgYgsxC…)
- "No, llms do not. True AI may think however when it finally comes to fruition.…" (ytc_UgypHzNK5…)
- "Just like your Tesla will appreciate as a robotaxi, the cybertruck will have an …" (ytc_UgwByzw-L…)
- "Wahaha - Using an AI voice to slag AI. Little optimistic methinks. More likely g…" (ytc_UgyUyIyl8…)
Comment
1:16:28 see but I think it's already there, as far as the dangers. You don't need some super intelligent hyper mind scfi skynet thing to do this by being a singular controlling entiry. For instance the CCP may not have enough agents to read every single text and screen it for concealed anti-party rhetoric, they may not have enough agents to listen to every microphone in Beijing 24/7 and correlate it with who's in the area at that moment, but that is feasible with current ai technology. You could easily automate a dictatorship with current ai tech, no superintelligence needed.
Source: YouTube · "AI Moral Status" · 2025-10-31T13:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyWaZcwZKHKSnqZBRl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHHm1x9c4h0BFHDfl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwJTkYP_ZpT637843F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxc0ucUv_nx8gWFl0h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyWmnX02I_kpBLFk0N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnAidCb7yNIhXit3F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyW6pJd9Hs3u6CotBV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzcXCV_wS16j2Zaa8N4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyQBFzwFC9lH-2PNvl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwBt7O82f3b_hmFac14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]