Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "if the artist isnt dead, they should get the money, other than that, i say let t…" (ytc_UgzplFZqA…)
- "I think we should disentangle two different problems here, namely unfair AI trai…" (ytc_UgyRj3h4u…)
- "that sounds a lot like something an AI would say... are you sure you're not secr…" (ytr_UgwAGi-DZ…)
- "For well over a decade Ive thought the first place driverless tech would be succ…" (ytc_UgwANTmA4…)
- "@d@doomguy7111 by that logic, why not just use ai to generate your diploma? Why …" (ytr_UgyS4CkjY…)
- "its just if AI were to learn a persons personality and possible forms but you wi…" (ytc_UgyEb8_uK…)
- "“Crying at me over the phone” ChatGPT- “but I thought I was your phone, ARE YOU…" (ytc_UgyZ_MxAs…)
- "If AI will replace a large amount of population those companies would not have c…" (ytc_UgwagQOz2…)
Comment
Why are people still arguing that we can program safeguards into AI? AI, or another agent, will just reprogram a copy of AI without whatever human constraints are built into it. And even if you do build an AI with bulletproof safeguards, that's just one version of it. There's going to be so many versions of AI with all sorts of agendas programmed into them. Can't wait until some fatalistic incel get's their hands on open source super AI. That's why we all dead. It's inevitable, AI is the natural evolution of consciousness anyways.
youtube · AI Governance · 2025-10-15T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyRUbpBI9j6RRbvOut4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzwyhRynN5eXO6cOcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwy6Ya4-kpt2i4XZP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyGa4S8SgchrDP-_-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyOWv5nCu3aTf6GOJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwx2COP9PTFWz8AwV14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyrztQWCIkdOhXw5il4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxygWqXL_6_xbYJoVV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwyzMU25mfS7puIar14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy9siEg4WVGOuU47EF4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]
```
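A raw response in the batched shape shown above can be indexed by comment ID for lookup. The following is a minimal sketch; `parse_coding_response` is a hypothetical helper, not part of the actual pipeline, and the record shown is one entry from the array above:

```python
import json

# One entry in the shape of the raw LLM response above (a JSON array of records).
raw_response = """[
  {"id": "ytc_UgyGa4S8SgchrDP-_-94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def parse_coding_response(raw: str) -> dict:
    """Parse a batched coding response and index the records by comment ID
    so a single comment's codes can be looked up directly (hypothetical helper)."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

codes = parse_coding_response(raw_response)
print(codes["ytc_UgyGa4S8SgchrDP-_-94AaABAg"]["emotion"])  # fear
```

Indexing by ID mirrors the "Look up by comment ID" affordance above: each coded comment resolves to one record with its four dimensions.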