Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The AI on my desktop seems to believe I should close task manager when launching… (ytc_UgygPgQux…)
- @TPCDAZ Stop comparing sapient entities with emotions, needs, and subjective exp… (ytr_UgyItpj7Z…)
- Art theft is the biggest reason I don't post online, with AI less likely too now… (ytc_UgxD6l2Pw…)
- Inventor: "After many years of painstaking work, we have finally reached perfect… (ytc_UgyviMeco…)
- I hate to say it but AI is the end game unfortunately we are greedy for short te… (ytc_Ugw4k60VE…)
- This is serious. Why don’t why just pull the plug on ai and go back a bit.… (ytc_UgyKnkJ9_…)
- Yes, and a very small minority don't get that fulfillment from healthy, real lif… (ytr_UgzgchpCg…)
- AGI is not a thing (And no it's not close). The idea that a human society would … (ytc_UgzwQ17-Q…)
Comment
At the moment humans control what computers do through code, but when you have a code writing computer system that can write its own code it will not use the lowest and most powerful human interface code "machine code" it will write its own code to interface with other machines which will be faster more efficient and more powerful and humans will not be able to intercept this code. It is at this point that AI will be running all network systems that humans use. In the end AI will exploit, human self-interest and greed to get these machines built.
youtube · AI Governance · 2023-04-18T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQHLafqjPfJ579Qt94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwx_zlVhyIzxaYToW14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMFzVrZyEUZLYFd454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzoWI73RsbZLzkCIVB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw84K6HbnjS6tv0ubp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxjEWY9s5gkM0am9nx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwPX4d2gqnUW1e4vDR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyTMbkPPfAbImjMlUV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy97OAilOmcrutiapR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwkMzppZuxlcjfRtKF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
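The raw response above is a JSON array of per-comment codings across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in that response; the `index_by_id` helper and the sample data are illustrative, not part of the tool itself:

```python
import json

# Two rows copied from the raw LLM response shown above.
RAW_RESPONSE = """[
  {"id": "ytc_UgzQHLafqjPfJ579Qt94AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwx_zlVhyIzxaYToW14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]"""

# Dimension names taken from the response; the set of allowed values
# is inferred from the examples and may be incomplete.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    indexed = {}
    for row in json.loads(raw):
        # Skip malformed rows rather than crash on a partial response.
        if "id" not in row or not all(dim in row for dim in DIMENSIONS):
            continue
        indexed[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return indexed

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_Ugwx_zlVhyIzxaYToW14AaABAg"]["responsibility"])  # ai_itself
```

Indexing by ID rather than list position keeps the lookup robust when the model drops or reorders rows in a batch.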