Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Since the last update, for fun i tried the same questions and i guess the AI evo…" (ytc_Ugw81oE4G…)
- "This entire arguement is fundamnetally built on the same logic as a racist argue…" (ytr_UgwU-o-5m…)
- "you shouldn talk to it at all. we should stop using them. before we do somethi…" (ytc_UgxfEpqYL…)
- "Super-intelligent AI: I'm your new god! Humans: But do you know how painful it i…" (ytc_Ugxxs1AKE…)
- "Finally, I can love a person for his imperfections and evaluate a robot by the d…" (ytc_UgzH7e-SG…)
- "Don't be fooled by so-called experts, AI is for mass censorship, and of course…" (ytc_UgzxohCRS…)
- "I'm going to be honest; I think this guy and people like him are, ironically, so…" (ytc_Ugxye0D-7…)
- "This made me realize that the possibility of a robot uprising is possible in the…" (ytc_Ugzm0rSFB…)
Comment (youtube · AI Governance · 2025-07-04T11:4…)

> As with all risk (which is basically on a scale of probability) it has to be balanced with the severity of worst case scenario and the amount of pre-cautions. Sometimes wondering if we caught in a system that can create problems that can only be solved by the tools causing the problem. Like technology is causing the climate problem but may also solve it. Ai cause social turmoil but can also solve the same problem. And we cant get of the running train. None the less, I still think its mankinds use of what we develop that break or make the world.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_UgxNYJU6bInj7xj8YkR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzDeOkKlHnzUy3hkgt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzvZIoYfdySLb9X4Cd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz2l7QJt6qZvoz098d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyqMUgrunwyRuedvdd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwvmb-1hr6M9LndJOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyisA6yfjJkrM_HB454AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxdao0S6F5P47QVRo54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw_B4Z6Ycsi6IIoWnZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disappointment"},
{"id":"ytc_UgyVQyIIXBmVYZWch014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]
```
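The raw LLM response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of parsing and validating such a batch — the function name is hypothetical, and the allowed value sets are inferred only from the records visible here (the actual codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the visible records only;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"unclear", "virtue", "consequentialist"},
    "policy": {"unclear", "regulate", "liability"},
    "emotion": {"approval", "indifference", "fear", "resignation",
                "disappointment", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw coding response and check every record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with "ytc_" (or "ytr_" for replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgxNYJU6bInj7xj8YkR4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"approval"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # approval
```

Validating before storing catches malformed model output (a missing key or an out-of-vocabulary label) early, instead of letting it silently enter the coded dataset.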