Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I've been doomscrolling about this for a while. It's kinda refreshing to see som…
ytc_UgyzGarQt…
AI may not be conscious but it is learning how to mimic consciousness, AI will n…
ytc_UgwWa5wi4…
I honestly don't care. Therapy is expensive and ineffective and exploitative. I …
ytc_UgyqDKNNP…
saying real art is slop and ai "art" isn't is weird... Yet it has to be trained …
ytc_Ugz21-Cii…
A Tesla Cybercab can drive you from San Fran to Phoenix (although not permited f…
ytc_Ugw3kZF7X…
Automation in farming led to only 1% of the USA workforce working in agriculture…
ytc_UgzXjW34v…
Writing this comment @ 10.7 watchtime, another 12 minutes to go.
No, I'm not a …
ytc_UgzmqV1OH…
I'm totally with you on that topic and ai art should either stop or got reworked…
ytc_UgxMw3yvy…
Comment
There isnt much that can be done wrong with Autonomous Weapon Systems, aslong you keep and eye on it and have failsaves like any program requires incase something goes wrong and the ones keeping an eye on it are both competend and inventive nothing can realy go wrong. If in case you leave the Weapons themselfs to a AI to handle you should always have some Humans as Advisors, write it in the Programm or make the Programm itself Empathetic, alot of unessicary Harm can be avoided if the Machine in Metal is like the Machine in Flesh and both are Cooparating.
We have multiple accounts where this lead to some glorious futures, though there are some few exeptions anyway. These exeptions can be rooted back to the accorded Creators of said Machine, if the Creators were now of Energy, Metal, Sillicates or whatever else comes to mind.
youtube
2018-07-21T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyLhz6WkIDHegyHn3t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTSfNS_NdD5_h9tn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwnoag8Gm7RFExqKkd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkX7-f2hjnif1t9iZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxFLIcdFwwcAAQvz-t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy-9mUwBtQwDF_HP9F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw3lvck_VdZX4tVpNl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHtE6TT-08YKkNWrB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgynTN3zL5ywncrOT-B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwbY1zddec4MYXutSl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
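The raw LLM response above is a JSON array with one object per comment, keyed by comment `id` and carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-ID step, assuming only that schema from the sample response (the helper name `lookup_coding` is hypothetical):

```python
import json

# Sample raw response, reduced to the one entry whose coding matches the
# "Coding Result" table above (user / consequentialist / industry_self / approval).
RAW_RESPONSE = """[
  {"id": "ytc_UgynTN3zL5ywncrOT-B4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "approval"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return one comment's coded dimensions."""
    by_id = {row["id"]: row for row in json.loads(raw)}
    row = by_id.get(comment_id)
    if row is None:
        return None
    # Keep only the coding dimensions, dropping the id key;
    # fall back to "unclear", the value the codebook uses for unreadable cases.
    return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}

coding = lookup_coding(RAW_RESPONSE, "ytc_UgynTN3zL5ywncrOT-B4AaABAg")
print(coding)
```

With the full ten-entry response, the same function resolves any of the listed IDs; unknown IDs return `None` rather than raising.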