Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
"A.I will help us control A.I"
= Disaster.
They cannot control …
ytc_UgzpwfVEW…
First it was Guns.
Then it was nuckear weapon.
Then it was Bioweapons and virus.…
ytc_Ugz5SvSiT…
Ai could replace lawyers, doctors, programmers, all professions based of intelle…
ytr_UgyOxjtwb…
The problem with real therapists is even more severe. You won't feel safe enough…
ytc_UgxVJY8uk…
Interestingly Geoffrey says exactly the same things as I do for years, while he …
ytc_UgyoQp2f6…
Says the man who is the #1 Ai pusher. Don’t trust these people, they’re freaks!…
ytc_UgwV4TYY6…
Guess which comment is AI:
1) Wow, it's getting harder to tell AI-generated visu…
ytc_UgwTiXIQ0…
I prefer the manual driving instead of predetermined self driving cars. It would…
ytc_UghIQezVa…
Comment
I think you missed a few important characteristics of autonomous weapon systems:
1. Autonomous weapon systems can wait for an indefinite amount of time before being used. People can’t do that.
2. Autonomous weapon systems can be much smaller than weapon systems that require people. This makes them much easier to hide and also much easier to move from point A to point B, in other words to project force.
3. Autonomous weapon systems can require a very small logistics tail. Any weapon system that has a large logistics tail is much easier to find and much easier to disrupt. Autonomous weapon systems need no logistics tail, or only a very small one.
4. Building autonomous weapon systems requires a much smaller industrial base, and many weapons of this type can nonetheless be very powerful. Weapon systems made out of cardboard can be built by almost any country in the world, unlike fighter jets manned by people.
5. Autonomous weapon systems, once developed, can be much cheaper to acquire and maintain. Almost any country in the world can afford small autonomous drones in relatively large numbers.
6. Autonomous weapon systems have a much lower threshold for the decision to use them. Since people are not involved, only machines, deciding to use them carries much less weight in the way of humanitarian considerations.
7. Traceability of autonomous weapon systems is much more difficult. Suppose a large number of autonomous weapons were suddenly directed at New York City from a freighter of uncertain origin. The people on the freighter may not even know that they were carrying autonomous weapons. The autonomous weapons themselves could be of manufacture from any country in the world. How would you decide who was ultimately responsible for sending them?
8. Using autonomous weapon systems does not drain your manpower reserves the way that human troops do. Look at Russia’s major problem right now in Ukraine, which is that they are having trouble finding enough people to send into combat. Plus, the more people they send into combat, the fewer people they have to work in their factories.
9. Manufacturing networks that make autonomous weapons have a much smaller footprint and can be much more difficult to identify.
youtube
2024-06-30T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw97rhpiyFgzSC01Yd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgylDNEt8IOjfP6FNQd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwplf0u7d9FaqIfmTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw1IGB_IeNATwlKrjN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx2l55P_HMj5grG4uZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxbcjrGqLTfUaGToJx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzM0pFFzSTxCUhRpgB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyUMqXXNhMxV3Kpyl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzt1_Nkx_Hetq1rNrF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxKeTNzsdybIJ_wkTB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
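A minimal sketch of how a raw response like the JSON array above can be indexed for lookup by comment ID. This is a hypothetical illustration, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown, and the `raw_response` string here contains only two abbreviated example entries.

```python
import json

# Example raw LLM response: a JSON array of per-comment codings
# (two illustrative entries, mirroring the structure shown above).
raw_response = """
[
  {"id": "ytc_Ugw97rhpiyFgzSC01Yd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgylDNEt8IOjfP6FNQd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and build an id -> coding lookup table."""
    codings = json.loads(response_text)
    return {entry["id"]: entry for entry in codings}

# Look up one comment's coding by its ID.
lookup = index_by_comment_id(raw_response)
coding = lookup["ytc_UgylDNEt8IOjfP6FNQd4AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> fear
```

In practice the model output would first be validated (e.g. checking that every entry carries all five dimension keys) before being written to the coding table.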