Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- good video / - while people are legit waiting for us to fill the void ...... / - … (`ytc_UgwZAQ8wn…`)
- I have several problems with a purely materialistic view of the world. For one t… (`ytc_UgwDWfbbr…`)
- Bro this is total bullshit because I put last selfie and didn't get anywhere clo… (`ytc_UgzoqF85E…`)
- one person dying is enough idc how good this shit is, also it should automatical… (`ytc_UgxraoflJ…`)
- I would challenge anyone to just read the Bible and then revisit Sam Alton, the … (`ytc_UgyYGSZ1y…`)
- Everything these idiots say is BS and they think it’s funny all of these AI idea… (`ytc_UgwDDa53z…`)
- The war in AI development is between the Autism Spectrum compromised individuals… (`ytc_Ugy6bHMqi…`)
- There's not enough information about these cars, this historical event just star… (`ytc_UgwibtZMM…`)
Comment
It is scary, no doubt. I dont know that anything can be done about these new weapons though. It is against the best interest of all large militaries today to accept a ban - any 1 other military force currently weaker than them could surpass them by ignoring it. It´s another arms race scenario already in the works. Besides, the costs of money and human lives could go down drastically in the short run, making it very attractive to politicians and soldiers alike - understandably, to be fair.
I strongly believe that anything that is technically possible will be done sooner or later. A ban or limitation may slow down the process, but also create the risk that the least scrupulous end up with more scary tech than more humane powers. Therefore, I dont think that it would be wise to slow down development of such armaments. If we cant stop it from happening, might aswell increase the odds of the technology not being abused (too much). Who knows, the power these machines have may lead to the end of war aswell - it is probably too optimistic, but 2 factors would play in favour of such a scenario :
1- war is rarer today than it´s ever been. There are few wars with relatively few casualties, compared to the decades and centuries before now. This trend may continue for various factors independant of AI weapons.
2- the wars that are going on right now are mostly civil wars. Intervention in them is usually a bad idea, but maybe, with high precision tools such as AI weapons, local human lives may be saved while no soldiers of foreign powers would enter the country and be endangered, making the idea easier to sell to the public of those democracies that may want to do something. Imagine smart drones in Syria, blocking the many factions from killing each other...
This may be way too optimistic and Ill admit I dont trust that this is likely, but, given the certainty that such weapons will one day soon exist, I´d rather have them controlled by those that wish for a peaceful and democratic world.... though I wouldnt mind waiting for the end of Mr. Trump´s term for the US to get their hands on them. So please scientists, delay until after 2020 (or, if the US are crazy enough to go for a 2nd term, wait for 2024). Please :P
Source: youtube · 2018-04-03T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugw0QS7E7tno3PFcub94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
 {"id":"ytc_UgyPqjdvlZmjGugvEnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwqmRfpWMiI7S74MDB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwKiYnjFrFjmgY2BYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwhbH5nbdQPIETSJ7h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyIfVkRJOUFGBPM6K94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugx9lQHy65E6ilBa9_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwyfaA-ijnkCtVPcqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgykYB7zSFgCalhpSFd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwjWdvb4RrIuqg9T1x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
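The lookup-by-comment-ID step above can be sketched in a few lines: a raw LLM response is a JSON array of per-comment code objects, so indexing it by `id` gives the per-dimension coding for any comment. This is a minimal sketch, not the tool's actual implementation; the two records shown are copied from the raw response above, and `index_by_comment_id` is a hypothetical helper name.

```python
import json

# Two records copied from the raw LLM response above (full IDs as emitted).
raw_response = '''[
 {"id": "ytc_UgykYB7zSFgCalhpSFd4AaABAg", "responsibility": "government",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_Ugw0QS7E7tno3PFcub94AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "approval"}
]'''

def index_by_comment_id(response_text):
    """Parse a raw response and index the coded dimensions by comment ID."""
    records = json.loads(response_text)
    # Drop the "id" key from each record so the value holds only dimensions.
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgykYB7zSFgCalhpSFd4AaABAg"])
# {'responsibility': 'government', 'reasoning': 'consequentialist', 'policy': 'none', 'emotion': 'fear'}
```

The printed dimensions match the "Coding Result" table for this comment (Responsibility: government, Reasoning: consequentialist, Policy: none, Emotion: fear).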