Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Came to this from a documentary about the Ukrainian attack on the Crimean bridge in July last year. The attack was originally carried out by five drones. But due to human decisions en route, three of those five drones did not have enough fuel to reach the bridge and had to be self-destructed. Of the remaining two, one missed its initial target, the railroad bridge, again due to a human decision. It came around and instead rammed load-bearing pillars under the road bridge. In the end, the bridge was severely damaged and could not be used at full capacity for a good while. But it didn't come down as was the intention, and as would have happened if all five drones had managed to hit it, and it was repairable. If these drones had been autonomous they likely would have brought the entire bridge down - and there is little doubt what the Ukrainians would have preferred. That is super scary to me, because it shows that it's only during peacetime that everyone agrees AI shouldn't be left with the power to decide to initiate an attack. During wartime that all changes, however, and without anyone batting an eyelid.
youtube 2024-07-26T19:3… ♥ 14
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugw3TXGskPD_3WOpH-94AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgzcuvLNh2KM3hv_tMV4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxscAVpBCkDCc7s6Hh4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_Ugy_vr3eNVutyjCpqOd4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgxO3YPdl8yx_GviUcd4AaABAg", "responsibility": "company",    "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgzooZgmczeIX2AshG14AaABAg", "responsibility": "user",       "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgzXYEZ3DR5JNrij5jh4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxdnoZxFPy27Feuqn94AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzRFFU5zxs67pWVj-d4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgyBisE2mX62IZBhHjR4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"}
]
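To go from a raw response like the one above to the per-comment coding table, the JSON array can be parsed and indexed by comment id. A minimal sketch, assuming the raw response is valid JSON with the five keys shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `index_codings` is hypothetical, not part of the actual pipeline:

```python
import json

# Excerpt of the raw LLM response above: one coding record per comment.
raw = '''[
  {"id": "ytc_UgzooZgmczeIX2AshG14AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# The five keys every coding record is expected to carry (from the table above).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_text):
    """Parse a raw LLM response (hypothetical helper) and index codings by
    comment id, skipping any record that lacks the expected keys."""
    records = json.loads(raw_text)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = index_codings(raw)
print(codings["ytc_UgzooZgmczeIX2AshG14AaABAg"]["emotion"])  # indifference
```

Indexing by id lets the viewer look up the coding for the comment shown above directly, and the key check guards against malformed records in the model output.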