Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
These idiots don't need to create actual AI for it to be dangerous. All they need to do is create something bad enough to mutate and propogate across the open web. It doesn't even have to have a nefarious purpose, it likely won't even have one whatever ends up making life real hard. Coders don't know all the possible outcomes of what they code that's what scares me the most.
Source: youtube, 2024-01-07T03:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzVqyB9SIVezavFaXV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyFWWHttt6GYJiYOGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhUmj-yjh_8_9hvlB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyifBFxDWceqRkbIrV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwymHr25Fn_Bm1BZap4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxQKayT2Xl5R6wE9ip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLU1huybkn4UXr5gp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyxrQyqC0j_LzOI7914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz9DWlmX0IDtYu9aGJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyIs8MgLDka7Z3zadJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
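A raw response like the one above can be parsed and sanity-checked before its codes are stored. The sketch below is a minimal validator in Python, assuming the allowed code per dimension is exactly the set of values that appear in this response; the real coding schema may define additional categories, and the `validate` helper name is illustrative, not part of any tool shown here.

```python
import json

# Allowed codes per dimension, inferred only from the values seen in
# this one response -- the actual schema may include more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "indifference", "approval", "mixed", "outrage", "resignation"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each coded comment record."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in this dataset carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

# Usage: validate a single-record response (first record from above).
raw = ('[{"id":"ytc_UgzVqyB9SIVezavFaXV4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
records = validate(raw)
print(records[0]["emotion"])  # fear
```

Rejecting a whole batch on the first bad code keeps malformed LLM output from silently entering the coded dataset; a retry of the coding prompt is usually cheaper than cleaning bad rows later.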