Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
8 out of the 10 most cited living AI scientists believe there is a significant chance of literal human extinction from AI. Timescales beyond 20 years aren't taken seriously, and 2-5 years is a common view. We have to take this deadly seriously. We have to halt frontier AI development at least until there is scientific consensus that it won't fucking kill us. Because we do not have that today!
Source: youtube · Video: AI Governance · Posted: 2026-03-22T02:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzxrtmujEu2J8CupXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwJgJv785QtXzYiIhN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxXvfrHxVO0Z0De-394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxMlcJuNDLp5KpRsxN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxOwXkbhYNDBUIRnQV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwGRwroF0bTqFXg3Wt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugys_ARzaXNOX-BiPz54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx3IpTd5oxoiO_6sRJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzDnOzvdUpAOqGwHYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgymI7SI3-OqGPlJkvB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
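A raw response like the one above can be turned back into per-comment codes by indexing on the `id` field. A minimal sketch, assuming the response is valid JSON (only the first two entries are excerpted here for brevity):

```python
import json

# Two entries excerpted from the raw LLM response above (truncated for brevity).
raw = '''[
 {"id":"ytc_UgzxrtmujEu2J8CupXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwJgJv785QtXzYiIhN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Build an id -> coding dict so each comment's dimensions can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

first = codes["ytc_UgzxrtmujEu2J8CupXF4AaABAg"]
print(first["responsibility"], first["emotion"])  # developer fear
```

The coding-result table shown for this comment corresponds to the first entry in the array; matching on `id` rather than position guards against the model reordering items.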