Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
But... Our P-Doom if not centered around a specific, like nukes or AI it's 100%! So the clock idea makes sense, but I feel like we're comparing apples and oranges, here? Also, if our doom were to come from AI, nuclear is very low probability of being the tool used by it; but could be our desperate response. AI would not use something that will hurt itself and the resources, while using a biological weapon can be selective.
youtube AI Governance 2026-02-25T06:1… ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyguxlSmhlIKh4gZdd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgweDRYen7rHTPUc3lR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgywEqcCcgDCABDNsJt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwWtSy0N1tjtMhxcZd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyVfVxsND3Ua3tNcqV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwrISWJ7hLjSvcP1Zd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwyK5F1g2Q8-W0m5Wl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzGSkrwJrn7aXPYS454AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzlL3VQZqpQHX0KRBN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw0R9HqqG275eEcUxt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
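The raw response above is a JSON array of per-comment records, each keyed by `id` with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for lookup by comment id; the helper name `index_coding_results` is an assumption for illustration, not part of any tool shown here:

```python
import json

def index_coding_results(raw_response: str) -> dict:
    """Parse a raw batch-coding response (JSON array of records) and
    return a mapping from comment id to its coded dimensions."""
    records = json.loads(raw_response)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

# Example input, trimmed to one record from the raw response above.
raw = '''[
  {"id": "ytc_UgywEqcCcgDCABDNsJt4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

coded = index_coding_results(raw)
print(coded["ytc_UgywEqcCcgDCABDNsJt4AaABAg"]["emotion"])  # fear
```

Indexing by id makes it straightforward to join a record back to its comment, as in the Coding Result table above for `ytc_UgywEqcCcgDCABDNsJt4AaABAg`.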