Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm not belittling the probability of societal upheaval, but that pales in comparison to the risk of super intelligence ( IF it becomes self aware) in humanoid form. I've heard very little about the obviously huge risk that is. Historically consciousness means killing people bc they don't believe like you or have what you need. So why exactly isn't humanoid form ai top of the list of shit NOT to do??? How is this not priority #1 ???
youtube 2025-04-13T18:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgyU7l9A_1muHLAQMdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxUnazbFeuWL0pIXZN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz1duy4r3L69ffEZAd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxPOvDlR8RspM7dGO94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz7u3qZGFad9b45Xtd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzCxT1gu-yU0LfxDXl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwixxa_D5dPM2diiKd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwWUgc286ZJGAchseZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyuIljFsKBeJpbWNeR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzrd39tEyjjXSkYrPl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
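The raw response is a JSON array with one coding record per comment, keyed by comment `id`. A minimal sketch of how such a response could be parsed and tallied per dimension (assuming standard Python; the two records shown here are copied from the array above, not the full batch):

```python
import json
from collections import Counter

# Exact model output: a JSON array of per-comment coding records.
raw = '''[
  {"id": "ytc_UgzCxT1gu-yU0LfxDXl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwixxa_D5dPM2diiKd4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Look up the coding for a single comment by its id.
by_id = {r["id"]: r for r in records}

# Tally one coding dimension across the batch.
emotions = Counter(r["emotion"] for r in records)
print(emotions)  # Counter({'fear': 2})
```

Validating each record against the fixed dimension vocabularies (e.g. rejecting any `policy` value outside the coding scheme) before aggregation would catch malformed model output early.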