Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What the actual eff, will someone please explain to Dr Wolfram basic Intellidynamics and iterations thereof? And send him a copy of Kevin Kelley’s book “What Technology Wants”? If computational irreducibility does not easily help us grasp complex Intellidynamics, it may not help us at all with extreme ai alignment issues.
Source: YouTube · AI Governance · 2025-03-24T07:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       mixed
Policy          unclear
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugx5mhqb_lSeZWtQve54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzb2qOXs9QeyJVXFuB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzci4azcKKznmKT4Pt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyFada6LeqqUOef58R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyMqi1HzvbCRAUHhZF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzhPjRIQCy-Y-ojdDN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzoNUAH6G4MvLVfH9V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwdY12QxCZKeylSuYV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzVmoWLn37HGGM2Cxh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzGDG9HkSk1yB6j5I94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
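A minimal sketch of how a raw response like the one above might be parsed back into per-comment codings, assuming the model returns a JSON array of objects keyed by comment id (the two ids shown are taken from the dump; the `codings` name is illustrative):

```python
import json

# A shortened raw LLM response, copied from the dump above:
# a JSON array with one coding object per comment.
raw = """[
  {"id": "ytc_Ugx5mhqb_lSeZWtQve54AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzhPjRIQCy-Y-ojdDN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding result for the comment shown above.
coded = codings["ytc_Ugx5mhqb_lSeZWtQve54AaABAg"]
print(coded["responsibility"], coded["emotion"])  # the values shown in the table above
```

Indexing by id makes it easy to cross-check any displayed coding result against the exact model output it came from.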