Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why doesn’t some of the Hollywood producers and directors create a movie/s about AI and the estimated progression/destruction of the world? People love a good movie, and as well as the “entertainment” (horror?) value, it would also educate the masses on what might happen if we all stay on the same track. I know we have the terminator series, but maybe something slightly more realistic (although I guess killer robots are not out of the question). As we (the people of the world) move forward, we need to ensure that governments (our governments) around the world stop/slow the development to structure AI in such a way that it doesn’t wipe us out. I’m all for advancements in education, medical etc. I’m just concerned (like many) that AI will become a real problem in one or more ways and we will regret ever going down this path.
youtube AI Governance 2025-07-22T22:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          industry_self
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
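A coded result like the one above can be sanity-checked against the label sets the coder is allowed to emit. The sets below are only the values observed in this batch (the full codebook may define more), and the `validate` helper is a hypothetical sketch, not part of the pipeline:

```python
# Label sets observed in this batch; the actual codebook may include more.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def validate(coded: dict) -> list[str]:
    """Return the dimension names whose coded value falls outside ALLOWED."""
    return [dim for dim, ok in ALLOWED.items() if coded.get(dim) not in ok]

# The coding result shown in the table above passes cleanly.
result = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "industry_self", "emotion": "fear"}
print(validate(result))  # []
```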
Raw LLM Response
[ {"id":"ytc_UgwqkHsrcA8KLfJ_ggN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgyYRbkcXLRcog5OkEd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugx16MIn8-WgZWB2RUN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxkeyJO2zrZq-zOFBx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwLTgdxI8Z09S9NPIZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugx_aOGH01Def16O4lB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgxtffePsGnEI5G_YIN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxAgLji_cuRyM684Sl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgxT_PgGXOAEAtbfYpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}, {"id":"ytc_Ugxb15q3Bx29kr2BhYJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"} ]