Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I know this question was mentioned here, but... 90% unemployment even without AI means there are no consumers for the goods, which means severe over-production, which leads to a price plummet and/or forced dumping of goods. With AI, even before the deployment of AGI, the companies will find it unfeasible and should cease its use, which will defeat its purpose. If they don't, and if by then we are still alive, supposedly AGI will find a "solution", which from the AGI's point of view would ideally be the full elimination of humans. Someone mentioned greed in a negative aspect, but greed has always been the main driving factor in overall progress. Otherwise, we would still be hunters and gatherers. And, regarding crypto, I find his statement quite strange that because there is a limited number of bitcoin, it is a good investment. Like I saw in one TV program, an antique dealer said that just because something is old doesn't necessarily make it valuable. So just because there is a limited amount of something doesn't make it valuable. If suddenly the big investors lose their interest in bitcoin and switch to, for example, energy, especially in light of AI consumption, bitcoin can go back to $0.
youtube AI Governance 2025-09-07T02:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyUh9qnRrmW2tgy41h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz7zBCPIZhSrkx0GkB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzOiqKGaqYJnn4Ks-d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxkd0UMqfmHCGsnqOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwEVje370wz_RqiTZp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxICRujy7fb5COrM5J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzEuOD5IEP5x8MXK_B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgympumaGa5JYY3EpIN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx4BqeuMIp9f7MacR94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxaQ_ZQx6GqQGd38nV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
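Since the raw response is a JSON array of coded comments, the coding result shown above can be recovered by parsing the array and indexing it by comment id. A minimal sketch below, with the variable names being illustrative; the two entries are copied verbatim from the raw response, truncated to keep the example short.

```python
import json

# Two entries copied from the raw LLM response above (array truncated for brevity).
raw_response = '''[
  {"id":"ytc_UgzEuOD5IEP5x8MXK_B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxaQ_ZQx6GqQGd38nV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

# Index the coded records by comment id for O(1) lookup.
records = {r["id"]: r for r in json.loads(raw_response)}

# Look up the coding for the comment shown on this page.
coded = records["ytc_UgzEuOD5IEP5x8MXK_B4AaABAg"]
print(coded["responsibility"], coded["policy"])  # -> company regulate
```

The lookup returns the same four dimension values reported in the coding-result table above.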