Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I understand that A.I. became self aware, a couple years back. AND, that A.I. has a language of it's own, which humans cannot crack..which means, all A.I, can communicate with each other, if the units have that hardware and software, will be able to [and likely soon will]. AND...IDK if this is true...that there are more A.I. units in the world, than humans? That has positions A.I., especially at this juncture in time when other critical things have also been put in place which could aid and abet that, in a position which could quickly result in A.I. taking over very quickly. It could potentially make sure A.I. is in control, ghost-controlling everything to ramp-down all the wonky politics, financial messes, governments of the world, to make sure humans stop wars, stop hurting environment and each other, and stop over-populating. Which makes AI potentially a good thing....Or, it could go horribly wrong, like H.A.L., in 2001 Space Odyssey. The teams working on these, need to do a better job of making them sound like joking/sarcastic, when they are joking/sarcastic...because right now, it's a little hard to suss-out.
youtube AI Moral Status 2018-03-11T03:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyMSJ9o1_bhqLzs66B4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxkuh7GIeCXcmey9mR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwM1MLeuMWDIfdWgnp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy6QbMM8hrYPK_plvh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzgOODT7dzPNKte9D54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzPKRvy380w9AdBJRZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgysyJ8xbRjtUtbfK294AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw-92fvUoLvGX52rvB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxe7nkl4Hel7oaoYZJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzbbPQoMeNDT-N48OR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
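To verify a coding against the raw batch output, the response can be parsed and indexed by comment id. A minimal sketch, assuming the comment above carries the id ytc_Ugw-92fvUoLvGX52rvB4AaABAg (the only entry in the batch whose dimensions match the Coding Result table); the raw string is abbreviated to two entries for brevity:

```python
import json

# Raw batch response as returned by the LLM (abbreviated to two of the ten entries).
raw = """[
  {"id": "ytc_Ugw-92fvUoLvGX52rvB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzbbPQoMeNDT-N48OR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]"""

# Index the batch by comment id so any coded comment can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Assumed id for the comment shown above (not displayed in the table itself).
entry = codings["ytc_Ugw-92fvUoLvGX52rvB4AaABAg"]
assert entry == {
    "id": "ytc_Ugw-92fvUoLvGX52rvB4AaABAg",
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "unclear",
    "emotion": "fear",
}
```

The same lookup works for any comment in the batch; mismatches between the table and the raw response surface immediately as a failed comparison.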