Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LOL! What fiction. They were afraid that the stock price would go down on their stock options if Sam Altman were no longer around. Since Altman was the programmer hyping "A.I." A.I. is a SV marketing term that markets fiction to the masses. It exists in the fecund imaginations of programmers everywhere looking to make a quick dollar. And they are successful. Programmers even hallucinate that "A.I." "hallucinates". No. It's the ages old well known programming maxim of? GARBAGE IN, GARBAGE OUT. G.I.G.O. Fake Marketing "A.I." is useful. The programming can be used to do things. But it is not an "intelligence". We can call, "A.I." BSI... B-ull-Sh*tting Intelligence. BSI lacks the physical and mental foundations of how intelligence came about. Emotions came before intelligence. No computer or program ever started out with emotions. Emotions are the building blocks. Physical feedback systems such as when our gut tightens in response to perceived danger, or the phenonema of "gut instinct". B-ull-S-h*tting Intelligence is none of that. It is simply programmerss hallucinating that they are creating "artificial intelligence". As a scam, this one is going to be worth hundreds of billions, if not trillions of dollars.
youtube AI Governance 2024-01-17T07:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        mixed
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugyv5Kodbqtd2Za3kqN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxnodQhTiLR1MqMw854AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzx3MHQPUeuD8coHtR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxr_7DtuDmeQA8R6GZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwkacI9_eHoJ5dSkBR4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugw8ugwX0J4qrBcuLPl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxJWL5SYOt7U4qgVC94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxmFXIff20X8KQiTRh4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyFHqCxE2UlL1IzQ3h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwSlLnTtSZsUuR01vN4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "none", "emotion": "approval"}
]
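To verify a coding result against the raw model output, the JSON array can be parsed and looked up by comment id. A minimal Python sketch, assuming the raw response is valid JSON; the helper name `coding_for` is illustrative, and the two entries in `RAW` are copied from the response above (truncated for brevity):

```python
import json

# Two entries copied from the raw LLM response above (truncated for brevity).
RAW = '''[
  {"id": "ytc_Ugyv5Kodbqtd2Za3kqN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxnodQhTiLR1MqMw854AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]'''

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id (KeyError if absent)."""
    entries = {entry["id"]: entry for entry in json.loads(raw)}
    return entries[comment_id]

result = coding_for(RAW, "ytc_UgxnodQhTiLR1MqMw854AaABAg")
print(result["responsibility"], result["reasoning"],
      result["policy"], result["emotion"])
# company mixed none outrage
```

The printed values match the Coding Result table above, confirming the stored coding came from this entry of the raw response.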