Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Long, looooong time before it will be able to coordinate complex projects, like building musea (where so many branches and problems need to be tackled aesthetically, contextually, politically, heartfully - apart from technical, financial, legal factors, that could be possibly handled well by AI now), or arts in general - fine, live, whatever.. it cannot have an algorithm for a context of chaos, laziness, forgetful/irresponsible/pretentional nature of human being, greed, political/personal/historical/heritage context of relations between parts of the project etc. etc. etc. Lacking context meaning lacking identity... and now a bomb - consciousness. We, people, still live in a concept of consciousness, but not really knowing what it is yet. If we do not know, and AI can learn only from the research we have done initially, it cannot become conscious enough to feel and know what to feel. It is like a dog - it rolls on the floor because of the context of award, not because it feels like doing so in present. It will not produce a piece of music because it feels like doing so for the reasons of the fire in the belly. It will do it only for a command, and arts for a command is only arts for order - useful one. And things like arts are arts because most times we just do arts for the sake of doing it. Also we are lazy or scrutine at work because of our life complex context, not lack of it out of a random
youtube AI Governance 2025-09-06T20:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwG1x04_JTNHB3mb_B4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwVEOSOuDwlT39yEbh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz_zpNF0D9ej8gu1Bd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwO8SYJUmMR4wYWwXJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxyJZxgFQfObxaPNvB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyYYY7g4Nbw6dsnmll4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyIDou2wZq5x_vWdX14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxmqeh0uEX93S-Wtop4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgykVsA1Bp-8_pkvoZB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwBLDBipqdnN9vYxJl4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
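A raw response in this schema can be turned back into per-comment codings by parsing the JSON array and indexing it by comment id. The sketch below is illustrative, not the pipeline's actual code; it uses a two-record sample in the same shape as the response above (field names `responsibility`, `reasoning`, `policy`, `emotion` are taken from the output shown):

```python
import json

# Illustrative sample in the same schema as the raw LLM response above.
raw = '''[
  {"id": "ytc_UgwVEOSOuDwlT39yEbh4AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxmqeh0uEX93S-Wtop4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index by comment id so each comment's coding can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgwVEOSOuDwlT39yEbh4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself indifference
```

Indexing by `id` is what lets one batched response be joined back to the individual comments it codes, as in the result table above.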