Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That's a brilliant quote: "you can't keep up with the speed of development, it's going to speed up from every year to every 6 months, 3 months, every month, then every week and then every day." That is the hyperexponential acceleration of ASI. One thought of my own I can contribute about the singularity: my guess is that once we have a direct uplink to the AI digital space, the singularity never occurs, because we are always cerebrally up to date. We can outsource things like learning by heart, since we always have access to the facts and the state of the art of scientific knowledge, which leaves us free for what the human brain is pretty good at. That will lead to the alloying and melting pot of the human brain with neural AI and its fundamentally accessible database of knowledge.
YouTube AI Governance 2025-10-07T20:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgydKu9yuwr6L3oGhAR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzTVeNXca2gygEX9o94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0625gpRxwygf8_mF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx3l_uHaYwIO1cmcCN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzda9HyzQ3bPuxVCGl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyBzeN9JNkzvQfgHjl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzMAFRQeipukKwnfzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx-kUrs0CLM_-N4zMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxPDEY7-nXy9OLqoFF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy3wZpKggohAm8h-494AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
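A raw response like the one above can be checked programmatically before it is accepted into the dataset. The sketch below is a minimal, assumed validation step (not part of the original pipeline): it parses the JSON array and verifies that each record carries the four coding dimensions with values drawn from the categories seen in this section. The allowed value sets are inferred from the codes that actually appear here; the full codebook may define more.

```python
import json

# Two sample records copied from the raw response above (truncated for brevity).
raw = '''[
  {"id":"ytc_UgydKu9yuwr6L3oGhAR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzTVeNXca2gygEX9o94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]'''

# Value sets inferred from the codes visible in this section; the real
# codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"mixed", "outrage", "indifference", "approval", "fear"},
}

def validate(records):
    """Return only the records whose coded values are all recognised."""
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in ALLOWED.items())
    ]

records = json.loads(raw)
valid = validate(records)
print(f"{len(valid)}/{len(records)} records passed validation")
```

Records that fail the check (e.g. a dimension missing or an unexpected label) would be dropped here; a stricter pipeline might instead re-prompt the model for those IDs.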