Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
only solution about stopping ai evolution is killing all the scientist who are building this ai to prevent destruction of the world and save the earth... this is the only best solution we can do now, no ai exist if no one build this... we must start to stop them before they create a program that ai cant stop what he can do and know everything how to defend them selves... no government in each country will listen if you told them to stop or slow down in building this ai...because every big nations or powerful nations is fighting of whome gonna lead the world... and to become the superior in all is to do any risk like building such a powerful tool to beat their enemies, this is the dangerous mind they always think to dominate everyone... no one in earth think to slow, or stop, or to be safe in creating anything, they only think to become the most powerful on earth... they will never think that the only powerful on earth will be left on earth is the weapon they created... not humans, not animals and not any leaving species on earth, only the weapon they left behind of their stupidity and greediness of their ambitions... and the worst thing is, earth will also face the same destiny like other planet in the universe, it will also explode into pieces...
YouTube · AI Governance · 2023-07-07T05:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           ban
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgypHQZ-V1uGXqxD3tN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJOvyylzbh6aklK0d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzdW3cFNTOIkWk5ish4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxeIYnBxrFyaxwG6-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxgB6iZISeffI56bx94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxoQyX5o4oolprCv7N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz6-M8Z51Lmgki0L6l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsWSdlIxdqLur8BqN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy0tHKF748SnGMR5wB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzxPMo5NLOe6EBKHuh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
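Since the raw response is a JSON array of per-comment codings, the coding result shown above can be recovered by filtering the array on the comment's id. A minimal sketch, assuming the response is valid JSON and uses the field names visible above; the helper name `coding_for` is hypothetical, and `raw` holds an excerpt (one row) of the array:

```python
import json

# Excerpt of the raw LLM response above: a JSON array of per-comment codings.
raw = '''[
  {"id": "ytc_UgzxPMo5NLOe6EBKHuh4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "ban",
   "emotion": "outrage"}
]'''

# The four coding dimensions present in each row of the response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id, or raise KeyError."""
    for row in json.loads(raw_response):
        if row["id"] == comment_id:
            return {dim: row[dim] for dim in DIMENSIONS}
    raise KeyError(comment_id)

print(coding_for(raw, "ytc_UgzxPMo5NLOe6EBKHuh4AaABAg"))
# {'responsibility': 'developer', 'reasoning': 'consequentialist', 'policy': 'ban', 'emotion': 'outrage'}
```

Looking up the coded comment's id returns the same developer / consequentialist / ban / outrage values the Coding Result table displays.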