Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I wonder what would happen if we safely created the most advanced AGI, but it was not functional until you flipped a "switch" that enabled the AGI to essentially "start your engines... go!". If you could set a finite amount of time the machine had available to learn and advance itself (increase its own skill levels), we could experiment with how quickly things turn dark, or how quickly the machine can go from a more basic AI to a true AGI. If you could air-gap the experiment from the WWW, and just simulate it with the majority of the content of the internet, the experiment could theoretically run its absolute course because it is sealed off from causing any real harm to the world around it.
youtube 2025-09-12T18:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgySogYMmrqcBDNuNVd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugwj9Qwqicq2m0KJH-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwhcZR2yflhbd8Z9AF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzcrBpVWZpyuWYkFL94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgwdUxe0xhsPCcZudQV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzjedkzVlGb9Pcp5DZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxxFh5AupAjOgm0aHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgyDD8AlnUAzgHZ-XyV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugx9GuBCvnURD55G53R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}, {"id":"ytc_Ugx4P_jAc1BXXwVH_nB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"} ]