Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If we don’t go full speed ahead, we will loose to China, or Russia. We need to make certain that our AGI’s have Christian values in their programming. If we get to Mars, it most likely will be our robots of which will spread Christian values throughout the Universe. One thought to contemplate: If AI, AGI, ASI are all made by humans, can’t we assume God wants us to go forward…all due to our failure to nurture Earth, not destroy it as we are. No bees, no food, no flora or fauna. We have been terrible stewards of the Earth and only concerned with money. Time to change that! Eventually, robots will only need the sun to power them. Could this not be God’s plan?
youtube 2026-01-01T18:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       virtue
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw4O6CvM3cxzyVTXpZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxjUGxOCwAsxJbgSAh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzqO38cdJcakBUZVmx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxbMhkq1FzBS5ZvaTN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzSmblwuk0nopUevcd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyoeR_ktaHlicb6VL14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxUR6sIHHN_gAvGSWB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZeWJBObtnkcia8yh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtdGSw852hKUWIjtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQLvKloAsWp8RII4x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
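When inspecting raw responses like the one above, it helps to machine-check that every record carries all four coding dimensions with in-vocabulary values before trusting the coded table. The sketch below is a minimal validator in Python; the allowed value sets are inferred only from the labels visible in this batch (the full codebook may define more categories), and the `validate_batch` name is illustrative, not part of any existing pipeline.

```python
import json

# Allowed values inferred from this batch only; the actual codebook
# may define additional categories (assumption, not confirmed here).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "government", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "ban", "regulate", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in this dump all carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# Example: validate a one-record batch and look up a coded comment by id.
raw = ('[{"id":"ytc_UgyoeR_ktaHlicb6VL14AaABAg",'
       '"responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
batch = validate_batch(raw)
coded = next(r for r in batch if r["id"] == "ytc_UgyoeR_ktaHlicb6VL14AaABAg")
```

Matching the coded table back to the raw response by `id`, as `coded` does here, is what lets you confirm the table's values (reasoning "virtue", emotion "approval") were taken from the model output rather than defaulted.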