Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Firstly, this is a bad idea cause then how are we gonna get money? Secondly, the AI might have a slim chance of malfunctioning and not listening to their orders and start killing people. And lastly, they have to learn it and you have to write an entire script for them to do it while for humans you just tell them
Source: youtube · Viral AI Reaction · 2025-06-02T23:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           liability
Emotion          fear

Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugz4DrRYGeDo4-SWQYN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwV7RdHWYi6J9I1eS14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyMgEi1hVOFxgrxCJx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"sadness"},
  {"id":"ytc_Ugwt0wCOjd032cdidEd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXrOUlER0RLIQpQNR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyg6pij7w9bklOQbQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz4fcRlHbExFPqDdNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzkJdLNCwYcURjr_sx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyysxulZmurf86P1bJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxYQUiIFdx-XxvLQK94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
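The per-comment coding result shown above is pulled out of a raw batch response like this one by matching on the comment id. A minimal sketch of that lookup, assuming the raw response is a valid JSON array of records keyed by "id" (the helper name `extract_coding` is hypothetical, not part of the tool):

```python
import json

def extract_coding(raw_response: str, comment_id: str) -> dict:
    """Parse a raw batch LLM response (JSON array) and return the
    coding record for a single comment id."""
    records = json.loads(raw_response)
    by_id = {rec["id"]: rec for rec in records}
    return by_id[comment_id]

# A one-record excerpt of the raw response above, used for illustration.
raw = (
    '[{"id":"ytc_Ugyg6pij7w9bklOQbQt4AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"}]'
)

coding = extract_coding(raw, "ytc_Ugyg6pij7w9bklOQbQt4AaABAg")
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

The four dimension values returned for this id match the Coding Result table for the comment above.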