Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
1:01:20 completely agree on the definition problem. We don't know what intelligence is so we cannot know when AGI happens. Everything in this conversation concerning AGI is not AGI itself, but autonomous, human-like functions.
youtube 2026-02-07T17:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyNMOe8DZZbrW2ojXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzjnAL_z7cnZO9h_Xd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgwZ6oP1XKBi6g6Yo0V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxi5qNM_fiCDh9xR-V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw2L9_kkxrCAOvNcU54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwfAR84dXk5u_y8dst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzI80SYjPu8KgTFEex4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgybpUSpNGcdKJ98QoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwrv3mbXkzVdlQ5klt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzRosrgxQZdsV6tRI54AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
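A raw response like the one above can be parsed and tallied per dimension with a few lines of Python. This is a minimal sketch, not part of the coding pipeline shown here: the field names (responsibility, reasoning, policy, emotion) come from the JSON above, while the abridged two-item sample and the tallying approach are illustrative.

```python
import json
from collections import Counter

# Abridged sample in the same shape as the raw LLM response above
# (two of the ten coded comments, copied verbatim).
raw = '''[
  {"id":"ytc_UgyNMOe8DZZbrW2ojXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwrv3mbXkzVdlQ5klt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

codings = json.loads(raw)

# Tally each coding dimension across all comments in the batch.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    counts = Counter(c[dim] for c in codings)
    print(dim, dict(counts))
```

Running this against the full ten-item response would give the per-dimension distributions for the whole batch rather than the two-item sample.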