Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The catch is that once the AI is advanced enough, it will be able to conceive of and then cover the "jailbreaks" that humans cannot. This is a temporary problem for the human programmer that will eventually be inverted, making it an infinite problem for a human to overcome. The guest made the error in logic because he's basing the problem on the human limitations which will not apply to the AI when solving the same problems.
youtube · 2023-12-27T01:2… · ♥ 48
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugz8li4V4W6fgbdEIiZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyiqauqHHWoR8ucTxV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzmpHaW2Z5BG3pUETh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}, {"id":"ytc_UgyDRqr6UwhXlMPV3Yt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxTwI6nlNYrvVcM0VF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzYzrfbpvrzmck-h1J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgwJFNjd6YiUqzux_7p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgwIqE7SbSgBQ-FK4sB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxeDETNX06VlgPlHSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzpSstLCAneHlPuXYF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"} ]