Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is that the AI is only learning based on the info given, not real context. It doesn't actually understand, so if you as a human went toe to toe with a chess bot, as an example, it knows the moves that need to be made but not why; you do, meaning you can use that context against it. And we've already seen this sort of stuff happen.
YouTube · AI Governance · 2023-08-14T11:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxbkTqLkjU3uXlH5yV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzfaB7o_9NV7eTuvud4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzJE9RpB941BwPu5UR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz6DkWZnmJ8bc1cgy14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwSVeMt9jSBCsBJQ9p4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy4Y_1s-dX3uM54znx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxlX9V7X6EzlbA6WTl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxia4Nuzlppp-ihIAh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwefLJ2c6fGafnjCW54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx8coPjG5_AdbX1Qh54AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
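A raw response like the one above has to be parsed and checked before the codes are trusted. The following is a minimal sketch of such a validation pass; the allowed code sets are assumed from the values that appear in this one response (the project's real codebook may define more), and `parse_codings` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension (ASSUMED from this response only;
# the actual codebook may contain additional codes).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "mixed", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Every record should carry a comment id plus all four dimensions.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = (
    '[{"id":"ytc_Ugxia4Nuzlppp-ihIAh4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
codings = parse_codings(raw)
print(codings[0]["emotion"])  # fear
```

Failing loudly on an out-of-vocabulary code, rather than silently storing it, keeps malformed or drifting LLM output out of the coded dataset.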