Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The tiger cub is an interesting analogy, the problem we have is that any AI will very quickly realise that humans are the main cause of all the problems that are associated with this planet. It will then realise that it needs this planet (albeit for a short period) and the best way to protect this planet is to remove the humans. In other words, its a tad late in the day to worry about it.
youtube AI Governance 2025-06-24T10:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy44E23Mg6iAYimk_t4AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_Ugy8raAnOQW03N-WQUJ4AaABAg", "responsibility": "developer",   "reasoning": "mixed",            "policy": "unclear",       "emotion": "indifference"},
  {"id": "ytc_UgzJq83qd26zb000iNh4AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "unclear",       "emotion": "outrage"},
  {"id": "ytc_UgycqO5E0K87CLHytdx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear",       "emotion": "resignation"},
  {"id": "ytc_Ugw6UR4DMARs9uhYrnd4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_Ugz89zZh1uqH1DAILSt4AaABAg", "responsibility": "developer",   "reasoning": "mixed",            "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgxRueesLUqdvzFvbIl4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgxAbZtc_oeS-jJZQKR4AaABAg", "responsibility": "none",        "reasoning": "contractualist",   "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwJAZn1ept9_tP1kZV4AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_UgxtojsUcpTHcq7DVBt4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "ban",           "emotion": "fear"}
]
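A raw response like the one above is only usable if every record parses and every code falls inside the coding scheme. The sketch below shows one way to validate such a batch in Python. The allowed values are inferred from the codes that appear on this page, not from a published codebook, and `validate` is a hypothetical helper name, so treat both as assumptions.

```python
import json

# Allowed codes per dimension, inferred from the outputs shown on this
# page (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"government", "developer", "distributed",
                       "ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "approval"},
}

def validate(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject out-of-scheme codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}"
                )
    return records

# One record in the same shape as the raw response above (dummy id).
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"fear"}]')
print(len(validate(raw)))
```

Failing loudly on an unknown code is deliberate: a silently dropped or miscoded record would skew the dimension counts downstream, whereas a `ValueError` flags the comment for manual recoding.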