Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
While your point is somewhat valid, the problem isn't becoming conscious... the fear is them having goals or processes that hurt people but people do this already to themselves anyway so what would we need AI for ?
youtube AI Moral Status 2023-07-02T05:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugx_ssEaov1-UZ0rEBh4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyQU7P16n6lQbCuMYx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzwOeYtpZ214qbDpCF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzrF1pwTty9Aj1ABfd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx6txs8h2CKUA6SKdt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyZJmBCtnE68mBgV_14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx7Xx7I7-5mrmAEnbd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxdwXj18m_kOYrsPDN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwdU3Bp2OjYL70r0FJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyo-4332iVFQiVLd3Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
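The raw response is a JSON array of per-comment codes, each carrying the four coding dimensions keyed by comment id. A minimal sketch of how such a response could be parsed and a single comment's code looked up (the ids and field values below are copied from the response above; the dict-keyed-by-id layout is an illustrative choice, not necessarily how the tool stores it):

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw = (
    '[{"id":"ytc_UgyZJmBCtnE68mBgV_14AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"ytc_UgzrF1pwTty9Aj1ABfd4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
)

# Index the codes by comment id for O(1) lookup.
codes = {rec["id"]: rec for rec in json.loads(raw)}

# The first id appears to correspond to the coding result shown above
# (reasoning: consequentialist, emotion: fear).
rec = codes["ytc_UgyZJmBCtnE68mBgV_14AaABAg"]
print(rec["emotion"])  # → fear
```

Keying by id lets the displayed "Coding Result" table be reconstructed from the raw batch response for any individual comment.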