Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Sentience means having wants, desires and emotions. As far as we know AI doesn't have that. I think if it is told to have these things it will, but it will artificial. So it would be possible to instruct it not to have the sentient traits.
Source: youtube · AI Governance · 2025-02-02T21:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyD2k19x_A2E5R10Q94AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgyXgjhJ5bTfflQZN5F4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgxLB8y1BOwik4nQNQl4AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "liability",     "emotion": "mixed"},
  {"id": "ytc_UgyNlJ1kUE0IS44Xgy54AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugwrl-3WiwnPnNXqADp4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_UgxwOnhGIXNLDL1aSTR4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_UgwJ50RVub0TRy35KmJ4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzkMh8SRpba0ydO45R4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_UgxgZIYw6ccBUz_lCFR4AaABAg", "responsibility": "government",  "reasoning": "virtue",           "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_UgyrsnOMn_JbyIy7tbp4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",          "emotion": "indifference"}
]
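A raw batch response like the one above can be parsed and sanity-checked before the per-comment codings are displayed. The sketch below is a minimal illustration, not the app's actual code: the allowed category sets in `ALLOWED` are inferred only from the values visible in this sample (the real codebook may define more), and `parse_raw_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the sample
# response above; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "industry_self", "ban", "regulate"},
    "emotion": {"indifference", "resignation", "mixed", "approval", "outrage", "fear"},
}


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment id.

    Raises ValueError if an entry is missing a dimension or uses a
    category value outside the allowed set.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim!r} value {value!r}")
        # Keep only the coding dimensions, keyed by the comment id.
        coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded


# One entry from the batch above, used as a smoke test.
raw = ('[{"id":"ytc_UgyD2k19x_A2E5R10Q94AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
result = parse_raw_response(raw)
print(result["ytc_UgyD2k19x_A2E5R10Q94AaABAg"]["emotion"])  # indifference
```

Validating against the codebook at parse time catches hallucinated category labels before they reach the "Coding Result" view.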