Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is eventually data that is not made by AI will run out, and if AI trains itself on its own data it will degrade. In order for it to stay fully connected to reality/stop itself from going crazy it will need humans. So while it may have taken our intelligence, it has not yet taken our senses, observational abilities, our grasp on reality!
youtube 2024-12-22T20:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugx9Qba2jIsjS53ko5N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz_ymExDkM6ldx12IJ4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwPa7u0z7plkOyoO914AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyi36BJsah9wPeNmIB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgypIKTBbhRW3o-PG2R4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzjWIfeRMBXLZi7KMV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw-JF_zkcP7hzhUpKd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwia7_tP5RXjKX1LEF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxd7O8S-QEwqu33pDR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyxQTLOBW2dIrb_6b54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
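The raw response is a JSON array of per-comment codings keyed by comment id. A minimal sketch of how such a batch can be parsed and a single comment's coding looked up, assuming only the field names visible in the response above (`coding_for` is a hypothetical helper, not part of any tool shown here); the embedded sample is a two-entry subset of the actual batch:

```python
import json

# Subset of the raw LLM response shown above (verbatim entries).
raw_response = '''[
  {"id": "ytc_Ugyi36BJsah9wPeNmIB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx9Qba2jIsjS53ko5N4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]'''

def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = coding_for(raw_response, "ytc_Ugyi36BJsah9wPeNmIB4AaABAg")
print(coding["emotion"])  # resignation
```

Matching on `id` is what ties each entry in the batch response back to the comment it codes, e.g. the `resignation` coding above belongs to the comment displayed at the top of this section.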