Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It’s not looking for fact, it’s looking for a probable next word. This fact must have something to do with hallucinations, in which AI gives examples that don’t exist such as examples of case law that don’t exist.
youtube AI Moral Status 2026-02-12T01:4…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgxrY6cuZrFBSusbi-x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugw8sowjpXgTzdWykyx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgyPJ9imSTCAbEhRInN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugy-yZJO-ag35zMEUT54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugz86_2xmgfpgeeqzOV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzA1htLDwcI98UK5PN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}, {"id":"ytc_UgxXugQ5OOkB-A3wXOx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgzIzT4Fckkg9LOT9RR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyTrzmW7vmGgs1xNMV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugx1ww5hXytFgp7q7eV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"})
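Note that the raw response above ends with a stray `)` instead of the closing `]`, which makes it invalid JSON; that is plausibly why the coding result falls back to `unclear` on every dimension even though the model emitted concrete values. A tolerant parser can recover such batches. The sketch below is a minimal, hypothetical example (the function name and repair heuristic are assumptions, not part of the pipeline shown); the field names match those visible in the response above.

```python
import json

# The four coded dimensions visible in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response that should be a JSON array of coded comments.

    The model occasionally terminates the array with ')' instead of ']'
    (as in the response shown above), so attempt a minimal repair before
    giving up.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        repaired = raw.rstrip()
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        records = json.loads(repaired)  # re-raises if still malformed
    # Fill any missing dimension with "unclear" so downstream code
    # never hits a KeyError on a partial record.
    for rec in records:
        for dim in DIMENSIONS:
            rec.setdefault(dim, "unclear")
    return records

# Example with the same defect as the response above: ')' instead of ']'.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"mixed"})')
coded = parse_raw_response(raw)
print(coded[0]["emotion"])  # mixed
```

Without the repair step, the `json.loads` failure would leave the whole batch uncoded, which matches the all-`unclear` table shown for this comment.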