Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"what the AI is doing is massively outstripping our ability to understand what's going on in these systems..." A misnomer. We don't understand what's going on in these systems, because they're based on "fuzzy logic"; a necessary tradeoff is that we don't understand how they work, for them to work, because we're giving up part of the running of the algorithm to the algorithm itself.
youtube AI Moral Status 2023-08-20T19:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz8TjFfF1KbP-SLbfd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwduKgbYlwr8rpXyQ94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgylIWsTHYs2v5xc8554AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugza24H3yRtt0p0wRDZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxzG8zOKGZF9Bse01Z4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzPwHJCbEpISc9j1qZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzArzDu0tgB5SPh33d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy9XJ5rDAwaraxseGN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyJVMYav0z6qa0Jfr14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxRVIPjPcmwVNdM-QR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
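The raw response is a batched JSON array, so recovering the coding for one comment means parsing it and indexing by comment id. A minimal sketch, assuming only the field names and ids visible in the response above (the lookup logic itself is illustrative, not part of the app):

```python
import json

# Raw LLM response, truncated here to two of the records shown above.
raw = '''[
  {"id": "ytc_Ugz8TjFfF1KbP-SLbfd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgylIWsTHYs2v5xc8554AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]'''

# Index every coded record by its comment id for O(1) lookup.
codes = {rec["id"]: rec for rec in json.loads(raw)}

# Pull the record for the comment displayed on this page.
rec = codes["ytc_UgylIWsTHYs2v5xc8554AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
```

This reproduces the "Coding Result" table above (ai_itself / consequentialist / none / mixed) from the raw model output, which is a quick way to verify that the displayed coding matches what the model actually returned.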