Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We built cars without airbags, seatbelts and many other safety features first. It's hard to find any human progress made without mistakes along the way, and it's easy to say "just make the landing gear for AI first" — but the problem is that no one knows what that would even look like, and there is an opportunity cost to just "slowing down", not to mention the question of how such a slowdown would even look/work in practice. I mean, what exactly would that entail? It's like telling physicists in the early 20th century to slow down with all that physics stuff so we have more time to figure out how we might deal with nuclear weapons down the line. That's just not how technology works.
YouTube · AI Moral Status · 2025-11-02T19:5…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           industry_self
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyjZyTJQdV33bw0vop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCMEtyTtZwynwkXrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxh3riF0-4UK4etQ0d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugw7UPSqMIu1xFiIUSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzrp8HbL5oyccS7tDh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyJZ5WYBWtWhye6KXN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw7_T-EMPRxzTRgF_N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy2ds2xE56wcAnbRrZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwRM1UtUh06iVVjG654AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw22W8hz_3dOr8fC7h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
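The raw response above is a JSON array with one object per comment, each carrying the four coded dimensions plus a comment id. A minimal sketch of how such a response could be parsed and validated before it populates the coding table — `parse_codes` is a hypothetical helper, not part of the actual pipeline, and only the record for the comment shown above is included here:

```python
import json

# One record from the raw response above, verbatim; a real response
# would contain the full batch of coded comments.
RAW = '''[
  {"id": "ytc_Ugxh3riF0-4UK4etQ0d4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "industry_self",
   "emotion": "resignation"}
]'''

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}},
    rejecting records that are missing any expected dimension."""
    out = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codes = parse_codes(RAW)
print(codes["ytc_Ugxh3riF0-4UK4etQ0d4AaABAg"]["policy"])  # industry_self
```

Validating each record against the fixed dimension list catches malformed or truncated model output before it silently produces empty cells in the results table.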