Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
52:58 The host describes how if there was a 1% chance that he got in a car that he might die, he wouldn’t get in the car? Poor analogy dude. Isn’t there always a chance of death when you enter a car? I think that lack of awareness of the consequences of our actions is the core issue with AI advancement. We don’t truly understand the probability of the negative impact, yet there we go, getting in the car.
youtube AI Governance 2025-09-30T19:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          unclear
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzpsbGOTuaqLy_FVHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwx_ijuGP8IFsg_vLx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxE7zGPba8gCF62oHh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzP3jv6YJ5jpCS4nsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyXTH9c-xjSAT-NESB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxiiqZ4rbxTA33OVNB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz41kDRGqE03CaXDb94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwhqY8FhELfTzri3PJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwDwJcZwabPJNq7gfx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxWyw8RiF3qQ_QxgS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
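A raw response like the one above can be parsed and sanity-checked before the per-comment records are stored. The sketch below is a minimal, hypothetical example: the function name `parse_codes` is not from the source, and the allowed category sets are inferred only from the values visible in this response (the full codebook may define more labels).

```python
import json

# Allowed labels per coding dimension, inferred from the values that
# appear in this raw response; assumed, not the official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    reject any record whose label falls outside the allowed sets."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} label {rec.get(dim)!r}"
                )
    return records

# Example with one record (id shortened for illustration).
raw = ('[{"id":"ytc_Ugz41kDRGqE03CaXDb94AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"unclear","emotion":"outrage"}]')
codes = parse_codes(raw)
```

Validating at parse time catches the common failure mode where the model invents an off-codebook label, rather than letting it silently enter the dataset.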