Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I am an ai researcher, I think we will be unable to pump the breaks until a catastrophic event happens. I just hope it’s not the end of the world when it does. Something very bad will happen and then hopefully we can stop the race and worry about building super intelligence slower and safer
youtube AI Governance 2026-01-14T19:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyzDNSJ6O58f0F6yjV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",       "emotion": "fear"},
  {"id": "ytc_UgxTxwT97XB1uyJbsCp4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "liability",      "emotion": "fear"},
  {"id": "ytc_UgyOx4qCgvsDbF2EJDh4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",           "emotion": "indifference"},
  {"id": "ytc_UgwlX2TibexKnnF_EJB4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",        "emotion": "approval"},
  {"id": "ytc_UgzqfJM0u6POmot46EJ4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",        "emotion": "approval"},
  {"id": "ytc_UgyZ79PqwjB5NGz3lFl4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "industry_self",  "emotion": "approval"},
  {"id": "ytc_Ugz243AINxoSrRnDcJ94AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "unclear",        "emotion": "outrage"},
  {"id": "ytc_Ugz7RRKW7csX3BHEIqp4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",        "emotion": "approval"},
  {"id": "ytc_UgzBk3UNmC6cyhVyGxt4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",        "emotion": "approval"},
  {"id": "ytc_Ugw_RyyF4WoHGnw4t0h4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "industry_self",  "emotion": "approval"}
]
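As a minimal sketch of how a raw response like the one above can be parsed and sanity-checked downstream (the field names are taken from the response itself; the `parse_codings` helper and the required-key check are assumptions, not part of the original pipeline):

```python
import json

# Raw model output, truncated here to two records for brevity
raw = '''[
  {"id": "ytc_UgyzDNSJ6O58f0F6yjV4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxTxwT97XB1uyJbsCp4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# Every well-formed coding record must carry these five fields
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse the model's JSON array, keeping only records with all required keys."""
    records = json.loads(text)
    return [r for r in records if REQUIRED <= r.keys()]

codings = parse_codings(raw)
print(len(codings))  # 2
```

Filtering on required keys rather than raising on the first malformed record lets a batch of codings survive one bad row, at the cost of silently dropping it; a stricter pipeline might log the discarded ids instead.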