Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@3dholliday You're critiquing them, when clearly YOU didn't understand what they said. Here's what they said: When we humans have a car accident, we learn from it. But other drivers don't learn (since our knowledge... reaction times, reflexes, etc. stay inside our brain). If we try to "teach" others about how to avoid that accident, it'll take us hours/days to teach someone else... through repetitive practicing of that same accident scenario (road conditions, visual signs, other vehicles, etc.). However, when an AI driver has a car accident, then all AI cars (ALL OF THEM) will learn from that one accident. And all of them will learn from each/all car accidents... making them become much better, much faster. While this isn't 100% true today (one AI driver in one car is not connected to all other AI drivers in all other cars), but we're building in these interconnections into future AI cars. Today it exists only partially, for some car models/companies. But we can make them all "interconnected" very easily (we can do it in as soon as a few years, if the car companies agree to do it).
youtube AI Governance 2025-12-02T12:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgwFBRhG4wGhLWq3eQh4AaABAg.APd9UNjlbneAPfjYZdu39g","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwFBRhG4wGhLWq3eQh4AaABAg.APd9UNjlbneAPgW-4lmF6K","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwFBRhG4wGhLWq3eQh4AaABAg.APd9UNjlbneAQEGjMjSifT","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwoXAkcvF-h0utWPwh4AaABAg.APbnj-8Zr1oAQE9ZM-U__4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzNU0vnWUyB631VBFR4AaABAg.APYrYS9ofaKAPbLaMrnrN5","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzNU0vnWUyB631VBFR4AaABAg.APYrYS9ofaKAPg0Ww5RWbK","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgxZ-6VzPHArZxxLz7x4AaABAg.APWqebiTTIcATbdRdGPUjn","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugxh7JrvtYFVlQkE6zV4AaABAg.APW9dzxkQPzAPYxg2E-OIX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwQ5nYO_lm1W8lHNhF4AaABAg.APW3PiDhzKNAPW676LyNYb","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwQ5nYO_lm1W8lHNhF4AaABAg.APW3PiDhzKNAPW9N3btHfc","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
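The raw response above is a JSON array with one coding record per comment id. A minimal Python sketch of parsing such a response and validating each dimension; the allowed value sets below are assumptions inferred from the codes visible on this page, not the full codebook:

```python
import json

# Assumed per-dimension value sets, inferred from the records shown above.
# The real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "fear", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, validating values."""
    codings = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        coding = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        codings[cid] = coding
    return codings

# Hypothetical record for illustration.
raw = '[{"id":"ytr_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}]'
print(parse_codings(raw)["ytr_x"]["emotion"])  # → outrage
```

Validating at parse time catches malformed or off-codebook values in a batch response before any record reaches the coding table.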