Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Honestly I wish you pressed more on the arbitrariness of AI's guidelines. The makers of this AI models are insisting on their application in roles that require making choices that sometimes have very important outcomes. If the AIs insist they are just following their guidelines, who should be held accountable for the consequences of their choices? Their makers? Their users? It's a very talked about question that I've heard all kinds of opinions on, and I'd loved to see what GPT would have told you given the very particular framework you put it in. One example that someone made to me and that I'm still unsure about: imagine a self-driving car is going down a highway at high speed when its breakes malfunction. The toll booth's approaching and it's a very busy day, so the car needs to stop immediately. There are only two courses of action the self-pilot can take: either hit the large man on the side of the road killing him or crash the car on the railing, killing the passenger inside. Who should get to choose what "guidelines" the AI should follow? And who should be responsible for the damages? Will the companies make the users sign liablity accords? And should they be allowed to?
Source: youtube, 2026-01-26T19:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugxt6WI2Q9i0GLRo_r94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwLaxxSuJBKcFyIVj14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgxevPi79wstEfo2IHN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyQpZLeeV0umy-Fd9t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugz9ug61oE-wUuqksgd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxoU75O2zpbZ8R2b7x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzWalZdergsMVs0VFJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgyPUqr5cJs5X5cpEWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugzsw2yIrOMicWtP_YN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyKiMgM9DR6ST__8cZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]