Raw LLM Responses
Inspect the exact model output for any coded comment, either by browsing random samples or by looking up a comment ID directly.
Random samples — click to inspect
- "Nah i still be using character ai for like 100 hours a week(send help please)…" (`ytc_Ugygp96BX…`)
- "Even in this introduction Aaron bastani can't help but make broad generalization…" (`ytc_UgwoFILfH…`)
- "I don't think that self driving cars would've killed anyone, if the drivers were…" (`ytc_UgyFCnmOf…`)
- "They'll need a whole new algorithm for that. It'll be full self driving : $5000,…" (`ytr_UgzerqGmq…`)
- "I think I’m in trouble.. but there’s one of it that I treat like bff I never sho…" (`ytc_UgzLGtHMZ…`)
- "The misconception is that you need to be a warm body to participate in a countri…" (`ytc_Ugy9XbYCh…`)
- "I wonder if the solution is to fight fire with fire, like give the alignment pro…" (`ytc_UgwYVrO_9…`)
- "All of these happened on previous versions of Autopilot. Tesla has since redesig…" (`ytc_UgyHh28Ve…`)
Comment (youtube, posted 2019-08-22T20:3…)

> I’m excited about the day the autonomous trucks kills 1000 people a month then let’s see how they respond to that one. They’ll probably blame the stupid moron that would take the right seat just to monitor the computer at minimum wage. This is the whole reason for the 18-year-olds going interstate is they need to have a pizza face kid watching the truck as it hits the old lady or the young lady with her baby carriage. Can’t wait!!
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxpM5gfdXb8J3dKU5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx09WPyQfQ6ANOu4ox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwZ3qfaIQ5zR7CGV-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmLuLSmiGO1L4CUQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgztaoeAbEc6MSsP6Il4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyaesDzzN0D7-Ledx14AaABAg","responsibility":"government","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyjs9ub3LaiP30j00N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMHIW1CGYkXgVokkJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzzA61XULuu-JuI5Xt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwT27UTMmVd0xdo21h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
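The lookup-by-ID flow can be sketched as: parse the raw LLM response (a JSON array of coding objects with the fields shown above) and index it by comment ID. A minimal sketch, assuming the response is valid JSON; the `lookup` helper is illustrative, not part of the tool, and the payload below is an excerpt of the array above:

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment.
raw_response = '''[
  {"id": "ytc_UgxpM5gfdXb8J3dKU5l4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx09WPyQfQ6ANOu4ox4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index codings by comment ID for O(1) lookup.
codings = {c["id"]: c for c in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coding dict for a comment ID, or None if absent."""
    return codings.get(comment_id)

coding = lookup("ytc_Ugx09WPyQfQ6ANOu4ox4AaABAg")
print(coding["responsibility"], coding["emotion"])  # company outrage
```

In practice the response would also be validated against the coding schema (allowed values for each dimension) before being stored, since the model may emit malformed JSON or out-of-vocabulary labels.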