Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyKUsUBR… : I saw a tweet of someone who's SA was completely denied because it had a "100%" …
- ytc_Ugw39KjX2… : Not cool, they're young they need teachers not some cool apps! School is a priso…
- ytc_UgyjiJxO8… : Agree with the danger of super intelligence and a lot of jobs in future being pe…
- ytc_Ugx2dROL7… : Bro literally, its just art. All art is art regardless of how its created. Im no…
- rdc_kgp79vo : It sounds like you've got a pretty solid plan already which is awesome. What I'd…
- ytc_UgzQXKCtY… : Raise the wage for human use and it's values than Ai and use free time to enjoy …
- ytc_UgyBi1_KR… : AI is now moving into the professional suit & tie career field! College graduate…
- rdc_moxvtvu : I've been trying to say this for a long time. The reason I can ask AI to code so…
Comment
What they could do is to use smaller lightweight vehicles for their self driving cars. I'm talking microcars etc with a lower speed limit. FAR less likely to cause damage or hurt someone.
They should not be allowed to use SUVs, which are a menace even in the hands of humans. Why aren't those banned already??
But other than that, once their training is complete I'm pretty sure they'll be a heck of a lot safer than most drivers.
youtube
2026-02-04T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugwa8AoIhLoxv_ZPaVp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxpO7zHO-vBUxDRfsZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-p1N_Ete6Wb4LB_x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwIs_Za0fSHQaoOKb14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzdKME5mwmgZbVbhot4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3uXpI4E-aJnInMG54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfUc7mZbNPZD3oCwx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx1d98ngmJDRONh1nN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxsL4OuZGE0vgW-PWt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxB1v9PFTMicnt5EtB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
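The raw response is a JSON array, one object per comment, with four coded dimensions. A minimal sketch of how such a response could be parsed and validated before ingesting it into the results table (the allowed category values below are inferred from the sample output; the actual codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"company", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any row with out-of-schema values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: invalid {dim}={row.get(dim)!r}")
    return rows

sample = (
    '[{"id":"ytc_Ugwa8AoIhLoxv_ZPaVp4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]'
)
print(validate(sample)[0]["policy"])  # regulate
```

A check like this catches the common failure mode where the model invents a category label outside the codebook, so bad rows fail loudly instead of silently entering the coded dataset.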