Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "App uses invisible marker on the art that humans can't see. A AI program will se…" (ytr_Ugwx6RNer…)
- "Who gets fired when AI doesn't deliver what is expected? People will be employed…" (ytc_UgzQkAtNT…)
- "No way. Any possibility to control la AI in the future is barely a dream. In 20 …" (ytc_UgwnL7R2d…)
- "We should also remember that Palantir are now adopting the same technology and s…" (ytc_UgwnpqplG…)
- "Bmo : What's my purpose? Moe : You understand fun and take care of someone's ch…" (ytc_UgxWl51xd…)
- "12:27 What if i generate images to practice by learning how the generated art fl…" (ytc_UgwRCI7dk…)
- "Y'all are so fucked and confused. The first video Brett Cooper played is 100% re…" (ytc_UgxT2qOLg…)
- "Everyone has an opinion but they've never actually ridden a Waymo. It takes 10 s…" (ytc_Ugw84a-Ve…)
Comment
No, an indicator does not give a truck the right of way but any sensible driver without an ego would consider letting the truck pull in. I'm all for automation easing the mundane practices of driving but autopilot is a step too far because it lulls the driver into a questionable sense of security. In the event of a collision the responsibility rests with the driver....not the autopilot. Too many examples of where the driver has ceded the responsibility to the autopilot all together.
Platform: youtube | Posted: 2022-10-27T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwUH2cn8FTqeHkKOjZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyrOqFgruxL62RAES54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzZOcDAjivfPw8S2294AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx48LcYI2lXAcTkFfF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwDzB5DMgAwWb0lkyt4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwKGRsuAVqVnwYvBuJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzINdqp7Y-U939ej8h4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy0kzjaJBskfs6LzhF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz79WgPW3BRC-fKSqZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxTvoULI9GUO5QOjkN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```
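A batch response like the one above can be parsed and checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the `validate_coding` helper and the `ALLOWED` category sets are assumptions inferred from the values visible in this page (the actual codebook may define more categories).

```python
import json

# Allowed values per coding dimension — an assumption reconstructed from the
# values that appear in the table and raw response above, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "mixed", "approval"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Raises ValueError if any record uses a value outside the allowed sets,
    so malformed model output is caught before it reaches storage.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# One record taken from the raw response shown above.
sample = ('[{"id":"ytc_UgwDzB5DMgAwWb0lkyt4AaABAg","responsibility":"user",'
          '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
coded = validate_coding(sample)
```

Indexing by comment ID makes the lookup shown on this page a plain dictionary access, e.g. `coded["ytc_UgwDzB5DMgAwWb0lkyt4AaABAg"]["emotion"]`.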