Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Comment
As Elon Musk once said "suppose we're building a road and there's an anthill in the path. We will just build over the anthill, but not because we hate ants; because they are in the way. I believe that a superior AI would see humans as the anthill if we were to stand in their way".
youtube · AI Moral Status · 2021-11-08T15:1… · ♥ 1865
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy-IL9NDoyfNb8KTVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxpnw4qSBhyHYFCS5d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwiNLlNrrbqozWD2XV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwtWkZwktMXbRQfuVN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPjC4v8koz4eJpfPZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx0aUNYTVz2HS9GXlt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxnyeZUNXvi13BIQgd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNSUzYsGk4FO3r9VB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzwg4G7sQG7kIYG-o54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxyPN6CIGKIqXNZ25d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
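A minimal sketch of how a raw response like the one above could be parsed and checked before its coded values are stored. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come directly from the JSON shown; the sets of allowed values are inferred only from the categories visible in this sample, and the real codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the full codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and return records keyed by comment ID,
    raising on missing fields or values outside the known categories."""
    records = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, allowed in ALLOWED.items():
            value = item[dim]  # KeyError here flags a missing dimension
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        records[cid] = {dim: item[dim] for dim in ALLOWED}
    return records
```

Validating at ingest time means a model that drifts from the coding scheme (a misspelled category, a dropped field) fails loudly instead of silently polluting the coded dataset.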