Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- Ai steals from other people’s art and mashes it together. It has no “imagination… (ytr_UgxeNjvvK…)
- the most interesting take out is, that even if we manage to live happily while A… (ytc_Ugy7Qp4J0…)
- Translation: We could be humanitarian’s and use AI to build a future that rese… (ytc_UgzZl-4oe…)
- AI is being fed so much misinformation that it couldn’t tell it apart from accur… (ytc_UgzlxQS8D…)
- Claude has already replaced ChatGPT as #1 on the Apple Top Apps list (and Gemini… (rdc_o857sia)
- But you will be replaced by a Roomba, while the AI still requires real art to be… (ytr_UgztLheAh…)
- All the ads throughout this podcast have just been about promoting ai apps and c… (ytc_Ugwfq7MZQ…)
- The most concerning thing abouth this video, is that someone called this robot “… (ytc_Ugx3LnBHX…)
Comment

> What happens when a people meets anothet less technologically advanced people? The less advanced people is conquered and pushed around. Now ask this question again: what do you think will happen if one day we have a people who are not only more technologically advanced, but also have superior intelligence, cognitive abilities, and physical prowess, meets another people who are inferior to them in every way? Humans want to win and conquer because it's our natural tendency to do so, it's a part our psyche. Anyone who suggests that we should simulate our brain in an AI or robot is a fool.

youtube · AI Moral Status · 2019-05-24T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzaERbUY6aNv0bDWn94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyKYdbdZLIQtogqrGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzScYccHO5Bt5h9B714AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyesJQ9EnB4XZnPZyF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw6S4JWM68GaEcAKa94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwsK3U1Js6lsqygvYZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNmFP6KjDf5Rwnv6Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwCFn4HpjcCAWJbW214AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugyc49PobndzcEhmq7t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzPhNMxQ5gyc3xTg8l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
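Since the raw model output is a JSON array with one record per comment ID, looking up a comment's coding reduces to parsing the array and indexing it by `id`. A minimal sketch of that lookup, assuming the five fields shown in the response above (the `index_codings` helper and the two sample records are illustrative, not part of the tool):

```python
import json

# Two records in the same shape as the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgzaERbUY6aNv0bDWn94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyKYdbdZLIQtogqrGR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(text: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID.

    Raises ValueError if any record is missing an expected field, so
    malformed model output is caught before it is stored.
    """
    codings = {}
    for record in json.loads(text):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            raise ValueError(f"record {record.get('id', '?')} missing {missing}")
        codings[record["id"]] = record
    return codings

codings = index_codings(raw_response)
print(codings["ytc_UgyKYdbdZLIQtogqrGR4AaABAg"]["emotion"])  # fear
```

Validating the field set at parse time is what makes the per-comment inspection view reliable: a record that drops a dimension fails loudly instead of rendering an incomplete table.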