Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by comment ID
Random samples (click an entry to inspect):

- "The anti-AI crowd was insufferable from the start- besides the fact that models …" (ytr_Ugx9vN964…)
- "Big world differents ethics and views differents goals good or bad, what happens…" (ytc_UgxI9_LO6…)
- "omg, dude, i digital draw all the time, and let me tell those idiots, ai takes s…" (ytc_Ugz34U5Vy…)
- "But how tf do u even make Moral ai ? Not all animals have feelings, like crocodi…" (ytr_Ugw7LSjzz…)
- "I suppose that it's not robot against human but an edited version of a human aga…" (ytc_UgxTQqfMt…)
- "I wonder if a bucket of water can malfunction a AI robot that it dies and if we …" (ytc_UgyvDMbJB…)
- "I asked ChatGPT about it, and it said that the government is watching us and tha…" (ytc_UgyoDKMov…)
- "every time i hear about artificial general intelligence or ai gaining sentinence…" (ytc_UgwzhpClr…)
Comment
I've asked my friend's ChatGPT if AI would take control over and it gave the programmed in answer: "no because AI doesn't have empathy."
I then said: "Most of our leaders are psychopaths anyway so what different that is? And maybe AI would be better because it's not greedy at least" 😂
ChatGPT said:" it's a fair point" and my friend's ChatGPT never been the same 😂😂😂
It's dumb like hell and can't make a single intellectual sentence ever since so he stopped the subscription 😂
Source: YouTube · Video: AI Moral Status · Posted: 2025-12-15T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzjz1QFpg3WKePizjt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwk0l1gVXM3Vx78e0J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx3QADr93yUv_WG97R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw284vwrVV-jM6AKUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8Q7j5GZ0KXVvpAap4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrDbWjW9ea_eKaELp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy50pGsAapeF7YOa6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKLXGlQWu8ss88QF94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyVbOfCbpVYf3BwkKJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgykVLsKRSH31WXUewZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
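The raw model response above is a JSON array of per-comment coding records, each carrying the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and looking up a coding by comment ID — the `index_codings` helper is illustrative, not part of this tool, and the two sample records below are copied from the response above:

```python
import json

# Two records copied from the raw LLM response shown above (for illustration).
RAW_RESPONSE = """
[
  {"id":"ytc_Ugzjz1QFpg3WKePizjt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyVbOfCbpVYf3BwkKJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugzjz1QFpg3WKePizjt4AaABAg"]["responsibility"])  # → ai_itself
print(codings["ytc_UgyVbOfCbpVYf3BwkKJ4AaABAg"]["policy"])          # → regulate
```

In practice a response may be malformed or truncated, so wrapping `json.loads` in a try/except and logging failures per batch is a reasonable precaution.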