Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "At about 16 minutes in you see two AI robots, playing monkey in the middle with …" (`ytc_UgyV2v6jK…`)
- "To be honest only education is not the correct qualification for people in admin…" (`ytc_UgyndHUl1…`)
- "all of them are still in test mode i hope waymo is paying the city to allow them…" (`ytr_UgwULbtWj…`)
- "Ai isn't even intelligent. It's just a program. People just trying to get excite…" (`ytc_UgwGBldi4…`)
- "When making an AI image, you aren't the artist in the situation, you're the comm…" (`ytc_UgyfLiKT1…`)
- "On some years from now ppl will use ai to diy things we buy from big companies…" (`ytc_UgyYow0N5…`)
- "I guess we need some sort of "good-guy-hackers" that will somehow crack all exis…" (`ytc_Ugz38PS9y…`)
- "Its wild, these tech billionaires think theyre going to sit on a throne and comm…" (`ytc_UgxsG0nR-…`)
Comment
> Here's the thing.... there are ALREADY really brilliant psychopathic humans with billions & trillions of dollars of resources influence and power. That have entire world governments in thier pockets, OWN every major monopoly, and have 100-100+ years of texhnology more advanced than the public has or can even dresm about..... soooo..... how is this any different?
> Humans can do good or bad. Ai can also act in a way that is good or bad..... ethical or non ethical. You're taking a gamble EVERY time you interact with ANY life form.
> Dog, cat, human, bee, alien or ai.
> & look, end of the day... a Narc is gonna narc regardless, but
> MAYBE it would be much less of a concern that they retaliate against us, if we dont act in ways that would cause something to find reason to have to..... like.... the mirror is jist pointing back at you so.... yea.
youtube · AI Moral Status · 2025-01-03T02:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx3uVIY1A5UhhXEgsx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhVUGKvG3qpPKM6wR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdqEm6-ctQ5D036Od4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySaYE2FEp2OCkiDjl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyHjWgxsFAHKxp6TsV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpVeGEb_PjEjwqa7V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0t-UPU5V0aTHtqc14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxUGoDCV5CToTUg3yZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRF0FwNAmdqD5HvkV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbzwmPeb1F6tG4-ml4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
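The raw response is a JSON array with one record per comment, each carrying the four coded dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a batch, assuming a code book restricted to the values observed in this sample (the real code book may define more categories, and `validate_codings` is a hypothetical helper name):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from this sample only (assumption).
DIMENSIONS = {
    "responsibility": {"distributed", "ai_itself", "unclear", "company", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the code book."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not rec["id"].startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id prefix: {rec['id']}")
        for dim, allowed in DIMENSIONS.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec[dim]!r} not in code book")
    return records

raw = (
    '[{"id":"ytc_Ugx3uVIY1A5UhhXEgsx4AaABAg","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
records = validate_codings(raw)
print(Counter(r["responsibility"] for r in records))
```

Validating before use matters here because the model output is free-form text: a single off-schema label (or a truncated array) would otherwise silently corrupt downstream tallies.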