Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
They're too obsessed with automating everything.
If nobody works, no one has any…
ytc_Ugw4NaMxu…
autocomplete was fine as it was before AI. What bugs me even more is that a simi…
rdc_nlz1jw7
Everybody talks about Open AI and Sam Altman and safety concerns, could also be …
ytc_UgyGkoX9U…
Hey @andriyhimi420, thanks for your comment! Glad you enjoyed the video. So, Під…
ytr_UgxO5vBjj…
Hey @Quantitative_Teasing, thanks for your comment! AI is indeed getting crazier…
ytr_Ugy399dcM…
I think emotions and intelligence are two different things. I think emotions can…
ytc_Ugxk_3AUF…
Hey brothers I work at a shop that we do everything from large format to …
ytc_Ugxb8JUSK…
Programming a robot with feelings is 1. Asking for a robot to misinterpret our w…
ytc_UgxTItdIO…
Comment
AI is being pushed to race far faster than is safe.
If we had a little patience and less greed, the world could actually build this thing into something beautiful.
That's not what is happening, we need to force companies to slow down with their need for the biggest cash cow ai land grab. We didn't ask for any of this AI. It's being pushed down our throats, because how else can the companies afford to pay for their billion dollar data centres. Well, it's the locals that foot the electricity bill, even when collectively residents could be using less electricity than the year before. That big fat centre is polluting your water and driving up electricity costs. That is just the start.
youtube
AI Moral Status
2025-12-15T00:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzE2eG0lakVQMLmQtd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxORgwt8mQqjAxl6bp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugww3pnixEdYVhaI4-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyi59VSh7djtMh2cqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw1yuRp7mCxI1pCcD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwzx8z2YcMR6qAhw7h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw1e5oFOL1yex218WB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy-BrjgZBiWFn5wdah4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwJp4Xy07PkcIaCD-54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRn0LdjlfiYv_p6Yd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
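The raw response above is a JSON array of one coding object per comment, each with an `id` plus the four dimensions shown in the Coding Result table. A minimal validation sketch is below; the per-dimension vocabularies are an assumption inferred only from the values visible in this sample, so the real codebook may contain labels not listed here.

```python
import json

# Allowed labels per coding dimension. ASSUMPTION: inferred from the sample
# response above; the actual codebook may define additional labels.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    for i, row in enumerate(json.loads(raw)):
        if "id" not in row:
            errors.append(f"row {i}: missing id")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"row {i}: {dim}={value!r} not in codebook")
    return errors

# Hypothetical single-row response, shaped like the sample above.
raw = (
    '[{"id":"ytc_x","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
)
print(validate_codings(raw))  # an empty list means every row passed
```

Checking each batch this way before writing rows into the Coding Result table catches malformed or out-of-vocabulary LLM output early, rather than letting it surface later as an unexplained dimension value.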