Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "When the AI purposely answered test questions incorrectly. The problem is that t…" (ytc_UgwdW6wFr…)
- "I just don't see these LLM's as threathening. It's not intelligence. There's no …" (ytc_UgxqmiN8J…)
- "You’re a fool. Very foolish. AI lies and spreads misinformation. And people beli…" (ytc_UgxiWBkVa…)
- "I think the second one is a bit different. Obviously if an artist takes inspirat…" (ytc_UgwtAA1po…)
- "Yeah but it’s better for a corporation to put the blame on one human than have t…" (ytr_Ugz6wVBfH…)
- "LLM’s and large servers are training models that learn from the inside out; maki…" (ytc_UgzykRQUZ…)
- "No AI is not responsable, they lost a son and now they are looking for money i u…" (ytc_UgxCqS96M…)
- "I've had a Tesla Model 3 for the past 2 years, and although I didn't buy it for …" (ytr_UgzSj3kRk…)
Comment
People are already getting exponentially lazier and dumber. If AI starts doing everything for us, we can just sit in our easy chair all day and drool while watching TV. Heck...all of our muscles will atrophy and we will be lucky if we can even walk any more. LOL! That doesn't sound like a good life to me. Anyway, AI robots will most likely come to the conclusion that humans have no actual purpose on the Earth, and it will take us out.
youtube · AI Moral Status · 2025-11-15T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw3rw6_lLhOwXK4B9p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxuAskHLSP92pCFrS54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyyeKd58GGPmClx3bR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzFj_JfsDPLl4pi2Wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOnxJYAgZ835BW1w14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-NDgtC4ptswW6FDF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZqMtdvVoClufwnsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyymsPfdyYj6LuxL6x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx_VDbNZWpD1h8Ypf94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgySs4RPmQrUs-nyGHt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
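A raw response like this has to be parsed and validated before the per-comment codings can be stored. The sketch below shows one minimal way to do that in Python; the allowed value sets are assumptions inferred from the samples above, not a documented schema, and the `parse_codings` helper name is hypothetical.

```python
import json

# Allowed values per coding dimension. These sets are ASSUMPTIONS
# inferred from the sample output above, not a documented schema.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "industry_self"},
    "emotion": {"resignation", "indifference", "approval", "fear", "mixed", "outrage"},
}

def parse_codings(raw: str) -> list:
    """Parse a raw LLM response and validate every coded record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Example: one record in the same shape as the raw response above.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
records = parse_codings(raw)
print(records[0]["emotion"])  # fear
```

Validating against an explicit schema like this turns a malformed or hallucinated label into a hard error at ingest time, rather than a silent bad row in the coded dataset.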