Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The real life consequences of the industrial revolution was that even in the 80s and 90s you could recognize big cities like Chicago by the huge cloud of smog that lingered above it. Was Henry Ford's polluting assembly line a bad idea because not only were the factories very noisy and polluting, they created cars that people could use to drive long distance to commit crimes and to this day are still killing 40,000 people per year and will likely not stop until computers autonomously drive them?
Human driven cars create terrible and horrific death experiences for humans and animals alike. As with assembly lines, data centers are still in their new phase which is awkwardness.
Source: youtube · Video: AI Moral Status · Posted: 2025-09-10T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwAAJTs6de1XsOs4Tx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxpHPqqz0D9BZsKa494AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxLzrGbFRpmv3WQqX94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzjeGO4RJa5wYrq_S54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyq6uF2hM03sP4tp_V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyN3e6KTrrykMS-d8N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxvuHUiQ3fwBxGJKq14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx57elO41HS9gnefe14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxLvkCy0tP7xpkgmG54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzEDeHfQ81GKfE83id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]