Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Know he knew better than to get his ass in the ring with a damn robot…" (ytc_UgwsjnGlX…)
- "She is so dead against Ai. She comes up on how Ai will fail at a task and will t…" (ytc_UgzU0g4Df…)
- "why do you want or not want? the AI has been trained to do so. for us its baked …" (ytr_UgxjpwjIH…)
- "AI art is NOT ART. It's just copying someone else's art style or work. Go argue …" (ytc_Ugyt_ZtXB…)
- "Creative jobs like artists took the first hit. AI won't just replace repetitive…" (ytc_UgwGdgXfz…)
- "How can these legislators have no idea of basic data centre requirements? its no…" (ytc_UgzSW4Q1x…)
- "The worst thing about Anthropic is how they want you to think they're the good g…" (ytc_UgzN-VU4v…)
- "I use ChatGPT-5 and Claude Opus 4.1 to speed up my PHP projects, but AI in 2025 …" (ytc_UgzC7265l…)
Comment

> this is the problem I have with so many anti-AI arguments, they always talk like we live in an ideal world that's going to be ruined when we actually already live in deplorable conditions

youtube · AI Moral Status · 2023-08-22T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgwhzFL7fatPj_gK6mR4AaABAg.9tgaf5J52CC9thB1dpcR_i","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw0abhkaSiH9iKbGPR4AaABAg.9tgJ9fbJRWC9tmOkqTRA_h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyR8PEa5Zru6x8qbWp4AaABAg.9tgGfBiVsVq9tgKCYqeBob","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxYb6bfliNstI6_pRZ4AaABAg.9tgEcicEvDP9tgO2JtsV2m","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgweATzT4Ki4qXoDrR14AaABAg.9tgDaE4n4Hy9tgE1o8qCUP","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgxR3q8TZ3ELOo9hSJl4AaABAg.9tgDEJ_Te9u9tgULfbSRKf","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxR3q8TZ3ELOo9hSJl4AaABAg.9tgDEJ_Te9u9tgjE5nEZ91","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgxFSQlH52bt-hBPshN4AaABAg.9tgDAZXnZa79tgLcJM9LSB","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzPBnD4QT61u7jr0R54AaABAg.9tgC3Sl9SVh9tgXVTDs4w4","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytr_UgzPBnD4QT61u7jr0R54AaABAg.9tgC3Sl9SVh9tgb8ld1aTN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
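The raw response above is a JSON array in which each record carries the comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed into coded records is shown below; the allowed value sets are inferred only from the samples on this page (not an exhaustive codebook), and the `ytr_example` ID in the usage line is hypothetical.

```python
import json

# Dimension values seen in the sample response above. These sets are
# assumptions inferred from this page, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear", "industry_self", "regulate"},
    "emotion": {"resignation", "approval", "indifference", "fear"},
}

def parse_raw_response(raw: str) -> list:
    """Parse a raw LLM response into coded records.

    Rows without an "id" are dropped; unrecognised dimension values
    fall back to "unclear" rather than failing the whole batch.
    """
    coded = []
    for row in json.loads(raw):
        if "id" not in row:
            continue  # every record must carry the comment ID
        record = {"id": row["id"]}
        for dim, allowed in ALLOWED.items():
            value = row.get(dim, "unclear")
            record[dim] = value if value in allowed else "unclear"
        coded.append(record)
    return coded

# Usage with a hypothetical record in the same shape as the response above:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(parse_raw_response(raw))
```

Falling back to `"unclear"` instead of raising keeps one malformed dimension from discarding an otherwise usable batch of codes.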