Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "They are as gods of old are merely imagined to be. We are mud in their hands."… (ytr_UgzCkg9Hn…)
- I swear I’ve read this as a short story where she taught the robot guitar… (ytc_Ugy7OmxaE…)
- It's not just efficiency also, but also lean upon the ethical side of the AI use… (ytc_UgytAQGU8…)
- Even though this video is comedic in nature, I still wanna thank you for putting… (ytc_Ugwni6_df…)
- The robot is as pissed as I am when I screw up a drum beat and start hitting my … (ytc_UgynoiNIY…)
- I would note as a doctor that eye pathology is pretty straight forward mostly an… (ytc_UgxlEj70w…)
- @NoLefTurnUnStoned. That is really an unenlightened statement. Either you don't … (ytr_Ugx52ygBp…)
- Hi Siram, we are sorry to say that you got the wrong answer but in any case, the… (ytr_UgyQccel-…)
Comment

> @idontwantastupidhandle1 Yes, maybe. Although an LLM uses feed forward within the decision trees to maintain context, which is analogous to conceptualization. Until we understand what the mechanisms are within our own minds that do this, who's to say we're not doing something similar? I work with ML, but have zero expertise with biological processing, so I can not say.

Platform: youtube · Video: AI Moral Status · Posted: 2023-08-21T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgwhzFL7fatPj_gK6mR4AaABAg.9tgaf5J52CC9thB1dpcR_i","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw0abhkaSiH9iKbGPR4AaABAg.9tgJ9fbJRWC9tmOkqTRA_h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyR8PEa5Zru6x8qbWp4AaABAg.9tgGfBiVsVq9tgKCYqeBob","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxYb6bfliNstI6_pRZ4AaABAg.9tgEcicEvDP9tgO2JtsV2m","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgweATzT4Ki4qXoDrR14AaABAg.9tgDaE4n4Hy9tgE1o8qCUP","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgxR3q8TZ3ELOo9hSJl4AaABAg.9tgDEJ_Te9u9tgULfbSRKf","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxR3q8TZ3ELOo9hSJl4AaABAg.9tgDEJ_Te9u9tgjE5nEZ91","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgxFSQlH52bt-hBPshN4AaABAg.9tgDAZXnZa79tgLcJM9LSB","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzPBnD4QT61u7jr0R54AaABAg.9tgC3Sl9SVh9tgXVTDs4w4","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytr_UgzPBnD4QT61u7jr0R54AaABAg.9tgC3Sl9SVh9tgb8ld1aTN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
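The raw response above is a JSON array with one object per coded comment, keyed by comment ID across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated before the codes are stored; the allowed values below are inferred from the samples shown on this page, and the project's actual code book may define more:

```python
import json

# Code values observed in the sample response above; this is an
# assumption, not the project's full code book.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear", "industry_self", "regulate"},
    "emotion": {"resignation", "approval", "indifference", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes
    all fall inside the known value sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be joined back
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical one-row batch for illustration:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
rows = validate_batch(raw)  # the single well-formed row survives
```

Dropping (rather than repairing) out-of-schema rows keeps the coded dataset clean; the rejected IDs can then be re-queued for another coding pass.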