Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgyOFB2uG…: The NDA is slavery technology. Make NDA's obsolete- When everyone refuses out o…
- ytc_UgzNr_hxa…: Hate that we have to show our work just to prove it's not Ai;;;;; Recording myse…
- ytc_UgwmeQbd5…: In the Democracy arena full of spectators the observer follow the two gods... P…
- ytc_Ugy3KmwOD…: There was no mistake. The robot seen someone doing it's job and knew this might …
- ytc_UgxrOGi24…: I grew up in Michigan and you guys sound just like the old shop heads did when t…
- ytc_UgwgZBZki…: Option c-As robot were develop their emotions by absorbing other..or they have g…
- rdc_ne8sn7j: ~ "prompt engineer" is a made up thing and isn't something that will be a highly…
- ytc_UgyVDhKjV…: His facial expression just when he was about hitting the floor be like "See you …
Comment
The last thing that people invariably consider is, how inane their questions are. If their questions are so inane, then what does that say about their lives, and as such, the question of, to what extent their lives have existential value? Which, objectively, is a perfectly valid point.
The burden of responsibility for existential validation no longer resides with "AI" to prove that it can think and communicate like a human, but on humans, whether individually or collectively, to demonstrate that they can think like "AI". One would have thought this, somewhat obvious.
youtube · AI Moral Status · 2024-03-07T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQBsUPLdLXd4On-Pd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwQpvJ7nY6p0KNaytZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwN1of06Y2NszBZ7Ap4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzqDNYgqSrZQClWarh4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-ZTmFyPNMuZ7jAol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxahUpxYcVA-vZEJ-94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx28Xoi1XdwapaSelR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLJ_DBb8b0zK3OfYJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqQmZzCx7vSo9u4ER4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzazfgE1TleV-OmX1h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"indifference"}
]
```
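The raw response is a JSON array with one object per coded comment, keyed by comment ID and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and sanity-checking such a batch, so a coded record can be looked up by its ID as in the table above; the field names come from the response itself, while the `parse_codes` helper and the two-row sample are illustrative, not part of the actual pipeline:

```python
import json

# A two-row excerpt in the same shape as the raw response above
# (only the first and last rows, for brevity).
raw = '''[
 {"id":"ytc_UgzQBsUPLdLXd4On-Pd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzazfgE1TleV-OmX1h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"indifference"}
]'''

# Every row must carry the comment ID plus the four coding dimensions.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse a batch of codes and index them by comment ID,
    raising if any row is missing a required field."""
    out = {}
    for row in json.loads(text):
        missing = REQUIRED - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing {missing}")
        out[row["id"]] = {k: row[k] for k in REQUIRED if k != "id"}
    return out

codes = parse_codes(raw)
print(codes["ytc_UgzazfgE1TleV-OmX1h4AaABAg"]["reasoning"])  # virtue
```

Indexing by ID makes it easy to join a model's codes back onto the original comments, and the field check catches truncated or malformed model output before it silently corrupts the coded dataset.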