Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
if we get average human intelligence level of AI with being able to be upgraded to specialize (so training equivalent) working at similar pace and costing as much as 3 humans (it will work 24/7) then adding another AI agent will be cheaper than educating human.
If they start working on improving itself.... it's just a matter of time....
Platform: youtube
Video: AI Moral Status
Posted: 2026-02-21T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyzBgsoouLqTXg5rjF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyyEnflszydGdeT1tR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxoCfreGlx94lO7cXR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzZCF_rM8JgfMS4bNB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxU9BQfxa43Z_MbXcp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyr6zIW0zUs2aNldBl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwvkqnfXUqRLq5ma5N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZTXdP1_NEsKcsDqZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzPyBrYzzQS0fhc22l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy9aqh8NfsRzZklPUV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
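A raw response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical parser: the `ALLOWED` value sets are inferred only from the labels visible in this sample (the real codebook may define more categories), and records with missing or out-of-codebook values are dropped rather than guessed at.

```python
import json

# Allowed values per coding dimension -- inferred from the sample output
# above; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed",
                       "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "resignation", "fear",
                "approval", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into
    {comment_id: coding}, keeping only records whose values are in the
    allowed sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # no comment ID to index by
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Keeping validation separate from parsing makes it easy to log how often the model drifts outside the codebook, which is a useful quality signal for the coding run itself.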