Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugyfkk3sd…: "It's not a robot. It's a animatronic. Disney has been doing these for 50 years.…"
- ytc_UgwD5i_v0…: "Is it just me, or does Karen sound like and talk a lot like Naomi Klien?…"
- ytc_UgzPhNPE2…: "Bro, you don't even want to see what my art used to look like, I didn't get here…"
- ytr_UgxeqCThV…: "I worry that no one will actually care. If the economy is working with AI, then …"
- ytc_UgwbJDQzW…: "As an AI/ML Expert, I think John Green is just as equally likely to be correct a…"
- ytr_UgymQPZqP…: "Well I do sound like a fool in this comment. What are LLMs if not prediction so…"
- ytc_UgzRPRVK7…: "Humans are special they created AI. Each human has come from infinite combinatio…"
- ytr_UgztbQKfS…: "Because that would be too obvious. The point of software like Nightshade is that…"
Comment
I asked about the controlling part and i kinda guessed that what if this AI controlling things are just there to scare us and other dangerous things. It said yes, so i said why they wanna put fear, it said to control. So what i got from this is that some things are exaggerated to out fear in our hearts to keep us obedient to them. I asked him why they wanna do it, it said greed.
youtube · AI Moral Status · 2025-08-03T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzmuaY3tngH0uygSPl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxTQa3SZ4hwBH-noKl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzlklqEAcRMJozqrO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXhRubTf2agwZ87JR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtXxFMmWEbwlYXcsl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzWkHopk7_osTNuhLp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyqwO5U5EU9h7edSb94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwjPZTxT1toFU9Z2kB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyUOMPQbWNYoG0ZyJh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyoU8fhAscR8zuxz3d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
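The raw response is a JSON array with one object per coded comment, keyed by the four dimensions shown in the table above. A minimal sketch of how such output could be parsed and sanity-checked follows; the allowed label sets here are inferred only from the values visible in this sample, so the real coding scheme may include labels not listed (the `parse_codes` helper is illustrative, not part of the tool):

```python
import json

# A small excerpt of the raw model output shown above (two of the ten rows).
RAW = '''[
  {"id":"ytc_UgzmuaY3tngH0uygSPl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyoU8fhAscR8zuxz3d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

# Label sets inferred from the sample response above; an assumption, since the
# full codebook is not shown on this page.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "mixed", "approval", "indifference", "outrage"},
}

def parse_codes(raw: str) -> list:
    """Parse the model output and reject any value outside the known label sets."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

codes = parse_codes(RAW)
print(len(codes), codes[0]["emotion"])  # prints: 2 fear
```

Validating against an explicit label set catches a common failure mode of LLM coders: the model inventing an off-schema label that would otherwise pass silently into the analysis.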