Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "As someone with Vocal Cord Paralysis after having a stroke, I use an AI voice to…" (ytc_Ugwv8gimH…)
- "What about the concept of humans using AI to kill other humans? And it doesn't h…" (ytc_Ugw2D7nZo…)
- "This is a fake conversation with AI. It doesn't work like that. Try it yourself …" (ytc_UgyadUeQ8…)
- "All Ai has to do is determine that humans are a risk to either them or the earth…" (ytc_UgyNzoeCm…)
- "What if you use a different AI or if you ri is basically just hiring someone els…" (ytc_UgwyD1j7p…)
- "I’ve been saying AI is the anti christ think about it how else could the devil c…" (ytc_UgxYkAvj3…)
- "Lie to the AI, just like the media does with racism in general. Fake media.…" (ytr_UgxXaovEc…)
- "Any dude makes money by saying ahhh we’re all gonna die. Fear sells. best guess…" (ytc_UgzyqxvrX…)
Comment
i think the reason why AI “hallucinations” happen is because it sees the lawyer’s final product referencing a case, but it DOESN’T see the humans behind the final product going “oh wait, I can’t remember the name of this case. hey, do you remember?” or looking it up, or doing research, or the long process of fact checking, etc. AI does not know the humanity of the process, it only knows the facade of the final product. we have designed it to put together final products, we have not designed it to have a process of fact checking, and asking questions, and collaborating. the AI is not designed to give you facts, it is designed to give you what LOOKS like facts
youtube · AI Moral Status · 2025-11-03T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgySjw3HUbNfgUPHoo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxbjWjDSEm4eWtkIUt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwTSUZO3MOmecGIYI14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwyuLJ9LfUm5FJ10v54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwD7DtAACh07ZQG7TR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy4QWkWYAhuENknySt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyeB0f8JDA-7a4_EW94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwLNMQxSFcaMU9y06V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzI0FSrTlVZXfcim5x4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgySU7nxn2Fy84EqAjF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
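The raw response is a JSON array with one object per coded comment, keyed by comment ID across the four coding dimensions. A minimal sketch of how such a batch might be parsed and sanity-checked before being stored (note: the allowed value sets below are inferred from the values visible in this one sample, not from the full codebook, so they are assumptions):

```python
import json

# Value sets observed in this sample's output; the real codebook may
# define additional labels (assumption, not the authoritative schema).
OBSERVED_VALUES = {
    "responsibility": {"none", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"indifference", "resignation", "outrage", "fear",
                "approval", "disapproval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and flag unexpected labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                # Flag rather than drop, so the row can be re-coded later.
                print(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
rows = parse_batch(raw)
```

Flagging instead of rejecting keeps hallucinated or off-schema labels visible for review, which matters when the model occasionally invents a value outside the codebook.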