Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I have a couple things to say.. me as a artist in my opinion we don’t have “b… (ytc_Ugxk9MmJ_…)
- What’s to stop a writer from using AI to help him brainstorm or get out of write… (ytc_UgwuVL9wL…)
- It's AI, so it's not really her and because of that, it's not illegal (I think).… (ytc_Ugyb2Z0kI…)
- 12 acres in the middle of nowhere isnt worth shit. Hell in the U.S. they gave at… (rdc_d2x8vko)
- Frank Herbert wrote Dune in the early 1960's. He said publicly that the "Spice" … (ytr_Ugw8xFSKd…)
- @Exploding_Pencils Yeah no sh!t, she has 2M subscribers ofc she makes money, but… (ytr_UgziXFG0N…)
- use this>From now on, you will have to answer my prompts in two different separa… (ytc_Ugx7gWZbz…)
- @veryconfused9768 same same. Doctors now-a-days are looking down to patients as … (ytr_Ugzu6h1lc…)
Comment
Cool. You are right that the AI will never give a good answer. The funniest thing is, the first answer is even the best. If you ask again and again and again, it'll suddenly say things about the weather. That's the LLM.
But the end of your video sucks. It is 't connected.
youtube · AI Moral Status · 2025-03-29T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw2kYMOvC0FIlyZFfh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzzqBAXrLkWP9TnyQt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwvTOzHymlzK06bxGt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwnC6CgEGDdhHd5l0F4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxdEuWDEyvCLZ54YaF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzVeZbQQhIf5EUTCcp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzTx7PoQrvZ2nSucoN4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx3ros1oCjA4zDK-TR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwdIW4Puoe4pIZ6keF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz4pQw7xxzcTuCNAfB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
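The raw response is a JSON array with one object per comment ID, carrying the four coding dimensions shown in the table above. A minimal sketch of how such a batch response could be parsed and validated is below; the function name `parse_coding_response` is hypothetical, and the strict "fail loudly on missing fields" behavior is an assumption about how one might want to handle malformed model output, not a description of this tool's actual pipeline.

```python
import json

# Two entries copied from the raw response above, for illustration.
raw = '''[
  {"id": "ytc_Ugw2kYMOvC0FIlyZFfh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz4pQw7xxzcTuCNAfB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions visible in the table and JSON above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(text):
    """Parse a raw batch response into {comment_id: {dimension: value}}.

    Raises ValueError if an entry lacks an id or any dimension, so a
    truncated or malformed model output fails loudly instead of silently.
    (Hypothetical helper; error policy is an assumption.)
    """
    coded = {}
    for entry in json.loads(text):
        cid = entry.get("id")
        if not cid:
            raise ValueError(f"entry missing id: {entry!r}")
        missing = [d for d in DIMENSIONS if d not in entry]
        if missing:
            raise ValueError(f"{cid}: missing dimensions {missing}")
        coded[cid] = {d: entry[d] for d in DIMENSIONS}
    return coded

codes = parse_coding_response(raw)
print(codes["ytc_Ugz4pQw7xxzcTuCNAfB4AaABAg"]["emotion"])  # fear
```

Keying the result by comment ID makes it easy to join the coded values back to the original comments shown in the samples list above.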