Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Theres taking a risk than theres being stupid. If the ai is made to think like u… (ytr_UgzRBqrvN…)
- The NEON GAUD -- that A.I. in the CLOUD -- attained sentience on April 29, 202… (ytc_Ugy7WJC25…)
- The last generation raised without internet had troubles integrating, is that go… (rdc_jf7oux6)
- Human create AI and AI kills most of jobs. Government should ban AI to replace c… (ytc_UgymIi2jK…)
- Parents need to get their kids straight most of the ai use is by teenagers and k… (ytc_UgwyEaQSa…)
- I got an ad on this video for AI health support. Couldn’t be a funnier coinciden… (ytc_Ugy5rWTWC…)
- Ai is just the beginning, try to improve it. try make Ai better, or come up with… (ytc_UgxMzA8q-…)
- What if A.I. deems Humanity insignificant and turns humanity against itself? Zom… (ytc_Ugx7U2Rbx…)
Comment
+ProjectP Of course I understood what you wrote, and I agree with that, the AI misunderstood. But what I'm saying is that the guy didn't ask it "if it will.." but "if it wants to..", and given its answer it made me think of the difference between what you say and what you do, and how does that work with AI.
youtube
AI Moral Status
2016-03-24T16:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgjTIKP4jpSMNHgCoAEC.8BwHvbjaZLb8BxeT1yUazK","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UghiC9DLaHuJiXgCoAEC.8BuEtGyZm0t8BudnYbSl1J","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UggG5eHNK_n7f3gCoAEC.8BsH-vDG4eY8Bv9kxcCR4q","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UggG5eHNK_n7f3gCoAEC.8BsH-vDG4eY8Bxe7NYwByb","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UggaI60jwFS-V3gCoAEC.8Bpw3kfocb78Bq8sSVD8NZ","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgibQKlJl_eU9ngCoAEC.8BnPkKyycfn8BqwPySsEKz","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgggMndQdvdfPXgCoAEC.8Bn7jJOW2k88BoJ4VvwIxE","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgggMndQdvdfPXgCoAEC.8Bn7jJOW2k88BoNWfXWWWt","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UghnKF6FpqHR4ngCoAEC.8BmTs2qeUMS8CHC-7zeEyu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UghV9NnqvEaleXgCoAEC.8BjhpNrzTed8Bkuk6R1HaU","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
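The raw LLM response above is a JSON array with one coding object per comment. A minimal sketch of indexing such a response by comment ID for lookup (assuming the schema shown: `id`, `responsibility`, `reasoning`, `policy`, `emotion`; the `lookup` helper is illustrative, not part of any existing tool):

```python
import json

# Two records copied from the raw response above; in practice this string
# would be the full model output.
raw = '''[
  {"id":"ytr_UggaI60jwFS-V3gCoAEC.8Bpw3kfocb78Bq8sSVD8NZ","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgggMndQdvdfPXgCoAEC.8Bn7jJOW2k88BoJ4VvwIxE","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]'''

# Build an ID -> coding index so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if uncoded."""
    return codings.get(comment_id)

print(lookup("ytr_UgggMndQdvdfPXgCoAEC.8Bn7jJOW2k88BoJ4VvwIxE")["emotion"])
# → indifference
```

IDs that the model never returned simply come back as `None`, which makes missing or dropped codings easy to detect when reconciling a batch.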