Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyX-4h3S…`: "Let's recap tragic foolish human history. The United States invents nukes. Has a…"
- `ytc_Ugy7VsEP7…`: "Some people are so quick to try to pass off fake scenarios like this as real, be…"
- `ytc_Ugz62ylOB…`: "Cool effects, but why would the droids sound like that when AI voices already so…"
- `ytc_Ugx1IZ1FR…`: "Sorry but I agree wholeheartedly with the having a robot lift weights for you an…"
- `ytc_UgyWsbCnn…`: "Regardless what the future outlook may be, what's most important in my opinion, …"
- `ytc_UgxA2xOUk…`: "you still have to use your brain whether you copy a snippet from an AI, from sta…"
- `ytc_UgwSOqYfn…`: "If you hit the robot with a taser it probably short circuit the computer on boar…"
- `ytc_Ugzk-zPig…`: "Personally, I think no one should legally be able to drive a car. All cars shoul…"
Comment
Ask John Searle, he's kinda the final arbiter. But FIRST, can your AI draw a full glass of red wine next to a full glass of white wine? Just NOW, mine could NOT. It does not THINK, it REGURGITATES, that's all. The LLM concept does not HAVE a link to "meaning," only to available behavioral attributes to mimic. That's not a BAD thing, just a fact.
youtube
AI Moral Status
2025-07-09T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwQt4_O5ySgBW8rpMV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwFD0KuGPPFTsrg8qV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyle4TGyngOoKvlG4h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxbEd1YkzQpMxQvPCt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxAbnpD-VzXKAA0gyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxNHwuWFeAg746-m1p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgybiMnBCnRv2fsPBEt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxNZqpoI7Njj7SP2Mp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypLgN1KzrR6XVUB6N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzNa4fLrU6q-fgAVJF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
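The "look up by comment ID" step can be sketched in a few lines: parse the raw LLM response (a JSON array of coded comments) and index the records by their `id` field. This is a minimal illustration, not the tool's actual implementation; the two records below are copied from the raw response above, and the field names match the coding-result table (responsibility, reasoning, policy, emotion).

```python
import json

# Raw LLM response excerpt: a JSON array of per-comment coding records.
# These two records are taken verbatim from the response shown above.
raw_response = """
[
  {"id": "ytc_UgxNHwuWFeAg746-m1p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgybiMnBCnRv2fsPBEt4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
"""

# Index the records by comment ID so a single coded comment can be
# retrieved directly, mirroring the page's lookup-by-ID feature.
coded = {record["id"]: record for record in json.loads(raw_response)}

record = coded["ytc_UgxNHwuWFeAg746-m1p4AaABAg"]
print(record["reasoning"])  # → deontological
```

The same dictionary makes it easy to cross-check a displayed coding table against the raw model output for any given comment ID.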