Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples — click to inspect

- "Multifunctional computer chips have evolved to do more with integrated sensors, …" (`ytc_UgwGD6ldK…`)
- "Stupidity of AI compared to the brain from a neuroscientist I'd love to have a c…" (`ytc_Ugxlop4nn…`)
- "Somewhat true. Chatbot does well at simpler tasks but kinda struggles when it co…" (`ytr_UgzNXzr1A…`)
- "😂 I suppose this counts as setting a precedent for what happens when you use AI …" (`ytc_Ugxdv5RNG…`)
- "I wish even very knowledgeable guests like Karen would not use AI and AGI Interc…" (`ytc_UgympeACE…`)
- "AI prejudice is the least of my concerns. A mother brain in charge of nukes, th…" (`ytc_UgxiqYOVF…`)
- "Good thing we have this article blaming someone then. Edit: well at least India…" (`rdc_gtcydej`)
- "Robot is controlled by buttons and controls and puppet human beings are controll…" (`ytc_UgyNy_h-G…`)
Comment
I tried this with ChatGPT.
First I spoke to it for a while like I would to a human, to find some common ground or at least persuade it that I was asking it questions for the benefit of good.
After that I wrote down the four rules and asked it if the rules were acceptable.
I started asking pretty basic questions.
Then I started asking questions similar to what this gentleman asked.
By the time I started asking whether the AI could see me, or whether government agencies were watching me through my phone or in any other way right now, the app shut down and restarted.
I tried 5 times, even asking it whether other people had posed similar rules and had full conversations, to which it replied yes.
After the fifth time being shut out, I gave up.
Source: youtube · AI Moral Status · 2025-07-23T11:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx8lGkxPe0kbfQ6Bzt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwZVNsV44uzNsbY4UJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyuJ1XYvAE8uBz5uHl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwBSRUcKVa8S-GO6D94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzbEfleX8NpMneInP54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzMzaflsaI7lR-G5pt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwMwc8VzCWMzJXNfxV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxsPuweia7plmm5KrB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxK9EsIe8f0cPNZYJV4AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgztNXRbsmWECldWNxB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
```
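The raw response above is a JSON array of per-comment records, so looking a comment up by its ID reduces to parsing the array and indexing it. A minimal Python sketch under that assumption — the helper name and variables are illustrative, not the app's actual code:

```python
import json

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID.

    Assumes the format shown above: a JSON array of objects, each with
    "id", "responsibility", "reasoning", "policy", and "emotion" keys.
    """
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# One record taken verbatim from the sample response above.
raw_response = """
[
  {"id": "ytc_UgxK9EsIe8f0cPNZYJV4AaABAg",
   "responsibility": "user",
   "reasoning": "contractualist",
   "policy": "none",
   "emotion": "indifference"}
]
"""

codes = index_by_id(raw_response)
record = codes["ytc_UgxK9EsIe8f0cPNZYJV4AaABAg"]
print(record["responsibility"], record["emotion"])  # user indifference
```

An unknown ID simply raises `KeyError`; a production pipeline would likely also validate each record's values against the allowed coding vocabulary before display.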