Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Anyone in 2030 plus freaking out , kid or other wise, No Taylor swift Breakup i…" — `ytc_UgyLrR6ym…`
- "An economic overhaul is coming. You can't have plumbers fixing electrician's plu…" — `ytc_UgwA7Y8bD…`
- "Ai sextortion? Lol this isn't true. Why does ai and the media lie? This is stup…" — `ytc_Ugxy2VT8q…`
- "A good place to start regulating harmful AI's is with the all-powerful Big Tech …" — `ytc_Ugy_5cQLf…`
- "The idea is to have an opinion. Just because you are in the taxi cab business d…" — `ytr_UgyWjAIrz…`
- "Ok so he feels safer talking to AI than a human. That explains a lot. Why blam…" — `ytc_UgxYGCKtS…`
- "Those are very poetic and artistic terms you use to describe humanity, but frank…" — `ytr_UgwldH0gm…`
- "He even went on lying he gets a lot of job offers 😂 Why would anyone pay a dime …" — `ytc_UgzibBSNc…`
Comment
Man writes a book about how superintelligence will kill us if it exists no matter what, but than tells us to be nice to it because maybe the thing I told you is definitely going to kill you will spare us because we were nice to an input output machine that predates it. ChatGPT only “remembers” because every message sends your entire chatlog with it. Disappointing to see the expert ignoring that because fear mongering gets more people to buy his book. I like the book and most of this chat but that moment really irked me
youtube
AI Moral Status
2026-01-08T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxXvP06xB_rvHXU8nl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxB2lUMC10V2WCKMdh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzG6m5nNk-ZQp4yPdd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdLgUpm0zqRww_36x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz9jWegCqJ5MLH9GXF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjHKweqa7s6ZC0JHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy-KZ4-7G2BKQOny894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy88yz9_C5B-z5vALJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-G5YAEcxVcUZLiZt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
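The "look up by comment ID" step above can be sketched as follows. This is a minimal example, assuming the raw LLM response is a JSON array of coded rows like the one shown; `index_by_comment_id` is a hypothetical helper name, not part of any existing tool, and the two-row sample below is trimmed from the response above for illustration.

```python
import json

# Trimmed sample of a raw LLM response: a JSON array with one object
# per coded comment, as in the dump above (rows shortened for brevity).
raw_response = """
[
  {"id": "ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxjHKweqa7s6ZC0JHB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
# Looking up a comment ID returns its coded dimensions.
print(codes["ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg"]["policy"])  # liability
```

In practice the IDs shown in the sample list are truncated (`ytc_UgyLrR6ym…`), so lookups would need the full ID as stored in the dataset, not the display string.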