Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Me: goes to jail for slapping a lady dressed as a robot on the boonky😂…" (ytc_UgwhbOYV0…)
- "Unfortunately it's AI and it sounds like he was mentally ill. That's not ChatGPT…" (ytc_UgxmpMoHj…)
- "The problem of a UBI is that it would only works if we eliminate inequalities an…" (ytc_Ugy28V-RJ…)
- "Eric talking about regulations for something that is already being pursued aggre…" (ytc_UgwY-tFU4…)
- "AI won’t kill jobs but the PEOPLE who govern and build such systems can (if they…" (ytc_Ugxs3-Piw…)
- "What else are these folks' to say? AI will take over, just as nuclear war; opens…" (ytc_Ugw4ZY6rq…)
- "Current AI, yes. The 'risky AI' part comes up when the AI gets to be smart. Unti…" (ytr_Ugw-vK8R4…)
- "Maybe you’re right and eventually being inundated with AI porn we’ll get bored o…" (ytc_UgwcVnSON…)
Comment

> They both want " the singularity "
> Here is the definition, you figure out whether or not this guy is EVIL!What is the singularity AI?
> The technological singularity – also, simply, the singularity – is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.

Source: youtube — "AI Moral Status" — 2020-03-19T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgyPHM8YrVM74SIeHEx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw67goKimKblWUKrTB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwmR6xCOQgpskoCjBl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxBvtitYczZ7-1Kg8J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwFeXWwqPYP4e9RhCN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzyEd9h7HCoWWZArrR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxzXu5LaOOWKipZSut4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw5mVllCjVWqQbHhud4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwayFVdnls_2HLtm214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugykyi8OoSF76JE_w7Z4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}]
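The "Look up by comment ID" view above amounts to parsing the raw model output and indexing it by comment ID. A minimal sketch, assuming the raw response is a well-formed JSON array with the schema shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the sample record is the first entry from the payload above, and the helper name `index_by_id` is illustrative, not the tool's actual API:

```python
import json

# Sample raw model output, mirroring the schema of the payload above.
# Only the first record is reproduced here for brevity.
raw_response = """[
  {"id": "ytc_UgyPHM8YrVM74SIeHEx4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgyPHM8YrVM74SIeHEx4AaABAg"]["emotion"])  # -> approval
```

If the model emits a truncated or malformed array, `json.loads` raises `json.JSONDecodeError`; a pipeline doing this lookup would typically catch that and fall back to a default code rather than crash.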