Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "There's a nice video where one confronts an LLM-scripted NPC in Skyrim with him …" (ytc_UgxGOCJrP…)
- "I really love these AI videos, it makes me believe there's hope that we can resi…" (ytc_Ugzi8hyDy…)
- "Poetic how there's consensus Altman is just gassing people up, while ChatGPT is …" (ytc_UgyLHM0gr…)
- "I can't even begin to imagine how pathetic and mediocre you have to be as a huma…" (ytc_UgzqNVhV0…)
- "Better question why are we allowing an A.I. to dictate if someone is a danger or…" (ytc_UgzEOzerz…)
- "1:08:00. A kind reminder, when speaking of AI solving environmental problems, su…" (ytc_Ugxu7WuBn…)
- "There is a misconception (largely the fault of Tesla and Musk himself) that Auto…" (ytc_UgwmJa2Kg…)
- "Guys you can still make art- I have not seen a single ai art this whole time bes…" (ytc_Ugy4gtQ6u…)
Comment

> Does ai think when it doesn't have a question or problem it's trying to answer? Have they developed a virtual world for the AI to spend time in?

youtube · AI Moral Status · 2026-03-01T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgyTZLxAjX1JOqSFKDN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx7giDTBzm2AYgniCp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzxFPtalflIaRL05154AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyuJMienFlrXjaU8nl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyM1ZcmRyj_5pdZ1wN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgypIAMe5PrSMvl72uR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugys-eq6oFVODIyHltB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy86lMQFFGzPrqH6FN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw-RBqYdE27O3S0q1B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMVjuIsaEumzNd4s14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]
```
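The raw response is a JSON array with one object per coded comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a response could be indexed for lookup by comment ID (the helper name `index_by_id` is hypothetical, and the abbreviated two-row payload here reuses the first two objects from the response above):

```python
import json

# Abbreviated raw LLM response: first two objects from the array above.
raw = """[
{"id":"ytc_UgyTZLxAjX1JOqSFKDN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx7giDTBzm2AYgniCp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]"""

# The four coding dimensions reported for each comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse the model's JSON array and index coded comments by comment ID."""
    rows = json.loads(raw_json)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

coded = index_by_id(raw)
print(coded["ytc_UgyTZLxAjX1JOqSFKDN4AaABAg"]["emotion"])  # approval
```

A dict keyed on the comment ID is what makes the "look up by comment ID" view above cheap: one parse of the response, then constant-time retrieval per inspected comment.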