Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugys0Vak2…` — "WOW! A while back, I heard a teacher say you can't pay me enough to teach these …"
- `ytr_UgxdD8shT…` — "This is exactly what it's meant to do. No one write perfect code. And neither wi…"
- `ytc_Ugxm2HQYd…` — "I use AI to make me a boilerplate of what I want then I tweak it myself. I'm laz…"
- `ytc_UgwE0846l…` — "If she’s one of the original robots, this one is like a legendary character in a…"
- `ytc_UgyYqKWn8…` — "People who use ai art and say they drew ot are a piece crap. You know how hard i…"
- `ytc_Ugy5nSfkY…` — "I had an hour long chat with ChatGpt (who I know call “Chad”) and he mapped out …"
- `ytc_UgxiAhLWl…` — "Sure, there is automation involved, but the mistaken or careless kills are human…"
- `ytc_UgwHb1tUz…` — "apparently Larry Page wants to download his brain onto the AI mainframe , thats …"
Comment
This is a serious subject, but then the thumbnail for this is overly dramatic, and sensationalist.
I watched an informed video a few days ago which said AI is not just doubling in power every few months, but by 2030 it will be a "million" times more powerful.
This is something we need to seriously think about, and even in just saying this, I'm concerned I'm copping out.
Source: youtube · AI Moral Status · 2025-07-25T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
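A coded record like the table above can be sanity-checked before storage. The sketch below validates one record against the label sets that appear in this page's samples; note these sets are inferred from the visible data only and are not guaranteed to be the coding scheme's full vocabulary.

```python
# Label sets inferred from the samples on this page (assumption: the real
# coding scheme may include additional values not shown here).
ALLOWED = {
    "responsibility": {"developer", "user", "company", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def validate_coding(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the observed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes validation:
coded = {"responsibility": "unclear", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate_coding(coded))  # []
```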
Raw LLM Response
```json
[
{"id":"ytc_Ugw1oeAxGFqLaxGDsjB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzEFORFDygCW6lgzZh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwTw0VjpaHUBT5mXyR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaUNxw2WQRp_9NJBh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxMASTeNXTVODdfIm14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQzHE17R_I0FByS1d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxYRN_KjVSu6QyXZ_h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyXZDfW5slDN3j1qoR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyjMD04evYTMMJDDFB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuriQnjr2g2MHNvWp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
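The raw response is a JSON array of per-comment objects, so "look up by comment ID" amounts to parsing it and indexing by the `id` field. A minimal sketch (the `index_by_id` helper name is illustrative, not part of the tool):

```python
import json

# One row copied from the raw response above, kept short for illustration.
raw = '''[
  {"id": "ytc_UgyjMD04evYTMMJDDFB4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def index_by_id(raw_response: str) -> dict:
    """Map comment ID -> coding dict, dropping the redundant 'id' key."""
    rows = json.loads(raw_response)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codings = index_by_id(raw)
print(codings["ytc_UgyjMD04evYTMMJDDFB4AaABAg"]["emotion"])  # fear
```

Indexing once up front makes each subsequent ID lookup a constant-time dict access rather than a scan of the array.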