Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "There is one question that needs to be fully solved before I will support robota…" — ytc_Ugw6kggfj…
- "I feel like nobody understands that EMPs would probably be their best friend dur…" — ytc_UgwaDry6a…
- "The only hope we have is if we can become digital. Uploading our consciousness s…" — ytc_UgyWtWCEf…
- "I love AI because it's just doing its job if I'm not going to like the artist cu…" — ytc_UgyVn_yLb…
- "Weak, but possible given Continued Universal Human Cluelessness. Read my philoso…" — ytr_Ugy-Q5DKy…
- "The best article you could find was an opinion article from a 2nd tier newspaper…" — rdc_gst120r
- "What you mean is that humans have programmed AI to program themselves to do evil…" — ytc_UgyV1739l…
- "You are in a good spot, and cs is very important to learn, now with your knowled…" — ytr_UgwVjoBz5…
Comment
Oddly enough. AI for war is OK by me. Let robots kill each other, better than sending my kids to fight for big corpo or random government.
But yeah. Leave creativity to humans. You got AI for Menial labor, Tech industry work, Coding, playing our games, Cooking, going to war for us and all this other stuff. What does that leave us? If an AI bro can answer, that'd be awesome.
youtube · Viral AI Reaction · 2025-08-17T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxOqFgV870NZeRtKxN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwlvd_RZi3hoX_p4Y94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwAYSmhKjjjGzh_cEZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwgByddafwbgMiv5jB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwUzc-9Q40XqXVy7g14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyPf4sSVpOegtJWXYN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyGZ2JGv9oIA6vp8SN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGJkbwrsSVJquHKc94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwb98aF_Bp5hRbJ0jZ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwTVRPIaBGVwNmqfwR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
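The lookup-by-ID view above can be reproduced directly from a raw response like this one: parse the JSON array and index the rows by their `id` field. A minimal sketch, assuming the response is valid JSON with the five fields shown above (the two embedded sample rows are copied from the response; the variable names are illustrative):

```python
import json

# A raw LLM response: a JSON array with one coded object per comment.
# These two rows are copied from the response above for illustration.
raw_response = """[
  {"id": "ytc_UgwUzc-9Q40XqXVy7g14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyPf4sSVpOegtJWXYN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]"""

# Index coded rows by comment ID so any single comment can be looked up.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment by its ID.
row = codes_by_id["ytc_UgwUzc-9Q40XqXVy7g14AaABAg"]
print(row["policy"], row["emotion"])  # → ban fear
```

In a real pipeline the same index would be built from the full stored response, which is what makes the "Look up by comment ID" view above possible.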