Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugw6kHfOX…: "Y’all acting like they just replaced their entire art team with AI. HoYoverse bu…"
- ytc_Ugx1z97lQ…: "How do people learn how to draw? Locked in a cave for 8 years with a pad and pen…"
- ytc_UgydwbK68…: "0:05 AI generated person, using an AI generated script, talking about AI taking …"
- ytc_Ugy5ZprQX…: "One of the fastest transformative technologies in human history was electricity,…"
- ytc_Ugx89do9U…: "God knows where AI will be in 10yrs. You probably won't be able to spot an AI im…"
- ytc_UgicXkrK5…: "Aren't most of these "tests" really just reflections of what we think a computer…"
- ytc_UgwB3bTa_…: "AI can have my job! It’s shit 💩. I’ll happily do something more meaningful in my…"
- rdc_j5xe35c: "Chatgpt is not conscious, it won’t take the initiative and start conversations f…"
Comment

> I think if you bet on humanity being benevolent, we’re doomed. Theres almost no realistic scenario where these people attain this power and will willingly divvy anything out to anyone. Even if this super artificial intelligence turns out to be not even possible, the wealth gap will be so tremendous that we basically go back to super rich and dirt poor but this time the rich have all resources (food, water, infrastructure and currency) and have tech like surveillance and AI to keep everyone in a cage. I dont see how we get out of this besides a total coup of government and systems. Anything besides that doesnt seem like it will be enough to change the course we’re currently on (and headed there at tremendous speeds i might add).

youtube · AI Governance · 2025-09-05T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxpVgWmGoJyARsmaxl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzA_XDcXYbP9pulHw14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwAvudo8WdxvuEOIyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxGII4nsOB5n5rcwJ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzMAV_NNraj8qXBapd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzOh2WsptWSTmlHjRh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzZcrDNmGybBc_hPPp4AaABAg","responsibility":"ai_itself","reasoning":"none","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBbtg8kE7-P6xvFSl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwlJZHhqriFWsH-UDl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwzC3aBA_kla91t8rt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
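The raw response above is a JSON array, one record per comment, with an `id` plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated follows; the allowed code values here are an assumption inferred from the values visible in this dump, not the tool's actual codebook, and `parse_llm_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Assumed codebook, reconstructed from the values seen in this dump only.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"none", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def parse_llm_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response (JSON array of coded records) into a
    {comment_id: codes} mapping, dropping records whose values fall
    outside the assumed codebook."""
    coded = {}
    for rec in json.loads(raw):
        codes = {k: v for k, v in rec.items() if k != "id"}
        if all(v in ALLOWED.get(dim, set()) for dim, v in codes.items()):
            coded[rec["id"]] = codes
    return coded

raw = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
codes = parse_llm_batch(raw)
print(codes["ytc_example"]["policy"])  # regulate
```

Validating against a fixed code set at parse time catches the common failure mode where the model invents an off-codebook label mid-batch.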