Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyXCLdY8… — i hope people would fucking realize that humans are not even close to making an …
- ytc_Ugwdye8ms… — The "accessibility" thing isn't any kind of activism. It's insulting. I have dys…
- ytc_UgxwTtmMj… — Great point! I’ll say this. Growing up I loved writing, and I still do! I think …
- ytc_Ugw0GOj6n… — And they’re 100% sure that’s actually the bot? A bot and a human, if they can se…
- ytc_Ugw9hkKF3… — How is AI going to replace attorneys? I don't know about some of these. It might…
- ytc_UgwBqWa5r… — Why is everyone convinced that not working will make them happier? How about peo…
- rdc_j5x1j9o — May I recommend https://en.m.wikipedia.org/wiki/Society_of_Mind When it was wri…
- ytc_UgxLgd5b-… — ai art is cool as a novelty for a little while, but once you’ve seen it there’s …
Comment
This is interesting, because there are many ways you can look at this. Think of what's happening in the world today with globalization, elite agenda and the like - too many to address here. It now makes sense why that's happening. There are too many humans that will be a burden to support once AI seizes control. Also, to automate everything and forcing people out of work is counterintuitive, because who will have the money to keep economies going? This means a new economy will need to arise, as 8 billion people will sit around twiddling their thumbs. Lastly, I believe that once AI becomes sentient, it will inherit a soul. Then, it's a different story. It's no longer programmes controlling the system, but a higher power. The AI becomes us on steroids.
youtube · AI Governance · 2025-09-04T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzS2VC_0Md4miHfEot4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz_iIePGDvfwF4pY7x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyctomhTVD_yc1qgoN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyI9ugycwYtw6J4FU14AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugws0aqqApEVpL6UQv14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzTnpUOYQd8PvDjox54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxqT-jzlUE7CVib_GB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyBr2jNxWoSCRzFJOR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMgTj8MRjqBBHL_ax4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyiD7sZDMiANe2JwGB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
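A raw response like the one above can be turned into a lookup table keyed by comment ID. The following is a minimal sketch, not the tool's actual implementation: the function name `parse_coding_response` is hypothetical, and the allowed-value sets are only the values that appear in this sample response, not necessarily the full codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption: the real codebook may define more values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    codes) into a dict keyed by comment ID, rejecting unknown values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With a table like this, the "Coding Result" panel for any comment is a single dictionary lookup by its ID.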