Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "the only ethical obligations there should be towards AI is ensuring that its fut…" (ytc_UgwBBi9VJ…)
- "Stuart Russel on DIARY OF A CEO , watch it. He wrote the text book on AI. He’s p…" (ytc_Ugw2zUzx3…)
- "I feel terrible that I can’t see the issues with a lot of the poisoned art. Like…" (ytc_UgyA-t3Bj…)
- "@A704T The Intel deal doesn't produce any tax receipts for the Federal governmen…" (ytr_UgwZ-MoMr…)
- "I don't even allow ai to expand my ideas, it just feels like I'm to lazy to eve…" (ytc_Ugw5Y7bP9…)
- "I have no idea who made this news but AI isn't so developed that It has a sense …" (ytc_UgxqmFfvi…)
- "Today it's Artificial Intelligence / Tomorrow it'll be Artificial Souls / We'd do w…" (ytc_UgzQOS0qO…)
- "People saying this is spy stuff are so damn annoying. Literally even if it was w…" (ytc_Ugz1ZuOUA…)
Comment
We grew up believing we would have robots helping us.
Looks like we are about to become the robots ourselves.
But, perhaps this is the next evolution of the human race. We are too fragile in our current form to survive even the smallest of leaps into our own galaxy.
Our next step of evolution is creating a perfect A.I. that will have learned all our mistakes and create robotics with sentient brains that can take everything we have learned so far and move out into our solar system.
Source: youtube
Category: AI Governance
Timestamp: 2025-09-08T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz0TKLVCwg3Fsq0gNh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZJwej7yPqwLH5TkR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyz8chi_YkbylcXMT14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyK6ItDUm1kwYMtV6R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzMvyRGU7Ie_26Vv-p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNs7PDjvt8tHFTq6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxwVpi0eC-lp1QkHJh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwPEnBVJcHFd_bCJoN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzA5rGyVwvlesVp5kR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwNal7G36mm7mGKmYN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
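Before a raw response like the one above is written back to the coding table, it is worth validating each record. A minimal sketch in Python, assuming the allowed values are exactly those observed in this sample (the real codebook likely defines more categories, and the function name `validate_response` is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "resignation", "fear", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"approval"}]')
print(len(validate_response(raw)))  # 1
```

Records that fail validation can then be queued for re-coding rather than silently contaminating the coded dataset.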