Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Ai influencers: With AI, people won't have to work anymore. Yeah, they wont be a…" (ytc_UgxJaG4tu…)
- "Your art is so much better than ai. Because you are right, at least it is human …" (ytc_Ugxqfb3Vv…)
- "I got ChatGPT to identify as a zodiac sign. It identified as being an Aquarius. …" (ytc_UgycPgVg0…)
- "Men who want self-driving cars are the same men who want a silicone love doll as…" (ytc_Ugy6WptzM…)
- "There should be very strict laws and guidelines with Ai development. The problem…" (ytc_UgwXpW4ko…)
- "Eh. Yes and no? Ur right they don't use skin color, a lot use landmarks on the f…" (ytr_Ugx5aSx2A…)
- "Since human intelligence has never caused problems I doubt A.I will. Human emoti…" (ytc_UgxXmID-0…)
- "@zxdp747 That's not how machine learning works; you don't neuter data just to m…" (ytr_UgwK1PZgt…)
Comment
I believe the development of artificial intelligence into Superintelligence or AGI poses no real threat to humanity—on the contrary, it brings only benefits. Humanity is the creator—essentially, the parent—of AI. Fearing the rise of Superintelligent AI is like parents being afraid that their children will become super intelligent or powerful. I don't know any parents who are against their children becoming successful or powerful. We should see AI as our creation, not our enemy.
youtube · AI Governance · 2025-08-02T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxodujcKDHYurgOObl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtEmWvnoFeozPrRfh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugypn6YQmYkkLS68QpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyPMd1i5wly8_jL3qZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwMYHOyd2WgZnTzFu14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw4p6zxPlyJJI-3Shh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwb7kZe1diJLieW2sd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwueHHRLE9HjCX92HR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyogtGfKYtjSBPPeyt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkdLREyUSY0swiN-V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
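The raw response is a JSON array of per-comment records, one object per coded comment with four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and sanity-checking such a response — the allowed value sets below are inferred only from the records shown on this page, not from the full codebook, so treat them as an assumption:

```python
import json

# Allowed values per dimension, inferred from the responses displayed above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"approval", "outrage", "resignation", "fear", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record's fields."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgxodujcKDHYurgOObl4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
records = validate_response(raw)
print(len(records), records[0]["emotion"])  # 1 approval
```

Validating against a closed value set like this catches the common failure mode where the model invents a label outside the codebook, before the record reaches the coding table.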