Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "How can you effectively regulate an Intelligence greater that human? It will de…" (ytc_UgxlK_Rx9…)
- "Can we just not do this whole AI thing? I'm pretty happy with things as they are…" (ytc_UgxkKwcwg…)
- "AI will conquer the movie world! I am planning great movies, don't need any huma…" (ytc_UgxyveAIv…)
- "IMO you can create AI generated whatever, but it will never replace what humans …" (ytc_Ugwg8TvI_…)
- "He may be right, but the problem is that US have a lot of enemies. If the US sta…" (ytc_Ugzpzatw4…)
- "I disagree with this shit. AI does not discriminate, they give it away to everyo…" (ytc_UgwVuPK5I…)
- "I am deeply, deeply pessimistic about our collective prospects. Ai will primaril…" (ytc_UgxNqX-Vd…)
- "To QA the work. That's going to be our value in th AI marketplace, verifying the…" (rdc_n84dm3p)
Comment
1. The cat is already out the bag, there's very little to be done about it.
2. Oppenheimer unleashed the Nuclear bomb to the world, and TODAY, Poolia is threatening the world with nuclear bombs as he has nothing to lose. And _indeed, he has nothing to lose_ in the sense that the country is already deep in $HITHOLE. If I gonna go down, I'll take the world with me. The SAME will happen in A.I. You're well off, fed, and have plenty. I got nothing, hungry and daily life is a struggle so let me just unleash the A.I virus cauze I'm already living like $hit. Basically, PREPARE for the worst and one up them is the ONLY WAY to prevent it.
youtube · AI Jobs · 2025-11-03T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw2uhM5-2VX8G31JCh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz1mWWyd-N3728YRAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzK_MNP0WuWFCZLWtx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyc62ITbZx9B3XCSCl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzTCHnnPgJVqYTmUEt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzgegr0p6K7Bw06UwF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyklytxOPE-RF8IZKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyOhgX44PytZ7_bTyN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3Xeg0y9EorhKeouB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzDFAgbHNu4ZuLPGY14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
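Raw responses like the one above can be sanity-checked before ingestion. The sketch below is a minimal validator, assuming the four dimensions shown in the coding-result table; the allowed values are inferred only from the records visible here (the real codebook may define more categories), and the function name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the coded records shown above.
# Assumption: the actual codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_coded(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records:
    each must carry an id and a recognized value for every dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
print(len(validate_coded(raw)))  # 1
```

Records with an out-of-schema value (e.g. a hallucinated category) are dropped rather than coerced, so downstream counts only reflect codes the schema actually defines.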