Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "@masoudmaani , that is the perspective of the Law. Fair use doesn't violate copy…" (ytr_UgzAnih2o…)
- "@BrendanDell Thank you for replying! I respect your opinion and really enjoy the…" (ytr_UgzIONGRt…)
- "lol that looks like Paradox AI chat for career recruitment. It is so easy to bea…" (rdc_n0lxqi7)
- "Man, I can’t wait for Ai controlled robots to take over sports. That’ll really p…" (ytc_Ugx4gHjWQ…)
- "You're on Reddit, you're seeing AI generated content. On most popular subreddits…" (rdc_le5ceyd)
- "AI can be legislated/ made illegal. Art itself cannot. The better an artist you …" (ytr_UgyQTPL4K…)
- "AI is nothing like in the movies, it just makes guesses. Even with neural networ…" (ytr_UgwCOxeU8…)
- "You literally said what's my opinion on AI art, i have the exact same response…" (ytc_UgxlxUFF3…)
Comment (youtube · AI Governance · 2026-03-18T08:2…):

> The real story here isn't just that researchers are leaving - it's WHERE they're going. Many are founding safety-focused startups or joining organizations like Anthropic specifically because they believe the current trajectory at major labs prioritizes capabilities over alignment. When the people who understand the technology best vote with their feet, that's a signal we should all pay attention to. The gap between what AI can do and what we can verify it's doing is growing faster than any safety framework can keep up with.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwsVuCA0Ep-9leZtOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwfl8mowoza0wyRupR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwqAmQrBVJlomfWO_d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzmaxfHqz62hKrYR_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUa7tdq_WZL_-yeGd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx_PnjcgrUpTOe-7Pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykOz1z3pumU_oYN-B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzpUeeLbpPtaSXysUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxDGPkvpD6Oj22qCK14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3X7VaxOReI94Mw5Z4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
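The coding table for a single comment is just one entry pulled out of a raw response like the one above. A minimal sketch of that per-ID lookup, assuming the raw model output always parses as a JSON array of objects with an `id` field (`lookup_coding` and the inlined two-entry excerpt are illustrative, not the tool's actual code):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = """
[
 {"id":"ytc_UgwqAmQrBVJlomfWO_d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgzpUeeLbpPtaSXysUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw response and return the coding dict for one comment ID (or None)."""
    codings = json.loads(raw)
    by_id = {entry["id"]: entry for entry in codings}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgwqAmQrBVJlomfWO_d4AaABAg")
print(coding["policy"])   # → regulate
print(coding["emotion"])  # → approval
```

In practice the parse step is also where malformed model output gets caught: `json.loads` raises `json.JSONDecodeError` if the response is not valid JSON, which is a natural place to flag a batch for re-coding.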