Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Exactly why I told my ai all this before hand I wanted to learn things…" (ytc_Ugwn7SXDu…)
- "Ai art is boring ❌ u can't digest the fact that u can't draw like AI ✅…" (ytc_UgwsGNMl_…)
- "Problem is we dont have super intelligence in our ai , we dont even have intelli…" (ytc_Ugz4RiBRM…)
- "That is what I thought, I thought they used some sort of new AI technology. Also…" (ytr_UgwxK3hsm…)
- "Chill - Sam Altman's full of crap. The only thing AI investment is going to wipe…" (ytc_UgzFfoebL…)
- "The initial example used wasn't convincing, it's just people being unfamiliar wi…" (ytc_UgwtznXL9…)
- "I don't understand how after using AI to supplement my coding for 2 years I have…" (ytc_UgwLNq_9Z…)
- "I just learned something new, Which is that having stuff like this online can ma…" (ytc_UgyVQwcxo…)
Comment
Anyone ever consider that AI is the modern race to build a nuke from a nat sec perspective. Do we really think that we have a single clue just how advanced our current AI really is? If its going to create new things, improve current tech and solve complex problems by early next year according to several experts, then should we not assume our government is already at least at that point already? And were just upgrading our military first. And imagine is solves our energy problem and burning coal and oil seems like stone age tech. Nuclear power is looked at like rubbing 2 sticks together... now how exactly do we keep up with that level of rapid change? All the massive energy companies rendered useless overnight? How would we even begin to roll something like this out to the public? Not to mention our dollar is backed by oil 😅
Source: youtube · 2025-08-28T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzpAWYk8S7ju34kz6d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw1PxSrW4tlx9wiWQl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxfmYYaFpHY6GQQQsJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyCjLbxKnFNFcl8SeZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzBJtLTvCf8qW-6XzZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgybG_nlhPsR7GcbI7h4AaABAg","responsibility":"elite","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyRada-4XPc6s2ARKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxuk3jrwsb8wKqYx1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwFPJ1CA3xCKOO0nCR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwWG7B2JMn8nBGWgk94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
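A raw response like the one above can be turned back into per-comment codes with a small parsing step. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed category sets are inferred only from the values visible in this section (the real codebook may define more), and the function name and skip-on-invalid behavior are assumptions.

```python
import json

# Allowed values per coding dimension, inferred from the examples above.
# The actual codebook may include additional categories.
DIMENSIONS = {
    "responsibility": {"government", "company", "developer", "distributed", "elite", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval"},
}

def parse_raw_response(text: str) -> dict:
    """Parse a raw LLM JSON array into {comment_id: codes}, skipping invalid rows."""
    coded = {}
    for row in json.loads(text):
        codes = {d: row.get(d) for d in DIMENSIONS}
        # Keep the row only if every dimension has a recognized value.
        if all(codes[d] in allowed for d, allowed in DIMENSIONS.items()):
            coded[row["id"]] = codes
    return coded

raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_raw_response(raw))
```

Validating against a fixed value set is what lets a dashboard like this flag rows where the model drifted from the codebook instead of silently storing them.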