Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "Bro you can't hurt a robot, you punch it and the damage still your's 😆💀…" (ytc_Ugy_FKxDm…)
- "I swear to you that Minecraft is going to sit here and turn into some kind of AI…" (ytc_UgzawmWhj…)
- "Automated truck driving and arable farming seems not too far away. Food processi…" (ytr_UgwYA-dZa…)
- "also, remember. just because we are intelligent, does not mean that another "int…" (ytc_UgxWWW_Np…)
- "It's like any, and every, thing else. It can be used for good, but mankind is so…" (ytc_UgzqIlgMl…)
- "@Mr.Drew70 no, the only part done with human emotions, it's the prompting
  Mean…" (ytr_Ugy8Fx_B0…)
- "I don't get it
  If she is okay with people using her style, what is the issue wit…" (ytr_Ugz7ZHyk1…)
- "To me art is just inherently less interesting when I know it wasn't made by a hu…" (ytc_UgyiKwVVF…)
Comment
But why should this be inevitable? If the critical mass of people worldwide would refuse to use this technology and would boycott the companies working on superintelligence, the industry would not invest a single Dollar into the further development of AI. No further investments from shareholders /the industry would make it impossible for states to create a superintelligence. Such a common approach would also avoid the upcoming financial crisis once the AI bubble collapses (its just a matter of time). Why are humans so stupid and rather willing to put the survival of our entire species at risk for personal wealth? If this isn't the core of intellectual stupidity, I don't know what is.
Source: youtube | Category: AI Governance | Posted: 2026-04-10T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzr5AISxs1E6AYuwjZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJSGmP4YZXEEpw_zx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzXOeF7l4qD9JIFC0d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZheIhL5ZJ9I1-ljJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVmZy7c9ga4ZLVIn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSIFmqUmiuJKsrbhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyrx5fJ0inUihpRtx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwkh8sp0lU9jR6Y4zx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz7YUl3BDUWIwEsERt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_WJCgeaUoyZlOBJR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
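The raw response is a JSON array with one object per coded comment, keyed by the comment ID and carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and one comment's coding looked up (the `lookup_coding` helper and the shortened two-row `RAW_RESPONSE` are illustrative assumptions, not part of the actual pipeline):

```python
import json

# Abbreviated example of a raw model response (two rows from the array above).
RAW_RESPONSE = """[
  {"id": "ytc_UgzJSGmP4YZXEEpw_zx4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzXOeF7l4qD9JIFC0d4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]"""

# The four coding dimensions every row is expected to contain.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the raw model output and return the coding row for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]  # raises KeyError if the ID was not coded
    # Sanity check: every expected dimension must be present in the row.
    missing = [d for d in DIMENSIONS if d not in row]
    if missing:
        raise ValueError(f"row {comment_id} is missing dimensions: {missing}")
    return row

coding = lookup_coding(RAW_RESPONSE, "ytc_UgzJSGmP4YZXEEpw_zx4AaABAg")
print(coding["policy"])  # -> ban
```

Validating that all four dimensions are present before displaying a row guards against truncated or malformed model output, which is a common failure mode when an LLM emits long JSON arrays.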