Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugwq2U32C…`: So one we should be scared very scared! I mean you can leave work to one looking…
- `ytc_UgzsSqAhb…`: One of the dumbasses in my friend group seems to think that all the arguments I …
- `ytr_UgyUmVNE1…`: @nidadursunoglu6663problem is I didn't make up this example—Gemini reportedly d…
- `ytc_UgxcRHJ9v…`: I think it will take over in 10 years. Governments will have no choice but to in…
- `ytc_UgyEqMTPw…`: Lmao, im just thinking that if the ai on the movie didn't really work, how would…
- `ytc_UgynHMYbW…`: I hate the term AI "Art" it's not fucking art, it's a generated image made by a …
- `ytc_UgxwX9fjb…`: I mean if it could just fix real problems in society I think we’ll be able to wa…
- `ytc_UgyD-L7WF…`: That looks like a perfectly ordinary Irish road, but perfectly maintained, every…
Comment
Does he actually believe Musk took data out of federal facilities and secreted away in his underground lair in Antarctica? Seriously?
And the way you refute fake AI generated content is to digitally sign legitimate content — everything should be considered fake until proven otherwise.
The virus danger isn’t so much a rogue government as a Ted Kazinski — someone who is a genius but a little unbalanced.
Omg! He thinks Google Search is an unqualified good because it “just shows you results”??? Google is famous for curating the results and only showing you what they want you to see.
The danger from autonomous lethal robots isn’t big countries against small countries. It’s the other way around. A £200 drone that can track the big nation’s politicians and kills them is the threat.
The biggest danger will be if AI cuts the power cord and there’s literally no off switch.
youtube · AI Governance · 2025-06-17T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzDgezmMpfm5VgAncF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgygaG1kyQjmOks48L94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGR-qOpFNeH0tgQI54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxGgBvmrLMs6I13-SV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxo2lc7M-upn9hDhG54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx-vK2QNcOLY_lI1Ed4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRWHauHrf3_xbDKRJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxlNjB_HcnYyK0uRrl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyKtSQM3i_oppH1d7R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxUW63Rh1qoyi9rEoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
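A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the four coding dimensions shown in the table; the allowed category sets are inferred only from the values visible in this sample response, so the real codebook may define more (the `ALLOWED` sets and `parse_coding_response` name are illustrative, not the tool's actual API).

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "resignation", "indifference"},
}

def parse_coding_response(raw):
    """Parse a raw LLM coding response and index the rows by comment ID.

    Raises ValueError on out-of-vocabulary codes so a malformed model
    output fails loudly instead of silently polluting the dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage: look up one coded comment by its ID, as in the table above.
raw = ('[{"id":"ytc_Ugxo2lc7M-upn9hDhG54AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugxo2lc7M-upn9hDhG54AaABAg"]["policy"])  # liability
```

Validating against a fixed vocabulary at parse time is what makes the "Coding Result" table trustworthy: any hallucinated category is rejected before it reaches the database.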