Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up directly by comment ID, or browsed via the random samples below.
Random samples (click to inspect):

- "AI is dark and has nothing to lose. Also clinically crazy people shouldn't have …" (ytc_UgyvBOoX2…)
- "Gawd damn u knew better not fight an Robot just like swimming with a shark 😳…" (ytc_Ugwk4WtTl…)
- "@ someone said that AI is exactly like a person’s brush strokes but it doesn’t e…" (ytr_UgyqMKDD4…)
- "All that is required for an autonomous car to become a killer robot is a flagran…" (rdc_cpnboxm)
- "The financial firms are having trouble figuring out how to use it. Copilot is tr…" (ytc_UgxvmEYRc…)
- "I guess it would be a little harder to design an ai to predict how the drug woul…" (rdc_hl0dp6k)
- "One of the main ideas presented is universal capital allocation. So basically as…" (ytr_UgzLxvKED…)
- "This won’t happen, because corporations are not outside of the social contract. …" (ytc_UgzWmXwcl…)
Comment (youtube, 2026-01-28T10:4…, ♥ 1):

> It's too late, the genie is out of the lamp and there is no getting it back in. There is no way the US will risk its lead in the AI race because of some butt hurt authors as harsh as that sounds.
>
> And not to mention the AI model weights where their IP has been tokenised an encoded is eventually shared by most AI developers, so they can get access to it.
>
> Then they can spend billions on their own AI data centre to run it, because that part they are not entitled to.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzIuclbhl3It10TPSp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwsLyGNqYsFXLQrbKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzGRBF-yscp_Fxa9ul4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzeRY22AHABsEqtGtR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw5FlPxiJtZrFLGG-54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2JCPE_t2eb19TNHR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzaBmaY5Rz-gcGpnCh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyDHk0C85nrhpmlhAx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx2m7GX_1KY0GNmgzJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz0G35DSG0CYOjYGVB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
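The lookup-by-ID view above can be sketched in a few lines: parse the raw batch response, index each coding by its comment ID, and fetch the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) for a given comment. This is a minimal Python sketch under assumptions, not the actual dashboard code; the `index_codings` helper name is hypothetical, and the two-entry sample here is abbreviated from the full response shown above.

```python
import json

# Abbreviated raw response in the batch format shown above
# (one JSON object per coded comment).
raw_response = '''
[
  {"id": "ytc_UgzaBmaY5Rz-gcGpnCh4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwsLyGNqYsFXLQrbKl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

EXPECTED_DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID.

    Skips entries missing the ID or any expected dimension, so one
    malformed object does not invalidate the whole batch.
    """
    codings = {}
    for entry in json.loads(raw):
        if "id" not in entry or not EXPECTED_DIMENSIONS <= entry.keys():
            continue  # malformed entry: drop it rather than fail the batch
        codings[entry["id"]] = {k: entry[k] for k in EXPECTED_DIMENSIONS}
    return codings

codings = index_codings(raw_response)
print(codings["ytc_UgzaBmaY5Rz-gcGpnCh4AaABAg"]["emotion"])  # resignation
```

Indexing by ID rather than list position matters here because the model returns codings as a batch; matching on the echoed `id` field is what lets each coding be joined back to its source comment even if the model reorders or drops entries.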