Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Im not entirely sure that destroyed is the right word. He was very open about hi…" (ytc_UgzYc2ek9…)
- "I like this artists style so much. I am surprised you can make money with ai art…" (ytc_Ugzo8kj9x…)
- "Its a robot they are fimling a movie at different events and they are making a m…" (ytc_Ugye_Gb_G…)
- "You're just not going to have the level of 99% artificial intelligence is not go…" (ytc_Ugw2TTC0b…)
- "AI takeover happens only because humans let them take over their jobs. AI is eve…" (ytc_UgwLsCulB…)
- "Yes the computer hardware economy is the first sign that the entire economy is p…" (ytr_UgwLa88Zm…)
- "He's full of delusions it was 2013 when AI came online,human beings are incompl…" (ytc_UgykU-cy7…)
- "I am not scared of AI itself but of people who were lazy already becoming too du…" (ytc_UgwazBdYb…)
Comment
So here's the deal:
When the government buys a gun, they can use the gun in any way they want, legal and moral or not. But if the government hires a mercenary group and then want that group to do something borderline illegal or immoral, doesn't the mercenary group have the RIGHT to push back and say no? This is the exact same scenario.
Anthropic would be maintaining and operating Claude while the government makes them be complicit in borderline illegal or immoral actions. How can they be forced to accept those terms?
Source: youtube
Posted: 2026-03-01T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzjB7JPq_bO_2IkD2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyXxJpKskzA8W98bzh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxRb3uso-Fo2i7TecB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwOKrAlWshty29ArPN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyi9oEkd_xnNY0pkIR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8DJrLXSHZAmFI1W54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxtwyH1lWShYqsygU54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxq6sZsrGrn3xt9z2Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy0AR-oZ3kh9KxmtQx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw3FrqwBpwp3ZSW9dB4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
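A raw response like the one above is only usable downstream if every row carries values the codebook actually defines. The minimal sketch below parses such a response and filters out rows with out-of-schema values; the allowed-value sets are inferred solely from the sample output shown here, so the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above -- an assumption, not the project's authoritative codebook.
SCHEMA = {
    "responsibility": {"company", "developer", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows whose value
    for every dimension is in the allowed set."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]
```

A row with a hallucinated label (say, `"responsibility": "alien"`) is silently dropped; in practice you would likely log such rows and re-prompt rather than discard them.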