Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I want self driving cars. Good, reliable ones that can actually save lives - thi…
ytc_UgxQKJJoS…
AI won’t replace human doctors, the reason being super simple: No tech company i…
ytc_UgztQb3hP…
'Nobel Prize winner Geoffrey Hinton, the physicist known as 'THE GODFATHER of AI…
ytc_Ugx91Jctr…
Interesting. My experience is the opposite, with people using these words less. …
ytc_Ugx1rKh8P…
The crowd backing AI is absolutely repugnant. A bunch of sore lazy losers only w…
ytc_UgyDJZJAz…
You know... during the pandemic lots of artist inspired me to draw but I admit i…
ytc_UgwD0he1G…
All it takes is one little bug to change it's core fail-safes. Of course the pro…
ytr_UgxZZS0fA…
anyone that lives in a city where AI cams are being installed, it is your DUTY t…
ytc_Ugw6TEoVe…
Comment
Remember who Anthropic was working with, Palantir and Musk. They want to create an autonomous golden dome of weaponized satellites to be able to destroy any "threats" to the US. We've all seen how Musk manipulates Grok to fit his agenda. Do we really want to insane men like Elon Musk and Alex Karp at the control of a creation that can target anyone or thing of their choosing? Call me crazy but if that isn't some Skynet crap. I give Anthropic some props for standing up against Herr Trump but at the same time they should not been involved from the start. We are being destroyed from within far more than Russia or China could ever hope for.
youtube
2026-02-28T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwmD-UGGNKrLyp2z0N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxxpgos6KwBV9iPhpp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyM3OqODTngONf0vPB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxIYFOkmRi0El1Um-54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy1jxVTGa2ViJdCijh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCt6MPCiykT808vox4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwFt9wCq_E0SqnqvlF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyco7sGwjJdh0rCio14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwhmPn3QC_YrH5YuJl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgycByoLx_XyJtCniux4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
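The "look up by comment ID" step above can be sketched as a small parser: the raw model output is a JSON array of coding objects, each carrying an `id` plus the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`), so indexing it into a dict makes per-comment lookup trivial. This is a minimal sketch, assuming the response is always a well-formed JSON array; the function name `index_by_comment_id` is hypothetical, not part of the original pipeline.

```python
import json

# A raw LLM coding response of the shape shown above (one full row,
# taken verbatim from the sample output).
raw_response = """
[
  {"id": "ytc_UgxIYFOkmRi0El1Um-54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID.

    Raises ValueError if the payload is not a JSON array of objects,
    since a malformed model response should fail loudly, not silently.
    """
    rows = json.loads(raw)
    if not isinstance(rows, list) or not all(isinstance(r, dict) for r in rows):
        raise ValueError("expected a JSON array of coding objects")
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgxIYFOkmRi0El1Um-54AaABAg"]["policy"])  # regulate
```

Keying on the comment ID is what lets the inspector join a coded row back to the original comment text and its platform timestamp, as in the "Coding Result" panel above.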