Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We need to make the practice of coning these waymo cars more common. Basically, …
ytc_UgwQ0d3XD…
Keep in mind. Ai is now globally cat out of the bag. NO US laws can slow a forei…
ytc_Ugxz2XdZe…
AI is a tool. Art is Art. I have grown to dislike the art community because of…
ytc_Ugw0Hpz8k…
I can absolutely see AI being good for a tool, but not necessarily an entire med…
ytc_UgwoMt4TI…
I love it when you finally get someone from customer service in any field, and t…
ytc_UgxaedSyy…
@laurentiuvladutmanea , humans adapt to new situations via algorithmic logic a…
ytr_Ugy4aGQXz…
I theorize that as long as there is more than 1 LLM "AI," there will not be an L…
ytc_UgyBGAG-3…
I want AI off my phone. It's taking over. I wish it was a choice.…
rdc_lp6ywyx
Comment
The reason he does not care is that for the 20-30 years AI hell is unleashed on planet Earth (before the "solution" part of the problem-reaction-solution plan is "offered") he and his billionaire buddies will be sitting in their luxurious underground mansion bunkers watching the news of billions dying on their home cinema screens eating popcorn, not being affected by any of it in the least, only to emerge 30 years later into a world where they can continue their lifestyle of abundance uninterrupted as their self-proclaimed prerogative.
youtube
AI Responsibility
2025-09-08T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzeWyk7i5gCeFXHMrp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzbvFHbx-GrR_1QGPl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgztL9C10JiREi9f_pB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgykFQZGfx4xWzcCh9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyUnr-tYU_kc2STLdR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwtxlNmwXpPJ9OKWlJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxjwZv-JfIWNGevgVV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxNdIB0julrtYDOEhZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzZ6RCj3-hWaW2Ju1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygiW94La26H3w31bJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
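A raw response like the array above can be checked before the per-comment codes are stored. The sketch below is a minimal validator in Python; the allowed values are inferred only from the samples shown on this page (the full codebook may define more), and the function name `validate_batch` is illustrative, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the real codebook may contain additional values not seen here.
SCHEMA = {
    "responsibility": {"developer", "company", "government",
                       "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that fit the schema."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a string comment ID plus one allowed
        # value for each coding dimension.
        if not isinstance(rec.get("id"), str):
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Records that fail validation can then be routed back for re-coding instead of silently polluting the results table.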