Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The only thing that comforts me is that when these idiots loose control over Ai, which will happen, and this Ai would somehow want to hurt humanity it will go after the people that designed and developed it first. Because they will be the only ones that might have a chance to stop it. They will be the biggest threat for Ai. And Ai will learn this over night. And that is a very good thing. Way too many people do stuff that has disastrous consequences for humanity while getting away with it. If Ai will become a disaster for humanity the people responsible are the ones that will burn first. They really are idiots on a self destruction path. This will be the biggest confirmation of the fact that being educated is not the same as being smart. There are people flipping burgers at a fast food chains that have more wisdom than the average developer.
Source: youtube · 2025-10-17T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwU1XoV6kmGGi_C75x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgmGR8JYpG5Q5pWKp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZX9xMBjcOftOF_s14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxsGUgJCDAZmT5JmSZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugywny6DPKiFZPnkKWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMh1Yh1DLpyo9k3Qd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxwDX2AavZoucQADYB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8tT1cDkcuQZgYFnV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzW9-ochZ6p2tZyKdt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz96iFKm65UCCAePDh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
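Since the model codes comments in batches, finding one comment's coding means scanning the returned array for its `id`. The sketch below shows one way to do that, assuming the raw response is a JSON array shaped like the one above (the two rows embedded here are copied from that payload; the lookup-by-dict approach is an illustration, not the tool's actual implementation).

```python
import json

# Two rows copied from the "Raw LLM Response" array above, used as sample input.
raw_response = """[
 {"id":"ytc_UgwU1XoV6kmGGi_C75x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzZX9xMBjcOftOF_s14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]"""

# Index the batch by comment ID so a single coding can be pulled out directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwU1XoV6kmGGi_C75x4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

The printed values match the "Coding Result" table for this comment, which is the point of the page: the table is derived from exactly this raw output.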