Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_Ugzd-mWw7…: "I can not beleive that people like yourself can make such a statement. Have you …"
- ytr_UgxhKFpcH…: "@cecilia1300 you just confused me (cause I always head the 'predictive' explanat…"
- ytc_UgwBI5_FO…: "This isn't even the AIs fault this time. It is right, in *general* they are not …"
- ytr_UgwWUQXJd…: "I'm only mildly annoyed that you're trying to compare anyone who doesn't hate AI…"
- ytc_Ugx02dysi…: "Totally proves the hypothesis that in reality, Waymo hired gamers to drive their…"
- ytc_UgxmxAvAu…: "I guess it did its purpose and helped inspire artists to create pieces based on …"
- ytc_Ugzq31Oro…: "Everyone who loses their job to AI will be upset with it. There are going to be …"
- ytc_Ugw88bmMw…: "Ok but have you ever been on character AI? I had a heartfelt conversation with w…"
Comment
10:15 "We are not going to stop the development of AI because it's too good for too many things"
We are not going to stop it or take it more slowely and be more careful witht AI because of profit and war. The problem is not necessarily in developing AI, but in doing it for the wrong reasons, which is typical for the current predominant state of consciousness that humanity is in (and which can be overcome if we survive this current state and evolve beyond it). It's just like the atomic bomb: "If we don't develop it first, someone else will beat us to the punch, and then they will have power over us."
youtube · AI Governance · 2025-06-16T07:4… · ♥ 10
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxY9dHzOY9Slm90bdZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyIc5njT9ik7DPksWB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOa2GLhFvJZyJXhTZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwHzq46bBMdVx49gcN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQCxRwgn9WUq-89Nt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzVwa8ZYBtb-f3b-tp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx1WN2UtF0I8PAFlzJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzU6fa4kMHp9FlGJXB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwcr42N49afRykzZA94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwsPJ7q00s9BqHpQcJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```