Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Even if ai is faster real art actually takes time and effort showing how much yo…" (ytc_UgxrAGmob…)
- "CEO is like...at the end...they won't need me either. If I can ask AI to make n…" (ytc_Ugzb3x4qH…)
- "Your mistake - a common one - is treating ChatGPT like a search engine. It’s not…" (ytr_Ugw_5mgMt…)
- "Is this the beginning of the end? AI doesn't fit into some agenda but can't be e…" (ytc_UgyF6KWcA…)
- "This AI revolution is going to hit large cities, the hardest specifically those …" (ytc_UgzxWNcLs…)
- "I HATE AI FART, BOOOOOO seeing old paintings from the masters ages ago, makes …" (ytc_UgxMdxWkg…)
- "I think if the ai was trained in a fake court it could be cool to use to settle …" (ytc_UgxpZZaWg…)
- "If Ai Robotics attack and destroy those programming the machines that is the sce…" (ytc_UgzBV6Biy…)
Comment
The... and I mean THE absolute most critically important point in this video is made at 53:19
IMO the 36 seconds that follow encapsulate and 100% justifies the entire argument for forced worldwide stoppage of the development of General AI, and immediate implementation of regulations as strict as (if not more strict) than the current Nuclear and chemical development regulations. The fast track development of General AI with such "consequences be damed" mentality poses a greater, more immediate, and more accessible means for human extinction than any nuclear or biological weapon in existence... so why doesn't anyone need our consent to to keep going pedal to the metal on it??? Or are shareholder investors and megalomaniacal corporate executives the only folks who get a say in whether or not ALL of humanity can be subjected to such endangerment?
youtube · AI Governance · 2025-10-08T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwR9DLJmdBe_TExYmN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyEJHTB9bMCaT5BLAF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3PcZTOnEl0f3_fjB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyh-NXRzVLGDBImhip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxS5_bvqxialGyXUOl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxNbNbq4Gyw6Bvm0gF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyeIcOXnTcj_XBySPx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwPJa33Ji_gmGubl_54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzXztiqo5W1nRuRpD54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOkUzRtABJjI28nV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
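
A batch response in this shape can be checked before it is stored: each record must carry an `id` plus the four coded dimensions, and each dimension must hold a value from the codebook. The sketch below does that with the standard `json` module; the allowed-value sets are inferred only from the values visible on this page (the real codebook may define more categories), and the `raw` string is a made-up two-record example, not actual model output.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw batch response, keeping only well-formed, in-codebook records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed record
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the second uses "government", which is
# outside the inferred codebook, so only the first record survives.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"government","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(parse_batch(raw)))
```

Dropping (or flagging) off-codebook records at parse time keeps coding errors from silently entering the dataset when the model drifts from the prompt's label set.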