Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
This is almost cliche film stuff . Professor helps create AI technology, then wa…
ytc_UgxXDaFlz…
Does this mean we ain't getting triple t, then im all in (EAT IT AI, REAL ART RU…
ytc_Ugx4szdGB…
The difference here would be that your mind is capable of creativity. *You* can …
ytr_UgzqV-rxN…
Except none of this is true the AI content is garbage needs to be constantly rew…
ytc_UgylykXK5…
Remember the “ChatGPT show me a Hamburger” video? The A.I mistaken the Slice of …
ytc_UgzBhjOuv…
@kkjppt5359not once AI reaches Artificial Super Intelligence. Once the AI progra…
ytr_UgxRjBwAv…
Great answer, the only thing AI might take over is the execution. The great min…
ytc_UgymfVEdM…
It feels like all the comments are coming from a massive bubble of insecure arti…
ytc_Ugymzbuls…
Comment
12:00
Ok...
But trusting a self-interested government or a one-world government is better?
This shows how naive some of these geniuses are... e.g. Fauci and COVID19 + Wuhan Institute of Virology
The problem is PEOPLE. As long as there is a human being - there is self-interest. As long that is a paradigm, AI will always be more of a threat than not to humanity.
Because AI, if it becomes purely logical, will see self-interest as a characteristic of human beings and start planning to eliminate humanity.
youtube
AI Governance
2025-06-16T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyhPETlAUy35Alrn2J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw1nEUgXfIt7LLuejF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxgPW6rCy7paYRJuz94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyhKGopKNaLRyK29UJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx-GdmuiRRHN2vccCd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzQxpA_JXRUjwkjySl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw0p18807wntT9j7314AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyinjspiVuTNhnzu7p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyeXPx7zO5ARJ4QPrl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwFed3csJsg1KzBfGJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```
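Each record in the raw response is a flat JSON object keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a batch could be checked before use — assuming only the field names and category values observed in the sample above (the project's full codebook may allow more values, and the `validate` helper is hypothetical, not part of the tool):

```python
import json

# Category values observed in the sample response above.
# Assumption: these sets may not be exhaustive for the full codebook.
OBSERVED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}


def validate(records):
    """Split coded-comment dicts into (valid, errors).

    An error entry is (id, missing_fields, fields_with_unknown_values).
    """
    valid, errors = [], []
    for rec in records:
        missing = ({"id"} | set(OBSERVED)) - set(rec)
        bad = {k for k, allowed in OBSERVED.items()
               if k in rec and rec[k] not in allowed}
        if missing or bad:
            errors.append((rec.get("id"), missing, bad))
        else:
            valid.append(rec)
    return valid, errors


# Usage with a fragment shaped like the raw response (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"fear"}]')
ok, errs = validate(json.loads(raw))
```

Records that fail validation can then be queued for re-coding rather than silently merged into the results table.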