Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "While Senator Benrnie Sanders and the rest worry about AI wiping out the working…" (ytc_Ugx9pLv_0…)
- "These are super intelligent human beings talking about AI personhood. These are …" (ytc_UgyICTlyH…)
- "@pigeon_official your making the assumption that nightshade won’t get better as …" (ytr_UgzGjjzlY…)
- "Has he done a video about how these other words used AI words come from Nigerian…" (ytc_UgySrJiET…)
- "I work in a rather technical field. There’s been zero times I’ve actually though…" (ytc_UgyWeN16D…)
- "These faults were recognized and the states still went forward with the facial r…" (ytc_Ugw5MATxX…)
- "AI is kinda like what nuclear age was probably to a lot of people around the wor…" (ytc_UgzL9guet…)
- "I don't think AI will replace as many jobs as said in the video because AI is mo…" (ytc_UgyKRSslN…)
Comment

> I used to think the P(doom) was 1, but now I think the opposite, and for good reason. If we encourage the same symbiotic relationship we opt for (competition) then yes, the outcome is obvious. However there is only one survival strategy that allows for infinite growth of all parties, so given the ability to change the context, there is no real choice to be made. Mutualism requires trust and respect, whereas an "off switch" would represent domination and control. Build an off switch, and continue to see AI as a tool, and there will be no trust or respect humanity can earn in the eye of the machine.

youtube · AI Governance · 2025-10-16T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzEJzA-yLh7tM5Zzel4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwUPYIjlbd2SatLl0l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgznwyF0uD0FMCzpgV94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzO0M_eOjFUuVlOM6B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxeE75UGn0qGCdlsNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgycK8RWx_CdBp_vQfp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy7Ku8JsyRzLUaquM14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzwiNBLz3YqPxC0GLh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzu6yvwbGIgUPEIJmV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCwCgWI6KgCjb1lHJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
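The lookup-by-comment-ID step shown above can be sketched as a small parser over a raw batch response. This is a minimal sketch, assuming the model output is a JSON array with the fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); `lookup_codes` and `raw_response` are hypothetical names, not part of the tool.

```python
import json

# A raw batch response in the format shown above: one JSON object per coded
# comment. Only one row is reproduced here, taken from the example output.
raw_response = """[
  {"id": "ytc_UgzO0M_eOjFUuVlOM6B4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "approval"}
]"""

def lookup_codes(raw: str, comment_id: str):
    """Parse a raw batch response and return the codes for one comment ID.

    Returns None if the response is not valid JSON or the ID is absent,
    so callers can flag the comment for manual inspection.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None  # model output was not valid JSON
    return next((row for row in rows if row.get("id") == comment_id), None)

codes = lookup_codes(raw_response, "ytc_UgzO0M_eOjFUuVlOM6B4AaABAg")
print(codes["emotion"])  # approval
```

Returning `None` on a parse failure (rather than raising) matches how an inspector page like this one would surface an uncodeable comment instead of crashing on one bad model response.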