Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "No one wants driverless cars except for the billionaires who profit from it by n…" (ytc_UgwqLfzih…)
- "Isn't putting content like this out there giving AI more to train on so it can g…" (ytc_Ugxr_5HNs…)
- "As an artist, it feels like the AI models are cutting open your creations and se…" (ytc_Ugz6zk-Xq…)
- "Disney's rights. Remember, anti-ai crowd doesn't give a damn about artists, they…" (ytr_UgwccQxhg…)
- "Although I have an enduring interest in Artificial Intellengence, I am anything …" (ytc_UgxOkXrzY…)
- "9:20 This guy’s statement shows how simpleminded humans generally are. He says, …" (ytc_Ugx7wiRJq…)
- "So we need a robot for everything. What an idiot he even said he had thought abo…" (ytc_UgyXMzmaT…)
- "Most of these companies are building $hit on top of $hit foundations. But don't…" (ytc_Ugw9-qkWy…)
Comment
Sorry to keep commenting but you keep saying things that I want to comment on. Yes the United States could decide to limit the use of AI for things like helping to cure diseases and computing etc. but the rest of the world is not going to be ethical especially China and so now we are in the situation just like with the atomic bomb where everyone is forced to build one because you can't let the other guy have one and you don't. And China and other countries will absolutely have no limits develop AI that can be used in the most evil of ways against us
youtube · AI Governance · 2025-09-06T07:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxwVtCUnv6kMEiHR6t4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzhtz7GyvOjverYLJV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwFCkL0cNThX74n9c54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgzN3mkegMR-3MoJrc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzvQtMrVcToiAo-QpZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzzlh3UJCtqjob692V4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwJjVLD8nLadFpO_IV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjihCxjU1pzPTLxqR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwa0cQicPvFxYz2s-Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxKJRbADiJCdEpjPiV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
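Since the raw response is a JSON array keyed by comment ID, looking up the coding for a specific comment is a parse-and-index operation. Below is a minimal sketch, assuming the model returns a well-formed JSON array like the one above; the variable names (`raw_response`, `by_id`) are illustrative, not part of the tool.

```python
import json

# Raw LLM batch response: a JSON array with one object per coded comment.
# Dimension keys match the Coding Result table (responsibility, reasoning,
# policy, emotion). Shortened here to one entry for illustration.
raw_response = """
[
  {"id": "ytc_UgxKJRbADiJCdEpjPiV4AaABAg",
   "responsibility": "government",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]
"""

codings = json.loads(raw_response)

# Index the array by comment ID so any coded comment can be fetched directly.
by_id = {row["id"]: row for row in codings}

coding = by_id["ytc_UgxKJRbADiJCdEpjPiV4AaABAg"]
print(coding["policy"], coding["emotion"])  # prints "regulate fear"
```

In practice the parse step would also need to tolerate malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`), since nothing guarantees the LLM emits valid JSON every time.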