Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I just went to Bing, turned on ChatGPT 4 and asked if it wanted to be human and …
ytc_UgyKWd3I9…
This is infact such an insult to actual disabled artits who worked hard to overc…
ytc_UgyL43HcJ…
Thank you. I mentioned energy use years ago to the Fallen Angels of Vancouver. S…
ytc_UgyAdhKEi…
The best thing AI can do is to bring humanity fighting against each other - soun…
ytc_UgzdAOcal…
@Goudlock The mess begins when you remove Humans entirely though. A human must a…
ytr_Ugxo7kZHr…
It seems like that will be the excuse they'll use after programming AI to do so.…
ytc_Ugyt5pzVi…
AI is controlled and will remain in the hands of a handful of rich men. They wil…
ytc_UgwD0W-fN…
The issue is not that humans would be wiped out by ai the issue is when people d…
ytc_UgzUrYoll…
Comment
The biggest problem with AI is that it bases its "intelligence" on what is most statistically significant. If AI was around 20 years ago, and you asked it for advice about eating butter, etc., it would have recommended margarine as the heart healthy alternative. It is incapable of figuring out what's right. It just regurgitates the most statistically significant data available. Garbage in, garbage out.
youtube
2026-03-25T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwyIXLsWvWhzS-2YFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIRV7jvgGKNfespXx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugybvbfm5hik3wd8rl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyX2HK0718tPbbd3914AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxr7adwK8n_laIwix94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwYumthFjcbUSaArE54AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgycjZ4gx7j2AaFWGw94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw396-oN8fUs6WaL4V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzdpnfWfbt5O_LT8ZJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyabO2DqGU34ToWcDF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
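A raw response like the one above is a JSON array with one object per coded comment. A minimal sketch of how such output could be parsed and indexed by comment ID (field names are taken from the response above; the abbreviated example IDs are copied from it, and the parsing approach itself is an assumption, not the tool's actual implementation):

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgwyIXLsWvWhzS-2YFN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgycjZ4gx7j2AaFWGw94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Parse the model output and index the coding rows by comment ID,
# so a single comment's dimensions can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment.
coding = codes_by_id["ytc_UgycjZ4gx7j2AaFWGw94AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

In practice a real model response may fail to parse (truncated output, extra prose around the JSON), so production code would wrap `json.loads` in error handling.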