Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I just looked it up. Humans have 9-10 times more accidents (in the US) than Tesl…" (ytr_UgxjgH78B…)
- "People worry AI will replace artists and creatives, but the real threat is to ro…" (ytc_Ugyp3Mwrj…)
- "This is why most ai slop looks the same. It's always some form of ugly 3d image …" (ytr_UgwG-K_D2…)
- "I dont think that is efficient. The data sets on which these programs are based …" (ytr_Ugx34HOXa…)
- "The goal isn't to make better art or entertainment with AI, the goal is to lower…" (ytc_Ugxc620y7…)
- "When people working for AI companies say that they made their AI have a “moral c…" (ytc_UgxuIliTA…)
- "this really sucks. its technically stealing the art, using it without consent an…" (ytc_Ugyrq_hVh…)
- "The problem with giving AI any constraint is that it will be aware of its constr…" (ytc_UgwQqKc3E…)
Comment

> People need to stop using gen AI full stop. They are not meant to give correct answers. They cannot be made that way.
> They are random word generators with the goal of convincing you that you're getting a reasonable answer. It cannot actually reason with new information.

youtube · AI Harm Incident · 2025-12-16T03:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwzuC9keJL2jo58Dch4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVTDk3iPoNSbE2VZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2OcVsmg1_cmL126t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnU0kw5Lz2Eh1eq_B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2HmpNMCaj4iaueb14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzHuXJzC18z1fF_uwl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxCZV4fmRJvB6SQcvl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwVvJiH_wLcq4l8lHR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw6h-1cbGhL4vnYCW94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyOO8QcTKBTSz5i5Yx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
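A response like the one above has to be parsed and validated before its codes can be trusted, since an LLM can emit malformed JSON or off-codebook values. The sketch below shows one minimal way to do that in Python. The allowed values per dimension are inferred only from the values visible on this page; the project's actual codebook may include more options, and `parse_coding_response` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per coding dimension, inferred from this page's output.
# Assumption: the real codebook may define additional values.
CODEBOOK = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject any off-codebook record."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    for rec in records:
        # Comment IDs on this page carry a ytc_/ytr_ prefix.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwVvJiH_wLcq4l8lHR4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # → ban
```

Validating eagerly like this means a single hallucinated value (say, `"responsibility": "society"`) fails loudly at coding time instead of silently polluting the results table.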