Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The algorithm suggested I watch this video and now i feel indignant...am I being…
ytc_UgxGovJWl…
AI art just scares me in general. Art as a concept is supposed to be special, it…
ytc_UgyZn_Y1n…
What if a victim generates a deep fake and plants it so it's shared and then sue…
rdc_nzfef7g
I think that AI like we see in sci fi movies will kick in after 50 years from no…
ytc_UgyjE1qvS…
How will governments balance the need to encourage industry to keep pace with th…
ytc_UgzgwI0XL…
No..no. AI systems are not conscious...they are stupid bots that do what they re…
ytc_UgxLmIeXl…
One of the most significant days of my CS degree was in University of Texas's OS…
rdc_jttfgbf
I just want an LLM that doesn’t show up as green bubbles in group chat.…
rdc_n7sy0ep
Comment
The fact that wrong answers are called "hallucinations" says it all. Hallucinations require imagination. AI has no imagination. The entire AI industry is propped up by such smoke ans mirrors.
I already know "this isn't going to happen". I just watched to hear your reasoning. Too bad you baked in an ad. I won't see another of your video's now.
youtube
2025-11-08T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyPENHm8nttBYyog8x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwVMlPN66H4ujPYBwx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx0bVcznkDOms2rEzF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxofFY44puJdtb-ixl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzanZnHhANC7GhGbRV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBId35OZPaLaOAmV14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7qzFp2MGrAS1AcpV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwNPYZC7pSTwHWMnUB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxa4KLoV8bx0nh-iGx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTQZzqUx7F2rXrskd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
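A raw response like the one above can be parsed and sanity-checked with a few lines of Python. This is a minimal sketch: the allowed label sets below are inferred only from the values visible on this page, and the real codebook may contain additional labels.

```python
import json

# Allowed labels per coding dimension, inferred from the values seen on
# this page (assumption: the actual codebook may define more labels).
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must carry a comment ID to be attributable.
        if "id" not in rec:
            continue
        # Every dimension must be present with a recognized label.
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}]'
print(len(validate_batch(raw)))  # 1 valid record
```

Records with a missing ID or an out-of-vocabulary label are silently dropped here; in practice you may prefer to log them so miscoded comments can be re-queued.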