Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “Why the hell are we selling AI chips overseas? Even more dangerous than nuclear …” — ytc_UgyGo-Gdk…
- “Does it make sense that I’m pro-ai art in, but only in a vacuum? Even people who…” — ytc_Ugy7PKs81…
- “Hi, I am disabled. I'm autistic and have aphantasia and very poor spatial reason…” — ytc_UgyE671Qb…
- “Comparing AI software to computive software is like comparing a biplane to a ste…” — ytc_UgxUywcP9…
- “Ai "art" is so odd, you made nothing. You told a machine to find the best arrang…” — ytc_UgzjK6AFQ…
- “I have been saying this for years! The sad part is that those two gentlemen real…” — ytc_UgwQsMhrS…
- “any ai creation should be marked as such... let people that don't mind it enjoy …” — ytc_UgxjJgW8H…
- “I think the problem is that people are forgetting what money was supposed to rep…” — ytc_UgwiGorsz…
Comment
It seems like everyone else talking about and using AI is using different systems from me. Everything I ask about which I have knowledge of came back with far too many hallucinations to have an effect on many jobs.
Society has many problems and economic issues are many, and AI will affect some jobs, but nothing on the scale this guy and many others are scaremongering about.
Anyone remember Y2K? That was hype. I worked (and still work) in IT, and my employer made a fortune out of the Y2K fear. Thankfully it was easily discredited by the date ticking by, and everyone shrugged their shoulders.
If society breaks down it will be due to dumb people believing dumb stuff AI hallucinated, and unfortunately there is no date to tick by like Y2K.
Hopefully all these big tech companies will be forced to remove AI from their front pages; right now the nonsense it produces is the biggest danger to our futures.
youtube · AI Governance · 2025-09-04T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgztJGt2CnPf3dIo75d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyZtqiRgsxxFdjeWPV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwY-PoVyE_SGqZnVG54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZzuZmYxl7NM-1hsx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwkwPKUbH8nJ7FbkiZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwx-cZ_U7chLsrSYWh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyiMyD7EANTV428Da94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyiRsk4kleJrZAwCOh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy4E-NuNJK_4wA5UYB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugydn61qq3hBGVIU0Ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
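A response like the one above can be consumed with a small parser that keeps only well-formed rows. This is a minimal sketch, not the pipeline's actual code: the function name `parse_coding_response` is hypothetical, and the per-dimension vocabularies are inferred only from the values visible in this sample (the full codebook may define more, so `emotion` is left unconstrained here).

```python
import json

# Dimension vocabularies inferred from the sample response above;
# the real codebook may allow additional values.
VOCAB = {
    "responsibility": {"none", "unclear", "distributed", "ai_itself",
                       "company", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "ban"},
}

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: dims},
    skipping rows with missing fields or out-of-vocabulary values."""
    coded = {}
    for row in json.loads(raw):
        if "id" not in row:
            continue
        dims = {k: row.get(k) for k in DIMENSIONS}
        if None in dims.values():
            continue  # a required dimension is missing
        # Reject values outside the known vocabularies (emotion is open-ended).
        if any(v not in VOCAB[k] for k, v in dims.items() if k in VOCAB):
            continue
        coded[row["id"]] = dims
    return coded


raw = ('[{"id":"ytc_UgztJGt2CnPf3dIo75d4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(parse_coding_response(raw))
```

Skipping malformed rows rather than raising keeps one bad model output from discarding the whole batch; dropped IDs can then be re-queued for recoding.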