Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- People are now realizing they were not raised proper? What if Ai just does not w… (ytc_UgyQKiiUw…)
- I own several 3D printers--machines that are much smarter than your average toas… (ytc_UgjANcp3q…)
- well if u have these things called eyes u can see that it's actually a robot.… (ytc_UgyQZPgP4…)
- Chat GPT - Yapping / Grok - I would pull the lever without hesitation (Justice) / … (ytc_UgxWrkhhO…)
- Ai/AGI will not be a major threat until it’s able to run efficiently on general … (ytc_UgzziEQx9…)
- Every country should and must have their own AI otherwise they must bow to those… (ytc_UgxGNllCD…)
- Logically speaking you're wrong. There is two ways of defining new concepts: eit… (rdc_ohqlim9)
- I am in the trucking biz these things are a safety issue ! They cannot think the… (ytc_Ugwxd_I2k…)
Comment
The thing that strikes me in this debate is who actually wants this thing? Clearly, it's technology people whose financial success or even survival now depends on doing it. They are people who are fascinated by these things. An average person does not want an omnipotent technology to replace almost all things human. There is literally nothing that I want from ai. Is it fun for some people to play with? Sure. Are there things it would be nice to automate? Of course. Do we want or need something so powerful? It will fundamentally alter our existence in ways that we cannot even predict. It's not even promising to do things that most people want. The reason it's getting done is that some people find it interesting and now they are massively incentivized to do it as fast as humanly possible. No on is asking for this. Why are we allowing these people to do this?
youtube · AI Governance · 2025-10-18T17:1… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz54tRSGTf2WUpK3XB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwex0cyIxLK0IvzWZl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxgtOpmkin3sT4E5bt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwSNavfO0EFFp1ZIW14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVqiB2wQ0iw5-d1794AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwf6yGy9XbDbxjJEUZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzALobPq-0dklCoiS54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwBuWyXbWDR7pMniZt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwoUIq-U3tZOcbWT7l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxmPi5LHrKMWrCuytV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
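The lookup-by-comment-ID view above can be sketched as a small parsing step: read the raw batch response, then key each coding record by its `id` field. This is a minimal illustration, not the tool's actual implementation; the function name `index_by_id` is hypothetical, and the two sample records are copied from the raw response shown above.

```python
import json

# Hypothetical sketch: a raw LLM batch response, as stored by the tool.
# These two records are taken verbatim from the response above; the
# field names match the coding dimensions (responsibility, reasoning,
# policy, emotion).
raw_response = """
[
  {"id": "ytc_Ugwf6yGy9XbDbxjJEUZ4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz54tRSGTf2WUpK3XB4AaABAg",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_Ugwf6yGy9XbDbxjJEUZ4AaABAg"]["policy"])  # → ban
```

With such an index, inspecting "the exact model output for any coded comment" is a single dictionary lookup rather than a scan of the whole response.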