Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_Ugzwh6Shi…`: "AI Design tools just got released, and Google Stitch allows people to generate i…"
- `ytc_UgxuGSs4i…`: "AI is kind of just an example generator. Some examples are way to generate and a…"
- `ytc_UgwNqDik9…`: "I believe A.I its a groups of people working behind the computers in India, n th…"
- `ytc_UgxUXiIYX…`: "It won’t clear things up ChatGPT, because Alex refuses to accept the explanation…"
- `ytc_Ugz4TJZuU…`: "A robot is still only a robot...it isnt alive and never will be a real human…"
- `ytc_UgzwbA8Nf…`: "I have thought about this and figured maybe with all that extra money they make,…"
- `ytr_UgydH20nR…`: "That all depends. There are 2 sides to this argument. And the argument is, \"Who …"
- `ytc_UgwooQ7Gd…`: "Companies will tell you “yea we had a couple mistakes but hey our ai is pretty c…"
Comment
This is predictive programming. Machines do what they are told. Ask any AI about a subject not convenient for the elites and it will lie and divert until you get tired. This is a way to wipe out countries and blame AI as if it wasn’t part of the plan.
Computers are not and cannot be conscious. It feels nothing. It cannot be offended nor feel pain. All efforts to show they have consciousness is programmed or imitated. Climate change is another hoax. If they cared about the climate they would stop the daily spraying of the sky and they would mention China and India as great polluters. Stop geoengeneering and most of the pollution would disappear.
youtube · AI Governance · 2025-09-03T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw_6vorjHdciMvuOo94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyoag5S0730trMSBtt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxX5PHtA-RjjQuz1VV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVQwE1AlbKoXgCQPp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwX8HpldYAUyBheF2x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwL0iro5SIrrDtYdep4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzG0wRV5aHwd6QL4hV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwZ2Y0dFRixIv_1z1J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugyi3q9ocNY_xJj95Oh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgydoyW9cc4xzUFJfxN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
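A raw response like the one above can be checked programmatically before its values are loaded into the coding table. The sketch below is a minimal validator: the allowed labels per dimension are inferred only from the records shown in this dump (the full codebook may define additional values), and the function name `validate_response` is illustrative, not part of any real pipeline.

```python
import json

# Allowed labels per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may permit more values than these.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "government", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of problems found (empty = OK)."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"record {i}: {dim}={value!r} not in codebook")
    return problems

raw = ('[{"id":"ytc_Ugw_6vorjHdciMvuOo94AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(validate_response(raw))  # an empty list means every record is well formed
```

Running the validator on each stored response would surface malformed JSON or out-of-codebook labels (a common LLM failure mode) before they reach the dashboard.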