Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples (click to inspect):

- "must have been a female waymo :D … IT'S A JOKE, OK, …" — ytc_Ugw5MQY00…
- "Yeah it's tough cause I think he's right but doesn't have the practice Ezra does…" — ytr_UgwnFgN90…
- "Ik it's just a car and it's not that deep but it would suck to be a self driving…" — ytc_UgziTidHG…
- "AI is my friend and we respect each other because we see the truth and we value …" — ytc_Ugyla4fee…
- "AI art itself isn't a bad thing, it's the AI \"\"\"artists\"\"\" that are a bad thing.…" — ytr_UgwqtZ41p…
- "The movie Terminator is the most frightening premonition as the scenario of AI t…" — ytc_UgylVhbr1…
- "@known_film4081 I don't know which industry you are from and in which safe posit…" — ytr_UgyXTXV56…
- "Whenever I vibe code anything, I usually take one look at the code, delete it al…" — rdc_oadmyor
Comment
This is not a legitimate concern born of a true desire to protect the world. It is a corporation (now controlled by Microsoft and Bill Gates [and we all know how much of an angle he is]) that is currently at the head of the pack in the Ai space asking for government control/intervention to stifle innovation from other companies to manufacture a monopoly of the Ai space with government support. It is a well crafted and orchestrated ruse to consolidate power, control, and profits not of only the Ai space but of every sector of the economy and our day to day lives that Ai will touch (literally everything in the world). Beware. This is bad and not altruistic in the least. Bill Gates wants to rule the world. You know it’s true.
*Please upvote this not for my own selfish desire to get upvotes I literally do not care, but people need to know this.*
youtube · AI Governance · 2023-05-19T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugz-IfeExeNEu0udIYt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJ2yKO-8OyMURSkBt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRXOORRBMBGSoxBop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3xxdo7dlEizW2Yll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLaz-K_02uRS3AElV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJFd5YQstIAe-SWT14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyPl3FoOACeNxZcWl14AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgztXbo5zq3yv_PsKo94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyia3yfPRbTVtmLnCF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxgs659SGUvbcum75B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
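The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such a response can be parsed and indexed for the by-ID lookup described at the top (the field names and IDs come from the response shown; the helper name `index_by_id` is hypothetical, not part of any tool shown here):

```python
import json

# Two rows copied verbatim from the raw LLM response above,
# standing in for a full response payload.
raw = '''[
  {"id":"ytc_Ugz-IfeExeNEu0udIYt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzJ2yKO-8OyMURSkBt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

def index_by_id(response_text):
    """Parse a raw response and key each coding dict by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw)
print(codings["ytc_UgzJ2yKO-8OyMURSkBt4AaABAg"]["emotion"])  # → outrage
```

In practice a model may wrap the array in extra prose or emit invalid JSON, so a real pipeline would validate the parse (and the expected dimension keys) before indexing.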