Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “i dont wanna work anymore. This system "do a job to survive" sucks. So Ai is wel…” (ytr_Ugwmt92Ii…)
- “I dont use AI. OR I should say i dont TALK to any AI! I prefer to talk to a R…” (ytc_UgwKya3tb…)
- “At the beginning of the transition, the existence of robot labor will impact peo…” (ytc_Uggcfz830…)
- “I remember when I asked AI to fix an excel macro script. It gave me a shittier v…” (ytc_UgxjID38E…)
- “LoL detroit become human also if robot or smart robot:A.I do ever get felling th…” (ytc_UgxJjtlBK…)
- “Unlike the computer engineer comment somewhere in here, I'm not an engineer, jus…” (ytc_UgzWkEJzP…)
- “AI is not that smart or its programmed to give very poorly thought answers, say …” (ytc_Ugy5rLkop…)
- “On the subject of Ais saying they're conscious, other people say that they are c…” (ytc_UgyunZKpf…)
Comment
Let’s get this clear. They want more regulations NOT because they think it’s unsafe or may Cause havoc. Rather they only want more regulations done by the government because there was a guy who got the code ChatGPT runs on and essentially made it free to use for all. (chat GPTs paid version) the creators were pissed to the point that they emailed the hacker they would take legal action if it was still up, to which to this day it is still up 😂 this is just due their own self interaction not because they are concerned about the public
youtube · AI Governance · 2023-05-17T14:1… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugz9V4OHAROGAKiom0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz60Gho3GHoN7idMB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxDBBNy2gR27gsQsH94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyGAHe0tauj1OsbyR14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzKGYlGuJ6okuIeZuJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytykIe-b8DsmwskP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxydU-TzdSCpt_nDah4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8Ew39V6gun_D87ep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugyq1KsteNBOpHQiEF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwWQKOrY-3zd9n36KN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
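The raw response above is a JSON array with one object per coded comment: the comment ID plus the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed to support the look-up-by-comment-ID workflow — the `index_by_id` helper and the inlined one-row response here are illustrative, not part of the tool itself:

```python
import json

# One row copied from the raw response above, inlined for the sketch.
raw_response = """[
  {"id": "ytc_Ugy8Ew39V6gun_D87ep4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "indifference"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded dimensions by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_Ugy8Ew39V6gun_D87ep4AaABAg"]["policy"])  # liability
```

With all responses indexed this way, inspecting the exact model output for any coded comment is a single dictionary lookup on its ID.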