Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "How can it be used to prove that AI art is not art? Do you define art as somethi…" (ytr_UgzQylmXX…)
- "You need high iq to understand, I have accepted my brain is not capable of makin…" (ytr_UgyyRMJ3D…)
- "I begrudgingly understand why banks are too big to fail but why tech companies? …" (ytc_UgyX6KWzo…)
- "Ai generation is a tool. You can make great art from it just like you can make g…" (ytc_UgwELMns-…)
- "Honestly if you LOVE art as in visual drawings then you'll never give it to an A…" (ytc_UgzthYW8O…)
- "Hello from 2030, AI is still dumb and the bubble imploded. If you are reading th…" (ytc_Ugx4gT12C…)
- "I don't even support the AI "art", but it never supposed to be interesting? That…" (ytc_UgyfuFCjf…)
- "Ex falso quod libet: anything can follow from a falsehood. Simply put, A.I. prob…" (ytc_UgzhnH7Pl…)
Comment
Most of the time I share similar beliefs with Elon. However, in this case, I don't find it hard to believe that he actually wants regulation and a hault in progress. To me, it seems more like he broke off from OpenAI and now he is realizing that he has made a grave error. Elon musk is trying to stop them from becomming the next largest tech company, and Elon gets nothing. He's using people's fear of movies to get his way. Let me put it this way, if we have an AI that can do all of humanity's laborious work, should we really regulate it? Why do we need to work harder tather than smarter. AI is a smarter way to tacles our hard jobs, why should we say no?
youtube · AI Governance · 2023-04-18T04:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyRktXepxWcbhUYrQ54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQTz9fLXAPUpjYu7Z4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxfkOn5G8MfPau3gZ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyAsiGHE6asYD_YEoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4MellDNC2qolkWSl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw9gWVsVaNO4f9BhjZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzEqTtZ6zZIivkef694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyCrbs1vJuUZj83LIh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwA5mdu2NzuU9eliUN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugw6w63LK8WneeuPA6Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
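A raw response like the one above can be parsed and schema-checked before the codes are stored. The sketch below is a minimal example, not the pipeline's actual code: the allowed values for each dimension are inferred from this one batch (the full codebook may define more), and the `ytc_`/`ytr_` ID-prefix check is an assumption based on the IDs shown here.

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from this batch only;
# the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"developer", "government", "distributed", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"outrage", "mixed", "indifference", "fear", "approval", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows that fit the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # IDs in this dump start with "ytc_" (comments) or "ytr_" (replies) -- an assumption.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgyRktXepxWcbhUYrQ54AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
rows = parse_batch(raw)
policy_tally = Counter(r["policy"] for r in rows)
```

Rows that fail validation can then be queued for re-coding rather than silently dropped, which keeps the coded dataset consistent with the codebook.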