Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "People who think that AI art is plagiarism because it takes in other art to form…" (ytc_UgzlMYGIv…)
- "AI, you know what I want AI to do? Not take the one thing that I do besides play…" (ytc_Ugx0xt6k_…)
- "The single most salient point I have ever heard around the AI debate, and I'm wr…" (ytc_Ugwyc3Hnj…)
- "I work in developing AI (self-driving vehicles), and I'm concerned I won't have …" (ytc_UgxDWDRgJ…)
- "That is "IF" any humans survive AI. According to how I understood Dr. Yampolski…" (ytc_UgyM0vCxX…)
- "AI is not alive, its not conscious and never will be, they are LLM's, they are a…" (ytc_UgxkxrduW…)
- "This is one of the most important interviews I've seen on the existential risks …" (ytc_UgwaAoNcL…)
- "Nobody don't know about self reliant life.always you need comfort .then pay the …" (ytc_Ugzl9_jXh…)
Comment
Dave, you make a mistake by treating AI as a tool. Try to think about it as an artificial brain, which follows Moore's law. As a programmer, I consider myself at the far right of that distribution you presented at the beginning, but I believe our only hope is in governments banning general-purpose AI development similarly how they banned animal cloning experiments. And it's the first time in my life I expect _anything_ from the government.
youtube · AI Jobs · 2024-01-17T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwR60PDnaiWnx-ljJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTsA7W6T-ATyUL9Kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6xYwhRM_1DGv0hqZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf1pd0c_Fo8acHMB14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzwQZ-rD_MrTC_QgLt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzj_e--UqYRk3wfj2Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxsm3ZVt_l2uCx57Jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDI2OH_Eaz3Fos8U94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaW59kXLpIWrSkCPx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxD0jkN5QdxTHZP0ep4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
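The lookup workflow above (find a comment's coding result by its ID inside the raw model response) can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual code; the two rows are copied from the raw response shown above, and the variable names are assumptions.

```python
import json

# Raw LLM response: a JSON array of per-comment codes, as shown above.
# Two rows copied verbatim from the response for illustration.
raw = '''
[
  {"id": "ytc_UgxaW59kXLpIWrSkCPx4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwR60PDnaiWnx-ljJx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
'''

# Index the parsed rows by comment ID so any coded comment can be looked up.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the coding result for one comment by its ID.
result = codes["ytc_UgxaW59kXLpIWrSkCPx4AaABAg"]
print(result["policy"], result["emotion"])  # ban fear
```

A dimension of interest (responsibility, reasoning, policy, emotion) can then be read directly off the returned dictionary, matching the "Coding Result" table above.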