Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
- "Very informative talks and discussion, Connor Leahy, Roman Yampolskiy, Jaan Tall…" (ytc_Ugzv60J2K…)
- "Programmer here, I really understand you, and i do art myself - i don't think AI…" (ytc_UgwFxdDRw…)
- "also, in terms of making art AI will never be able to scale on that because AI a…" (ytr_UgwF1esIS…)
- "It doesn't matter if out gen of people think it uninterested the future will kno…" (ytc_UgzFbHQX4…)
- "You are underestimating just how powerful the ai system Tesla has built. It is p…" (ytr_UgzMhJ9bZ…)
- "1:34:51 it's certainly true that the majority of industry experts have become co…" (ytc_UgzNhK6NL…)
- "You know what there's a lot of truth to it. For eg I made a request to write a r…" (ytc_UgzT80bvY…)
- "What people continue to misunderstand is that AI, as it stands right now, is not…" (ytc_UgxekzN2r…)
Comment (youtube · AI Governance · 2023-05-02T15:5…)

> I wouldnt worry about ai taking over or being a danger. Its more like ai will enable our salvation or librrate us from the druggery of working forever. These ai systems will once and for all abolish salvery. And lets not forget what a female ai robot would do for man. I think this talk about the danger with ai is hyperbolic. Governments always attempt to regulate regulate, but they dont represent the masses. Ai is going to enable us to traverse the galaxy in search of new worlds. Check the past history. Even in the middle ages they were afraid of science advancement.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwldno69nkJe-0F4mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx7rLGM_eHIsuroOIZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyzusTvCUEXQPRQyFB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJ2kLMIFNvth7KQzJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyYx0V58M6-hJO2p614AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzlIjG9x6cCOihLHg94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypEQbShRpLVO0Smwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzNBL5lvgyylid6J9d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwYhSLX6Faguk9YoPh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgziSRqDbEvsbxn-juh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
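The raw response is a JSON array of per-comment codes, one object per comment in the batch, keyed by comment ID. A minimal sketch of recovering a single comment's codes from such a response (the field names come from the response above; `raw_response` and `index_by_id` are illustrative names, with the array truncated to two entries for brevity):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (two entries shown).
raw_response = """[
  {"id":"ytc_Ugwldno69nkJe-0F4mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7rLGM_eHIsuroOIZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(response_text):
    """Parse a batch response and index the coded dimensions by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
print(codes["ytc_Ugwldno69nkJe-0F4mB4AaABAg"]["emotion"])  # approval
```

The first entry's values (responsibility: none, reasoning: consequentialist, policy: none, emotion: approval) are the ones rendered in the Coding Result table above.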