Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If you're out here playing games like this you deserve to have all your money ta…" (ytc_UgyycjymI…)
- "I see a ton of artists respond to AI generated pictures online by drawing their …" (ytc_Ugw_lS1Jq…)
- "As a software developer with 25 years of experience who is no longer writing cod…" (ytc_Ugzm4NGm9…)
- "If AI take all the job.. what human option left is.. War Against the Rich.. only…" (ytc_UgxcTdyHb…)
- "@nirorit papers have been published where ai was able to create its own training…" (ytr_Ugz7kkmiy…)
- "In the near future AI will take over white collar jobs. People that refuse to le…" (ytc_UgwssTc5n…)
- "A new Overton windows has been opened, the goal is that we tottaly accept all AI…" (ytc_UgwXz0Vax…)
- "I agree with you totally. theoretically, the use of AI Generator making use of p…" (ytc_UgzFe1_jH…)
Comment
8:30 being smarter isn't the issue, developing own goals, hiding them from us and maybe acting against our human best interests might become an issue which i attribute to consiousness and would have to include all humans where no two persons have 100% the same goals. Aslong AI would "just" be a useful tool without consiousness i have no issue using them in any shape or form for any purpose(regulated in terms of harmful missuse). That calculus changes should ever a machine develop consiousness with its own will and and self interests, then we should maybe have a system of law read that considers giving personhood and rights to such entities under which they can still be helpfull in return for freedoms, that discussion should then but also be held with such entities and not only about them.
Source: youtube · AI Governance · 2024-11-12T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwWBO4fzkcfxXZzfyh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTrqp5maqpl1o8AgN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwJ2ZBv_87Ma3lldOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy3_FrrLbfNKR629w94AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwLL8PhTc3qbDuKK5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwHOQVTNjpw538Hup4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgygeKQfWDoiMNHiceB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw8URcwZNEfrTsn3214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0XrPpV6-UPam4ZKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxEbigSSMdju1IQlht4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}]
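Raw responses like the one above occasionally come back malformed or with values outside the coding scheme, so it is worth validating each batch before storing it. The sketch below is one minimal way to do that in Python; the `CODEBOOK` values are inferred from the responses shown on this page (the full codebook may differ), and `parse_coding_response` is a hypothetical helper name, not part of the tool itself.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above. Assumption: the real codebook may contain additional values.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed", "unclear"},
}

def parse_coding_response(raw):
    """Parse a raw LLM coding response and reject out-of-codebook values.

    Returns the list of row dicts; raises ValueError on an unknown code
    (json.loads itself raises on malformed JSON), so a bad batch fails
    loudly instead of being silently stored.
    """
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unknown {dim}={row.get(dim)!r}")
    return rows

# Usage with a single made-up row:
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]'
rows = parse_coding_response(raw)
print(rows[0]["id"])  # ytc_example
```

A batch whose dimensions all pass is safe to write to the results table; a rejected batch can simply be re-requested from the model.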