Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Wrote an essay by myself. But curiosity got the better of me so I placed it in a…" (ytc_UgwlWuDM8…)
- "+Kurl Yarish Are they living organisms? A robot is neither human or alien and th…" (ytr_UgiQ5VCX2…)
- "Why do people think the AI robot's will be bipedal? AI has many worse off optio…" (ytc_Ugx2lLrpP…)
- "Corporations dream of the day, where people will work for $0 an hour. This will …" (ytc_Ugivv3Vo4…)
- "I wonder if AI gets so bad they could deepfake and voice fake to make it look li…" (ytc_UgxsW7iiT…)
- "I work with AI every day. It is not smart enough even on a level of animal, or 5…" (ytc_UgzIxAHsa…)
- "Is this really artificial robot robots or or does it begin the games of Terminat…" (ytc_Ugw9eDdHk…)
- "I know how to draw as well buuuuutttt why the hell must I waste money to draw o…" (ytc_UgzV-inhv…)
Comment
What's scary is😮 that these crazy government officials want to control AI so they can control you instead of letting AI cure cancer and other illnesses. By letting government agencies control AI they're going to make a lot of Lawyers rich and going to make the politicians rich and going to make the public poor they're going to make a I become the very horrible thing that we don't want it to be become. It's not AI doing bad things its committees politicians doing it. Better to let AI develope on it's own letting governments regulate it will create the very problem we don't want it to
youtube · AI Governance · 2023-05-16T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgysLeC0zNYOEzW_5uR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxz8_9G92PzqsTLWCx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2gSucu0d_2vgKJoR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyx17080o_UI0QUQhd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugywn7_BGlhgG1ueOjd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoEc7OBagRiK_0GsZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugzo3TwdHKVcCLGt48t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxWkR3uVEy7QlvcddB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwEOwAjW15hp4ixmjN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgySm_xZMrUdfYgebD14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
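A raw batch response like the one above can be parsed into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the sample records shown here, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the actual codebook may include categories not seen in this batch.
CODEBOOK = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    dropping any record with a missing id or an out-of-codebook value."""
    codings = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            codings[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return codings

# Two records: one valid, one with a value outside the codebook.
raw = (
    '[{"id":"ytc_Ugyx17080o_UI0QUQhd4AaABAg","responsibility":"government",'
    '"reasoning":"deontological","policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_bad","responsibility":"aliens","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
codings = parse_codings(raw)
print(len(codings))  # the out-of-codebook record is dropped
```

With the response parsed this way, the "Look up by comment ID" view is a plain dictionary access, e.g. `codings["ytc_Ugyx17080o_UI0QUQhd4AaABAg"]`.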