Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgzggR8DI…`: "Time to find a nice secluded cave to live in to delay my inevitable interactive …"
- `ytc_UgyQsEnfl…`: "5:00 AI is even picking up terrible English in order to seem more real. Dear Ch…"
- `ytc_UgyvEdN-y…`: "I must say this feels more like a general rant on capitalism than self-driving c…"
- `ytc_UgwJvrZ_u…`: "I started off horrible at art, but I'm getting pretty good at it. If you practic…"
- `ytc_UgwPzlSxy…`: "I saw someone who did Fran Bow & Mr. Midnight, and well as Coraline & the cat fr…"
- `ytc_UgwAqZQVj…`: "Great video! Just pay for your robot like a capitalist. People take out 30 mor…"
- `ytc_Ugz3GvFeB…`: "Why don't they ask chatgpt where the best place to build centres and source wat…"
- `ytc_UgxsOFfyo…`: "I will take the ai crash as to the good outcome we will rebuild if…"
Comment

> I think the enthousiasme the scientist have to create A.I. is a bit scary. Emagine, we people do not have one oppinion wenn it comes to all diffenrent kinds of things in life. There are a lot of people (who do have a consciousness) that like to control others. Now imagine a A.I. that can actually "speek" and use the language of every Software application and connection that connects us, and then there is a A.I. who has made up his "mind". Its all funny now. But everything can be used as a waepon. I personally like robots, but i´m not so sure about A.I.

youtube · AI Moral Status · 2022-11-04T14:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhoyCWrUVLu8SzDwZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzStsG2WbulGgyHgGR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyH-UU5bqW_mrRC49p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxel7SIEvIBJ4LDmsp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwfAlcXUTQ7R_e7GcB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz-lVDErwVL0qQeqjN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJHo9GLUWd5wZhxQt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzet9C7JdBZKCroFAx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyeaLT5YgdMvEMx8N54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw7nSO6KG35PMGNOaR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
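The raw response is a JSON array with one object per comment ID, each carrying the four coding dimensions shown in the table above. A minimal parsing-and-validation sketch follows; the allowed value sets are inferred only from the responses visible on this page, so the real codebook may include additional categories:

```python
import json

# Allowed values per dimension, inferred from the sample responses above;
# this is an assumption -- the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coding against the codebook."""
    rows = json.loads(raw)
    for row in rows:
        # YouTube comment IDs in this dataset all carry the ytc_ prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = validate_codings(raw)
print(len(rows))  # → 1
```

Validating against a fixed value set catches the common failure mode where the model invents an off-codebook label, so bad rows surface before they reach the coding table.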