Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or browse a random sample:

- "Once the proverbial cats out of the bag with a.i the advancement to stage 10 wil…" (ytc_UgxkqWdRz…)
- "Buh bye actors. Hello my honey, hello my baby, hello my rag time gaaaallll 😂…" (ytc_Ugzg0c_67…)
- "Aida look at you now getting turned down by a robot like the like what the fuc l…" (ytc_Ugw_Aal_G…)
- "Self driving cars SHOULD be able to recognize all of that, and stop appropriatel…" (ytr_UgwPUTkEG…)
- "In the book of Daniel in the Old Testament in chapter 11 verses 36-39 (37 is the…" (ytc_Ugwopk9-9…)
- "The essay isn’t a review of AI security literature because it isn’t making an AI…" (rdc_oe2idtt)
- "Many can't discern A.I. from reality. That and the data they're using to compil…" (ytc_Ugylvb6oO…)
- "No mention of shipping jobs overseas and bringing more foreign workers through H…" (ytc_UgwlSFAEx…)
Comment
> A feel A.I. is a dangerous but in some way necessary, both Einstein and Hawking's both feared this, and who am I to argue with two of the greatest minds in history. We need robots to venture out into space for this is the future of are species. But there have been many cases where robots of stated aggression towards humanity. It must also be mentioned that the military will see robots as a cheap and effective combat weapon, after all look at modern military drones. I think we must not only limit the access robots have when it comes to the future of are species, we must limit the actions of people in power to their use.
Source: youtube · Video: "AI Moral Status" · Posted: 2020-07-08T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxcmcQ7NC6tTawxFSB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2pn3Ow0nGWen_hqp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyaBmvTaJlAsqMjqnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwT012w2H0ZKqST9sV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyCmOgYR80MN2uOPWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZrC4F4vxpMB0o7-B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2igWhyhy0-pCOOWN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0oOfFIYmSJLF-pGF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzKAkwfDHOOCnnzrlt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYoSUfqUa_JW0GoVp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
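A raw response like the one above can be parsed and indexed by comment ID so that any coded comment's output can be looked up directly. The sketch below is a minimal illustration, not the tool's actual implementation: the function name `parse_coding_response` is hypothetical, and the `ALLOWED` value sets are inferred only from the values visible on this page — the real codebook may include other values.

```python
import json

# Assumed codebook: allowed values per dimension, inferred from the
# values seen in this page's raw response (may be incomplete).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    dropping any record that uses a value outside the assumed codebook."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        dims = {k: v for k, v in rec.items() if k != "id"}
        if all(v in ALLOWED.get(k, set()) for k, v in dims.items()):
            by_id[rec["id"]] = dims
    return by_id

# Example lookup with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["policy"])  # -> regulate
```

Records with out-of-codebook values are silently dropped here; a production version would more likely log them for review, since malformed LLM output is itself a signal worth inspecting.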