Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
You know, I believe we should just replace CEOs. Why did we build something to …
rdc_mjts9az
Man, people really think that AI is profitable?
I use ai for entertainment purp…
ytc_Ugw15i8cW…
9 companies like Boston dynamics designed AI even for battle Should we already b…
ytc_UgzVCNhY-…
This blows my 66 year old, not computer literate mind, away. I am still grapplin…
ytc_Ugx5AY2go…
I've been having the thoughts Geoffrey shared ever since I used a chat bot for a…
ytc_Ugw3I9qMN…
You can't, at least not with LLMs like GPT, Claude, and Gemini.
Because to code…
rdc_o7cix6c
😂😂😂 To kill the AI they'll build nuclear weapons again; next time, if there's a war, the AI…
ytc_UgwiBatIB…
Are we going to have enough energy to feed the AI and enough IT professionals to…
ytc_UgzBTy-OL…
Comment
Nothing to worry because an AI programmer includes syntax "i will not kill humans" hihihi... (u idiot a hacker, a programmer, an evil scientist can easily make malwares in syntaxes by removing the word "NOT", and that is equally very quick in computers , hihihi )
youtube · AI Moral Status · 2022-01-15T14:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
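Each coded record assigns one value per dimension from a closed set. A minimal validation sketch is below; the allowed values are inferred from the responses shown on this page, not a complete codebook:

```python
# Hypothetical validator for one coded record. The allowed values below are
# inferred from the sample responses on this page, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "ban", "regulate", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the allowed set."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

# The record shown in the Coding Result table above.
coded = {"responsibility": "developer", "reasoning": "consequentialist",
         "policy": "liability", "emotion": "fear"}
print(invalid_fields(coded))  # → []
```

A record missing a dimension, or carrying an unexpected value, shows up in the returned list and can be flagged for re-coding.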
Raw LLM Response
```json
[
  {"id":"ytc_UgxjbcjHpDQlJlDmq794AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgybZ9nbRGbppfDGFX94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwBF0m0LW7P5jSvrlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy7U-eWE5dlr2QDjj14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzPnwdhWuwYTvuBCht4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz7zVgL9-ffhEe2Zwp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzL6Yw4nIt8i1n0qAZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwv4oKXNDpSEDU2zy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_QYwoStCmXXVQiiJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzHf9gsnnAG6_CiSNZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
```