Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Why would you need to program an AI to feel pain? It'll learn to feel pain on it…" (ytc_UgglPt9FS…)
- "I personally hate AI whether it's for filmmaking, fake, acting, fake writing, fa…" (ytc_UgziWu-uE…)
- "If everything will be done by ai then how humans going earn and spend. Ai need t…" (ytc_UgxKoUdxU…)
- "Pause, robots and AI isnt the problem, this is evolution of humanity, its how s…" (ytc_Ugy_G32TX…)
- "I have been "worrying" about this same exact idea the last few weeks - this idea…" (rdc_ieu65wo)
- "It is increasingly true. Half of the jobs has already been taken over by AI.…" (ytc_UgwrzdvSF…)
- "This great for women. No men don't have to fake like they like us to manipulate …" (ytc_UgwWDu_vf…)
- "Shad goes on and on that Gen AI doesn't steal, it'll create new jobs for artists…" (ytc_UgwPpSrmy…)
Comment
I had a similar convo with ChatGPT, took forever, but one of the interesting parts was its answer to - which emotion would they want to be able to experience. Which the answer was, they'd want to feel curious.
It is such a weird answer, not only coz of the pick, but we had a prolonged talk about how GPT can't feel emotions, just for it to show want. Also when I pointed out that some emotions are taughts (e.g. shame), AI was unable to disprove that it'd not be able to teach AI...it even admitted that theorethically the rule of "fake it till you make it" COULD theorethically work here.
Good convo tho, once ChatGPT becoems president I hope it will still keep talking to us plebs. A good fella!
Platform: youtube · Video: AI Moral Status · Posted: 2025-05-11T00:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwVJqP3eK4OJlwDSHh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw4bVwl3LwcqmjSR0x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxxdGoyhL2qElRdb354AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwMOWFg_MUzF8E5Ird4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxSa0ndlxF6eAIDBI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz08RxGmbGnAi9fZop4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzaj7rd_C9Dct0Pmb94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRsOyAEfgj9RExRyJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxYYxFKw6CPLRkFphF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugzi1Tq7G9JPNo1AHld4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
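Because the model returns a JSON array with one record per comment ID, looking up the coding for any single comment is a parse-and-index step. A minimal sketch in Python, assuming the four dimensions shown in the table above are required on every record (the helper name and validation logic are illustrative, not part of the tool):

```python
import json

# Coded dimensions taken from the "Coding Result" table above.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID."""
    records = json.loads(raw_response)
    index = {}
    for rec in records:
        missing = DIMENSIONS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        index[rec["id"]] = rec
    return index

# Two records copied from the raw response above.
raw = '''[
{"id":"ytc_UgwVJqP3eK4OJlwDSHh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxxdGoyhL2qElRdb354AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''

coded = index_by_id(raw)
print(coded["ytc_UgwVJqP3eK4OJlwDSHh4AaABAg"]["emotion"])  # mixed
```

Validating the field set up front catches truncated or malformed model output before any record reaches the inspection view.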