Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
If anyone can’t see where this is leading it’s a world which is all of us being connected to AI and having our human feelings become extinct and our emotions and free will removed, I would speculate that it’s already happening in experimental cases that have had horrific outcomes, crimes of evil that we just get the feeling something is off for example yet we don’t see it and put it down to they show no remorse they must be a psychopath they don’t even blink or display any body language, what if they have been experimented on and are connected with AI? What if it’s the modern day MK Ultra minus the abuse and conditioning? This stuff needs to be limited it can have its uses but already i suspect it’s uses are being abused spa radically
youtube · AI Moral Status · 2025-07-31T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzKKjQ_jtasido8WXJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwtPPWkNszsiCQFFQl4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyfAC7gnwiESTsXrm94AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz6YOxv51lNiUYJZwZ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwkZ2oz7VPMpnS6kXB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyVQhffLZldYpfecoV4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx8xPRXGri_EHUFJO54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyYxGn0r7X_Oz2aWmV4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy25zYZbtwvrK6Uu414AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugzh62yPJMyuai-Fm554AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]
```