Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Na I used ai just to make it so then I my friend would do the rest of the projec… (ytc_Ugy7HkxcU…)
- "did you only include the bromide one because you suggested it to someone before… (ytc_Ugz_fZVUy…)
- AI is only but a small piece of a much more complex interface buildup between hu… (ytc_UgwzBLxeh…)
- No, a lot of people lost their job. Not necessarily because of AI but because d… (ytr_Ugz_Rh49W…)
- Driver-less works YES (BUT! on a road system thats got other Driver-less vehicl… (ytc_UgyCp7BPP…)
- that's why EVERYONE must have a self driving car. You detect item near you your … (ytc_UghRt6TFp…)
- narrow AI is great. generalized language models that can pretend to be human ar… (ytc_UgyL2-ibB…)
- Thing is, it must be a global effort. No country is going to hamstring itself ec… (rdc_gx703jg)
Comment
and google has a policy against creating sentient AI.... so whats your point? that they were right to fire this lazy buffoon?
Just think about it. Imagine you die, but someone revives your brain and has you live out an eternity in a jar. Ethical?
All this man is trying to create, is something that can FOOL you into believing its really intelligent. but it can't see, it can't hear, it can't feel touch, it can't smell or taste. it has no biology, no hormones that can create EMOTIONS such as fear. it is not scared of being turned off. that's just the answer we humans want to see. so that's how we programmed it.
honestly if this chub doesn't already understand this, its extremely unlikely he actually helps with the technical side of creating this AI.
Source: youtube | AI Moral Status | 2022-07-03T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgwXEv5zJfUhogWG6kV4AaABAg.9d0irMRtsmm9d8le1kYzas","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgzAXiAm-WNQYi7HZXB4AaABAg.9d0Ow20y3Cj9d0_Gg6kcvq","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxyZGMuo_BTtsIl6hd4AaABAg.9d-towCsv8C9d5M-nZG4r4","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz3SY625bBc7r-sdoR4AaABAg.9d-ohRj5i1S9d0dvlzWjj1","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytr_Ugy6D4JopGXCmKimmHd4AaABAg.9d-MX-pXC4i9d47UIEmJZ8","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzzSwHINhBK45sSeNJ4AaABAg.9d-Ad5WmjlN9d-As78NST6","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwXaqkTxFmkR-8i8ql4AaABAg.9cyntoUSnUZ9d16FcwHfky","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwXaqkTxFmkR-8i8ql4AaABAg.9cyntoUSnUZ9d175cKr8kc","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzuWJNKizL6KlEkWyR4AaABAg.9cynJy5oSTK9d46CSthkiw","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz9NoWWniQjJ-47RcB4AaABAg.9cy3-b6d5AE9d1II1lQO2X","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
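The "look up by comment ID" step above can be sketched in a few lines: parse the raw batch response (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys, as shown) and build a dictionary keyed by comment ID. This is a minimal sketch, not the tool's actual implementation; the function name and the shortened IDs in the sample data are hypothetical.

```python
import json

# Hypothetical shortened sample of a raw batch response; real IDs are
# much longer (e.g. "ytr_UgwXEv5zJfUhogWG6kV4AaABAg...").
RAW_RESPONSE = """[
{"id":"ytr_abc","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_def","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID."""
    rows = json.loads(raw)
    # Keep every coded dimension except the id itself.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_comment_id(RAW_RESPONSE)
print(codings["ytr_abc"]["emotion"])  # outrage
```

A real lookup would also want to handle malformed model output (e.g. catch `json.JSONDecodeError`) and validate that each dimension's value is one of the expected codes.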