Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "If I am legend robot and Mike Tyson had a son this didn’t would be the robot…" (ytc_Ugxa115id…)
- "It’s funny because “very smart intern” has been used to describe LLM’s for about…" (rdc_mbq9i6x)
- "This is real? Ahahahahaha I knew I chose wisely being Gemini exclusive. So chat …" (ytr_UgzdHgEiU…)
- "@klarahaplova9098 That really doesn't matter, though. For one thing, you can't r…" (ytr_Ugwbt_rTc…)
- "So society is racist and Sexist 🙄 and now so is the AI. But you know nothing nee…" (ytc_Ugx3JQdln…)
- "This was a good debate with good guests and arguments on both sides. Though I'm …" (ytc_Ugw0aQNYW…)
- "HR is biased too. The main difference is that the bias for any particular LLM ca…" (rdc_luzsin6)
- "People forget that there's one thing AI can't take, real state. We are heading…" (ytc_Ugw57v_TS…)
Comment
No AI does not think at the current time. What it does is match your inquiry to the more common human responses. If you ask - who was the first to have controlled powered flight - it will say the Wright Brothers - which is incorrect; it excludes lighter than air for no reason other than most humans do. The proof of this is that current AI models have to be trained with human inputs. If there were no human inputs - say how to drive a segment of roadway - AI can't figure it out - it uses human inputs to determine what humans would do - humans are wrong a lot. IF Ai thought - it wouldn't need human inputs and would find the facts based on real facts and not human opinions or actions. If such AI ever came to exist - we likely would be killed off as being a threat.
Source: youtube · AI Moral Status · 2026-03-02T06:0… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwiEDvqTesk_UlEzih4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOxOsuEn2ejy8jzAl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwJ3Hkh826nX_zG49N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-23u46madYKseenJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwgjwzYQHflu2xOGY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxbntoRWLqdexkk_054AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyzcQCCev53NnCgI4N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx34S08LynliShVHm94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwhQLZ_nBt2L9ydsUB4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzEiHxNOe8jKUkkkhJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
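The raw response is a JSON array of per-comment codings, one object per comment ID with the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and looked up by comment ID, using two entries copied from the array above (the parsing code itself is an illustration, not part of the original tool):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_UgwiEDvqTesk_UlEzih4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-23u46madYKseenJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding by its ID.
coding = codings["ytc_Ugw-23u46madYKseenJ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

Keying the parsed array by `id` is what makes the "Look up by comment ID" view above possible without rescanning the whole response.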