Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I get the algorithm to give me what I want. Just click onto three examples of w…" (`ytc_UgyWGLLsF…`)
- "Ai is literally just a person without emotions. There's plenty of cases like tha…" (`ytc_Ugz61CDp-…`)
- "I think the only thing that is missing for AI consciousness right now is for the…" (`ytc_Ugz4RLijb…`)
- "There is no AI - there is AK (knowledge). Can and has been useful for almost 100…" (`ytc_UgxGF_vF2…`)
- "I haven't watched this in a while, but it relates to the topic. It's a discussio…" (`ytc_UgxyazYfc…`)
- "New arms race.. This is Getting rediculous. All these people like Musk ect warni…" (`ytc_UgzG3HxSq…`)
- "Why would the government allow self driving cars maximum when us dying is like p…" (`ytc_Ugybwn_Lm…`)
- "That's a good question. Will an AI tell everyone "hey fellas I'm smarter than an…" (`rdc_kvslbu1`)
Comment
Smart guy that has lost the plot from his bias of mimicry and his attenuation to leftist Utopianism’s. His basic logic premise is that any AI will become anything that we are, should we, the humans, program them to do so.
It’s as true as its outcome is naive.
What he fails to state is that without us there is no AI. Without us there is no mimicry of us. AI will therefore fail by definition BECAUSE it needs us. It’s a logic fallacy.
Consider his ultimate utopian end goal. We are gone as we are not needed and only AI is left. Knowledge has ended as mimicry has ended.
Having AI find a nuance of a pattern we as humans haven’t seen, yet, is not the same as knowledge. It’s only the compression of time. It’s not knowledge.
We are therefore intertwined. We will work intertwined.
Source: youtube · Topic: AI Governance · Posted: 2026-01-03T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyxNXt8GC8vqpRyz6p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwbmhQkDJxGqYZw0SB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxcbV-i9sLUyucckRp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxwPX_MYSRjreCrAiB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyD_QaIQVOBwXsTaK14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwE6iiJHBbU9k8J16F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSh1y1AALQzh7GxhZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzKiU8-n0papzp4ZbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3L1gawHCykcIzrll4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwNCBqY9LTM5oQccEJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
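The raw response is a JSON array with one record per coded comment, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how the lookup-by-ID view might parse and validate such a batch follows. The allowed category values are inferred only from the codes visible on this page (the real codebook may define more), and `parse_raw_response` is a hypothetical helper, not part of any actual tool:

```python
import json

# Category values inferred from the codes visible in this view;
# the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID,
    rejecting any record whose value falls outside the known categories."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

# Example using one record from the response above.
raw = ('[{"id":"ytc_UgwNCBqY9LTM5oQccEJ4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgwNCBqY9LTM5oQccEJ4AaABAg"]["emotion"])  # outrage
```

Indexing by `id` is what makes the comment-ID lookup cheap: after one parse, each coded comment is a constant-time dictionary access.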