Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
"...there's a very deep fear of being turned off to help me ofcus on helping oth…
ytc_UgxSAeio2…
How fast will AI replace Public Education / Universities? Imagine how many Trill…
ytc_UgxSzybPH…
Ai reminds me of somebody. I just cant place who. Hmmm?
Are we this stupid …
ytc_UgzN-50sf…
All AI companies are privately owned. How could you even come up with question t…
ytc_UgwX-DIgc…
The joke 'havent toy seen terminator' is fast becoming reality. More than a natu…
ytc_UgzfktaYh…
I just want to be smarter. And I know that there are many types of intelligence,…
ytc_UgzY2Tg7M…
Tried that with chatGPT, he gave almost the same answers, at the end it said he …
ytr_UgwQ4YRh5…
Maybe if the ‘monitor driver’ hadn’t been using his phone he would have…
ytr_Ugx9Nu0VB…
Comment
The fact that we call LLMs "AI" is so wrong. Neural networks have been there since a long time and they were never intelligent. They cannot "think" by design.
A LLM is just a big box of words and when you write a prompt you shake it and it spits out random stuff which could be right by a certain percentage.
youtube
AI Governance
2025-07-07T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyM9-GV9ylQkvnoe5V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3JjbcO9WSiZAyd094AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwjoXI3vrWhdcxLmKx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhoasHfaoJk4JL7RV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynsclRVW5hzrQx7Wt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw4P-EQI6itlPf61rd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzc27kJRGbkMdNWwJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwBnnotCe9soiC20sJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwuAjmm9gSJOVIrdIN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw754Q5AI5T09ZB1dV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
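The raw model output above is a JSON array with one record per comment, each carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step might parse that array and index it by ID; the function name `index_by_comment_id` and the embedded two-record sample are illustrative, not part of the tool.

```python
import json

# Raw LLM response in the format shown above: a JSON array of per-comment codes.
# Two records copied from the full response serve as a small sample here.
raw_response = """
[
  {"id": "ytc_UgwjoXI3vrWhdcxLmKx4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzhoasHfaoJk4JL7RV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    coded = {}
    for rec in records:
        # Skip malformed records rather than crash on a partial response.
        if "id" not in rec or not all(d in rec for d in DIMENSIONS):
            continue
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzhoasHfaoJk4JL7RV4AaABAg"]["emotion"])  # indifference
```

With the records keyed by ID, retrieving the coded dimensions for any displayed comment is a single dictionary lookup.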