Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- You’re assuming these LLMs are as far as AI goes in the next several years.… (ytr_UgwgdVT8F…)
- It's funny because back then I would've taken it like a compliment thinking they… (ytr_Ugw91CTL1…)
- On the month of July, i went to an art convention with the intention of buying s… (ytc_UgxvZmTNy…)
- I think it does look ai because in most ai images of people, they look super shi… (ytc_UgwrGDdU6…)
- Hi there! In the video, the presenter asked the robot about the meaning of her n… (ytr_UgyyWsL5M…)
- Automated driving around school buses was always going to be a problem. And if… (rdc_nt0mk0s)
- Of course the founder of an AI company is the one telling you that Ai isn't at f… (ytc_Ugx1LXNRn…)
- OpenAI "quietly" buying all the GPUs and buying all the memory "one company's GP… (ytc_UgwaOSVLh…)
Comment
This is a hard one for me because these are two of my most influential “superhero heroes” I look up to you in my life right now. However, they are pitching something incorrectly, which is that these models could be getting consciousness sometime soon we don’t know, we just do not know the answer to whether they can or not be conscious and to say I don’t know is the right answer to this in general.
The second thing is the Turing test is not something that has been passed in my eyes, genuinely ask yourself what is the hearing test? What am I testing for and do that with the model and ask yourself? Is it actually passing this test right now? I have seen Claude do that for me multiple occasions, but it still has robotic features that are baked into it design that make it feel extremely robotic and it is not Turing test worthy
youtube
AI Governance
2026-03-11T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugxwp7VugAwzIR2XeMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzywSBDetSAodsCwRl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8DQEiyL27Sdde_kF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxj3ndGIgaZWmdsZal4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw8q8zeNqWMxFSZX8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHkkXiXpN123sNKqp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyqeTPYZCpIQjsRPxh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxLxJtksDafJuNcLol4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLugHlzM-UyPrzwxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwU_AcGOkVlpXHsr014AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}]
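The raw response above is a JSON array with one object per comment, carrying the same four dimensions the Coding Result table displays (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and indexed to support "look up by comment ID" (the `lookup` helper is illustrative, not part of the tool; the two sample rows are copied from the response above):

```python
import json

# Two rows copied from the raw LLM response above; in practice `raw`
# would be the full model output string.
raw = '''[
{"id":"ytc_Ugxwp7VugAwzIR2XeMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzywSBDetSAodsCwRl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index the coded rows by comment ID so a single comment can be inspected.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the four coded dimensions for one comment ID (hypothetical helper)."""
    return codes[comment_id]

print(lookup("ytc_UgzywSBDetSAodsCwRl4AaABAg")["policy"])  # regulate
```

Note that the lookup only works if the response parses as valid JSON, which is why a truncated or malformed array (e.g. a stray closing delimiter) would surface as a parse error rather than a bad code.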