Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
A few years later...
Dont call me ma'am, I'm a robot 64bit not 32 64bit robot.…
ytc_UgyOXXyjd…
The problem is not so much that a junior with AI will take senior jobs. But that…
rdc_oae2lf3
This guy makes life coach videos… he probably needs to think about pivoting if h…
ytc_Ugzo2vk_0…
No credible evidence supports a claim that 5 million jobs have been lost worldwi…
ytc_UgxKAcfyn…
I need to get a computer tech to 100% de-Copilot Word on my laptop. I'm terrifie…
ytr_UgyDMSwu2…
We're not even close to superintelligence... These are still just narrow form AI…
ytc_Ugy02rW0r…
Maybe. But there is not even a sigle reason I should USE a chat bot =)…
ytc_Ugx0mlmyn…
Travelled to Europe and back a short time ago.. they are using facial recognitio…
ytc_UgyQaeVA6…
Comment
I met someone with a full on case of AI psychosis out in the world recently. They spoke passionately (if not eloquently) and at considerable length about the utopian framework that they had co-created with their AI interactions.
I tried to gently respond with my own disappointment at the sycophantic responses I had received when prompting the LLMs, with the blatant hallucinations (anyone remember "gasoline spaghetti" or "eating a small rock a day for your health"?), and was enthusiastically assured that it was because *I* was the problem, that all I had to do was instruct it to be accountable and it... just... would?
I don't know how much expertise there is yet in the area of "AI psychosis", but I figured that you, as a science communicator, might be in a good position to collect what information there is on the subject and help us out here on the YouTube with the people in our lives who have fallen down this rabbit hole. Thank you in advance for your consideration.
youtube
AI Moral Status
2025-11-01T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxMKw0T4rpa289-kn94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgykCZgUqAN9gG9zXNp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy07rYdXY7u9mop5ZF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy7XnmfYHVVLvd2fwx4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxBzO_0s-eWyskk2Bt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzU4g21zQAvcSSiCFp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzePv51wDGYo_IR-T94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzeESuPHUZ477RTUdF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwaHmQGQbaemrbBwzV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzeL7i2XBG6vAdy_Gt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
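The raw response above is a JSON array in which each record carries the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed value sets are inferred only from the values that appear in this dump, not from a documented codebook, so treat them as assumptions.

```python
import json

# One record copied from the raw response above; in practice the whole
# array returned by the model would be passed in.
RAW = """[
 {"id":"ytc_UgxMKw0T4rpa289-kn94AaABAg","responsibility":"none",
  "reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]"""

# Assumed value sets, inferred from the values visible in this dump.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company",
                       "ai_itself", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed"},
}


def invalid_ids(records):
    """Return ids of records with a missing or out-of-set coded value."""
    bad = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                bad.append(rec.get("id"))
                break
    return bad


records = json.loads(RAW)
print(invalid_ids(records))  # [] when every record is well-formed
```

A check like this catches the common failure mode of LLM coders drifting off the label set (e.g. emitting "anger" instead of "outrage") before the records are aggregated.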