Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or browse the random samples below.
- `ytc_Ugz0NUtPZ…` — "The scariest part of AI is that I know the drone robots are coming, and some of …"
- `ytc_Ugz7dyPJc…` — "Not today but AI only gets better by 2030 most jobs will be taken. Ai, Agi, Asi …"
- `ytc_UgxpxIXVp…` — "Big rich people who made their whole company AI: Ben: "Hey dan look at my new b…"
- `ytc_Ugx7kYk5t…` — "Disabled artist here too, hEDS, fibromyalgia, heart disease, chronic pain, and m…"
- `ytc_UgzZotQVd…` — "Didn't Microsofts AI become unglued and started having misalignment problems whe…"
- `ytc_Ugx8Nxry-…` — "All of your interactions with AI and anything digital goes towards forming your …"
- `ytr_UgziUCjgX…` — "We appreciate your engagement with the content, but it's important to maintain a…"
- `rdc_dr4d7pe` — "This is the best tl;dr I could make, [original](http://www.scmp.com/tech/enterpr…"
Comment

> Alan Turing thought his test was a tool to test machines for consciousness. Instead I think Alan would be furious at the number of PEOPLE failing his test!
>
> LLMs are OBVIOUSLY not conscious, but we are anthropomorphizing them so badly that we keep discussing them as though AGI might be close and it so CLEARLY isn't! This is like if half of society didn't just see bunny rabbits in the shape of the clouds in the sky, but more like if over half of society expected the bunny shaped cloud to start hopping around at any moment! Listening to corporate leaders and politicians talk about this stuff they just sound completely unhinged.

Source: youtube · Video: "AI Moral Status" · Posted: 2025-10-31T03:3… · ♥ 33
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwUXNN0BH9UGFe3AIR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzRmfkOp6bO0nb9UXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyi3RCOeht4txJNWBB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzM2FPyCXlq3ddCGYd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxD4DvwO2UxlJlS6114AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6Pt_A9K6iBockeqF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzOjPJrQfssCdRpZDd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQk-TwitKTFePsIm54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugypezwk4B0M5UuE24V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyty4n1d7bq8r3t-k14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
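The raw response is a JSON array with one object per comment, each carrying the same four coded dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). Looking up a coding by comment ID amounts to parsing the array and indexing it on the `id` field. A minimal sketch, assuming only the response shape above — the helper name and the shortened stand-in IDs are illustrative, not part of the tool:

```python
import json

# A batch response in the same shape as the raw LLM output above.
# The IDs here are shortened stand-ins, not real comment IDs.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(response_text):
    """Parse a raw LLM batch response and index the codings by comment ID.

    Raises ValueError on malformed model output (not a JSON array, or
    objects with missing/extra keys), so bad responses fail loudly.
    """
    records = json.loads(response_text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of codings")
    index = {}
    for record in records:
        if set(record) != EXPECTED_KEYS:
            raise ValueError(f"unexpected keys: {sorted(record)}")
        index[record["id"]] = {k: v for k, v in record.items() if k != "id"}
    return index

codings = index_by_comment_id(raw_response)
print(codings["ytc_example1"]["emotion"])  # -> outrage
```

Validating the keys up front matters here because the model occasionally drifts from the requested schema; rejecting the whole batch is simpler than silently storing partial codings.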