Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples:

- “I think you accidentally created a sentient AI.” / “No, we have a policy against… (ytc_Ugwf0oGWc…)
- Call a company and get Ai they are terrible. I find it difficult that the transi… (ytc_Ugwn80CGD…)
- We all thought AI was gonna do the hard stuff and leave us the easy stuff. lol A… (ytc_UgzXWszmq…)
- Crime in NYC and other cities is outrageous. So if facial recognition can help c… (ytc_UgwDMkIAk…)
- "Self-driving cars will fix traffic" / Self-driving cars: / Meanwhile transit: I've … (ytc_UgyPW6QdP…)
- The more dangerous than a ai is someone upload human intelligence, like if someo… (ytc_UgzZKfi6V…)
- I find it incredible that these "AI" plagiarists love to evoke "fair use" as if … (ytc_UgxYQVjZG…)
- In order to possess consciousness you must also possess biological organs, such … (ytc_Ugy4HIxGb…)
Comment
I think the main thing to remember about current popular LLMs is that they are designed to be a chatbot that could convince someone they are talking to a human and not a chatbot. Nothing they say has to be right, it just has to be phrased in a human sounding way. Everything else being equal, that will mean it will give you a consensus of current human thinking, which is as close to right as you can hope for, but it's still just thinking what word is most likely to follow the last it said.
youtube · AI Moral Status · 2025-04-04T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
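Each coded dimension appears to take values from a small controlled vocabulary. A minimal validation sketch in Python, where the allowed value sets are inferred solely from the records shown on this page (the real codebook may include values not seen here):

```python
# Allowed values inferred from the displayed records; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"outrage", "mixed", "fear", "indifference", "approval"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the known vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown above passes validation:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(invalid_fields(record))  # → []
```

A check like this is useful for catching model outputs that drift outside the coding scheme before they are stored.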
Raw LLM Response
```json
[
  {"id":"ytc_Ugyyn7Lw3xgP1JTkEVh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwLu-IDGb4kKG4ExSN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy1VE59Hfc21vpqZ5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxJI8COm9kpXFuSwSZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzSdFmX1rppmuZ_vlh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxEmijg1j2kfE1HPdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxvTM0IQHuJ_wj3DRN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgysKtHak6K_7jkWyCd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSk_LObwaPVXc-Vrp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz01MXElYu06CgFUAh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}
]
```
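A raw response in this shape can be indexed by comment ID for lookup. A minimal sketch in Python, assuming the model output has been captured as a string (the `RAW_RESPONSE` name and the two embedded records are illustrative, copied from the response above):

```python
import json

# Illustrative excerpt of a raw LLM response (two records from the array above).
RAW_RESPONSE = """
[
  {"id": "ytc_Ugyyn7Lw3xgP1JTkEVh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwLu-IDGb4kKG4ExSN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw response and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_Ugyyn7Lw3xgP1JTkEVh4AaABAg"]["emotion"])  # → outrage
```

Keying by ID makes the "look up by comment ID" operation a single dictionary access rather than a scan over the array.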