Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment directly by its ID (a lookup sketch follows the samples), or pick one of the random samples below to inspect.
- "To make matters worse for the driverless taxi company, this is all on film becau…" (`ytc_UgxEmmBDA…`)
- "@Contrailed huh? I have had FSD for a while and have seen the changes. The nvid…" (`ytr_UgwaI2m4c…`)
- "Lol! There will never be any "Skynet". If anything AI will take pity on us pathe…" (`ytr_Ugwli3lZi…`)
- "Just dropping this here, not sure if anyone else said this. Studio Ghibli has al…" (`ytc_UgzV5sdAZ…`)
- "Problem with her is she thinks she can mitigate CLIMATE CHANGE. She would have t…" (`ytc_Ugz9uPS9B…`)
- "God do I hate the idea of AI killing everyone when it can barely function for wh…" (`ytc_UgyjfAyNF…`)
- "At five minutes, it is hilarious to see Tucker absorb that idea that AI could al…" (`ytc_UgyDkg4aY…`)
- "There is no such thing as Ai. A machine cannot evolve past it's human programmin…" (`ytc_Ugy622ISm…`)
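A lookup like this only needs a keyed store of coded comments. The sketch below is a minimal illustration, assuming a hypothetical SQLite table `coded_comments` with columns `comment_id`, `comment_text`, the four coding dimensions, and `raw_response`; none of these names come from the page itself.

```python
# Minimal lookup sketch. The database path, table name, and column
# names are assumptions for illustration; the page does not show the
# actual backing store.
import sqlite3

def lookup_comment(db_path: str, comment_id: str) -> dict | None:
    """Fetch one coded comment by its full ID (e.g. 'ytc_Ugx...')."""
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # rows behave like dicts
    try:
        row = conn.execute(
            "SELECT comment_id, comment_text, responsibility, reasoning, "
            "policy, emotion, raw_response "
            "FROM coded_comments WHERE comment_id = ?",
            (comment_id,),
        ).fetchone()
        return dict(row) if row else None
    finally:
        conn.close()
```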
Comment
I’d just like to throw out a little reminder, AI is still not a “Mind” or anything remotely similar to organic / carbon based beings such as us, while it is a good analogy. It is simply a system of layers of complex algorithms that chooses outputs based on recognized patterns in training data, and not an “alien mind” in the biological sense, nor could it rival a human mind in the near future due to the lack of neural chemical activity, and simplicity of its ANN by comparison. Now, that is not saying that AI cannot bring about the end of humanity, just that currently, it simply lacks the depth or complexity to do so currently. Great video btw.
Platform: youtube
Video: AI Moral Status
Posted: 2026-01-31T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
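Each coded dimension draws from a small label set. The validator below is a sketch, with allowed values enumerated only from the labels visible on this page; the real codebook may define more.

```python
# Label sets inferred from the codes visible on this page; the actual
# codebook may include additional values.
ALLOWED_LABELS = {
    "responsibility": {"none", "ai_itself", "developer", "company",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def invalid_dimensions(entry: dict) -> list[str]:
    """Return the dimension names whose coded value is out of range."""
    return [dim for dim, labels in ALLOWED_LABELS.items()
            if entry.get(dim) not in labels]
```

Running this over the raw response below would flag any entry the model coded with an unexpected label.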
Raw LLM Response
[{"id":"ytc_UgxE5O_6IPYeLiKzglN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwpoSnEXR6W1SDyLvl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxlYGZ7EraHkhkGAAZ4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgygYISJbpBWVYyGRVt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyi9xQZ9jvC5uwC9qx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmQRKeKDjKB7_TVFV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxjEqsug_i7fLNH8Td4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzki8yUHirHgts5jQt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwz3jKaa5FhkrhQg4R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwSbwzClGucP85hD5l4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}]