Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples below.

Random samples:

- "Why is Auto pilot even allowed, it does not make up for human control. Cameras i…" (ytc_UgyJeGOCu…)
- "So we know that its gonna give problems and gonna take us over in the future (…" (ytc_UgxYt1u8N…)
- "It's insanely polluting and uses the works of actual artists to teach the algori…" (ytr_Ugzo0U-dE…)
- "Its just saying things that its scraped off the internet as a whole. Which is a…" (ytr_UgwuqX3tn…)
- "Whose ethics do we embed in an AI? Christian, Jewish, Buddhist, Muslim, etc. Ho…" (ytc_Ugx2T8cpH…)
- "I’m in Australia, as a young woman in my early twenties with a slightly unconven…" (ytc_UgwCxM7OT…)
- "Whatever happens, do yourselves a favor and follow multiple trusted platforms an…" (ytc_UgwgKpqOF…)
- "It’s funny how the guest never said any of the bad things Elon musk does or has …" (ytc_UgzZ1SlhH…)
Comment

> AI will not be available to everyone and if you want it, you will have to pay for it. Therefore the most advanced AI's will be the most expensive and only affordable by the super rich... and being able to use it will make them even richer. Since the advanced AI's will always be smarter than the basic AI's, then the poor folks, even with the basic AI, won't stand a chance.

Source: youtube · AI Moral Status · 2025-08-02T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwrltEGWaKpaOPZqKN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxhFtGIEcLMXpR6AiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwAV0K2jQQP8WWzCsZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyj7aiPsZKx8jgx-nR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyA6AyHRqEiXEhW3oV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyNDFQRp0aoxBKA2wh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxPoXfaldCy0ctCBHt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw4OY83m-fdhN8CYdp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzSJt5lF0J4X-KmfPx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw8acS3PX-Fr3DcdB94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
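A batch response like the one above can be parsed and indexed by comment ID with a few lines of Python. The sketch below is a minimal illustration, not the tool's actual implementation: the allowed label vocabularies are assumptions inferred from the values visible on this page (the real coding scheme may define additional categories), and `index_batch` is a hypothetical helper name.

```python
import json

# Assumed label vocabularies, inferred from values seen in this page's
# examples — the real coding scheme may include more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "unclear"},
}

def index_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    rejecting any record with an unknown label value."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"resignation"}]')
coded = index_batch(raw)
print(coded["ytc_example"]["policy"])  # → regulate
```

Validating against a fixed vocabulary at parse time catches the common failure mode where the model drifts from the prompt's label set, before a bad label reaches the coded table.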