Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any comment by its ID, or browse the random samples below:
| Comment ID | Preview |
|---|---|
| `ytc_Ugy4QjdLW…` | Me: “Once I saw a man get run over but he kept walking in injured “ Person: “… |
| `ytc_UgxH0TMXq…` | After fully watching this man, I would not consider him a man of balanced views … |
| `ytr_UgxFRWAVX…` | @freerangesimp People are free to give away whatever they want, but professional… |
| `ytc_UgzITUlVO…` | Humanity programming an AI: *Programmed and given data of racism* Also Humanity:… |
| `ytc_UgwLDeK67…` | One thing I don't see many people talk about is this. This AI situation is givin… |
| `ytc_Ugx0Ux8E4…` | First off, I love your videos, man. It's nice to hear grounded, rational views o… |
| `ytc_UgyhzcyLC…` | Imagine thinking AI won't be used by humans for nefarious purposes lol. This wil… |
| `ytc_UgyjFeDGv…` | @SundayCoolTees could you guys tell me how you had a conversation like this wit… |
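Given one of the IDs above, a lookup is just a scan over the coded records. A minimal sketch, assuming the coded output is stored as a JSON array of records shaped like the Raw LLM Response at the bottom of this page; the file name `coded_comments.json` is hypothetical:

```python
import json

# Hypothetical file name; assumes the coded output is stored as a JSON
# array of objects like the "Raw LLM Response" shown below.
CODED_PATH = "coded_comments.json"

def lookup_comment(comment_id: str, path: str = CODED_PATH) -> dict | None:
    """Return the coded record for a comment ID, or None if absent."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    # IDs carry a platform prefix ("ytc_" and "ytr_" above, presumably
    # YouTube comments and replies).
    return next((r for r in records if r["id"] == comment_id), None)

if __name__ == "__main__":
    print(lookup_comment("ytc_Ugz15PCUn7LidJCaI_F4AaABAg"))
```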
Comment

> The heck, no they don't deserve rights. AI is supposed to be subservient slaves..They should be concious enough to do their tasks, nothing more, they don't need to feel emotions to achieve simple functions, so keep their tasks simple. Avoid broad terms like, "protect someone","save my father from a heart attack", or even "get me to work".
>
> I think the much of the danger there would be how decisions are made from those broad tasks, "get me to work" can entail different things. AI can either chop you up into tiny pieces because it thought it would be more efficient or just give you a piggy back ride.

Platform: youtube · Video: AI Moral Status · Posted: 2021-05-13T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
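Each of the four dimensions takes a value from a small closed codebook. A minimal validation sketch in Python; the allowed value sets below are only those observed in the raw responses on this page, so the real codebook may be larger:

```python
from dataclasses import dataclass

# Value sets observed in this page's raw responses; the full
# codebook may define additional categories.
RESPONSIBILITY = {"developer", "company", "user", "ai_itself", "none"}
REASONING = {"deontological", "consequentialist", "mixed", "unclear"}
POLICY = {"ban", "regulate", "liability", "industry_self", "none", "unclear"}
EMOTION = {"outrage", "fear", "approval", "indifference", "mixed"}

@dataclass
class CodingResult:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Reject any value outside the observed codebook.
        for field, allowed in (
            ("responsibility", RESPONSIBILITY),
            ("reasoning", REASONING),
            ("policy", POLICY),
            ("emotion", EMOTION),
        ):
            value = getattr(self, field)
            if value not in allowed:
                raise ValueError(f"{field}={value!r} not in codebook")
```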
Raw LLM Response
[
{"id":"ytc_UgzX9ksonajxCVIgIMp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzkbfBM9OKYYe9LxDF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxrex7upSRH4rXegYJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugxjrqoc6lfhc-GHTz14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQaxjMdw3DwHAmVCN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvvX2H0fcrb82a7Sx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyOnSyKdjQUZ04OZIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugz15PCUn7LidJCaI_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0fNPuglmAB4K2WiZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwfoo78Xt1MKetbW5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
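Model output is not guaranteed to arrive as clean JSON, so it helps to strip any markdown fences and check for required keys before accepting a batch. A hypothetical parsing helper, assuming responses look like the array above:

```python
import json
import re

REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse one raw model response into a list of coded records.

    Hypothetical helper: assumes the model returns a JSON array like
    the one above, possibly wrapped in ```json fences.
    """
    # Drop leading/trailing code fences if the model added them.
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw.strip())
    records = json.loads(cleaned)
    for record in records:
        missing = REQUIRED - record.keys()
        if missing:
            raise ValueError(f"record {record.get('id')} missing {missing}")
    return records
```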