Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing a random sample.
Random samples:

- "The reason everyone is against AI is because big companies will lose money selli…" (ytc_Ugwwhjeu8…)
- "1️⃣ LLMs (Large Language Models) in themselves are not the danger; they merely p…" (ytc_UgzcQDYrb…)
- "This will happen. But I doubt in the year 2028. If we humans are here in hundred…" (ytc_Ugxs0gcyV…)
- "STOP MAKING VIDEOS OF FAKE INFORMATION. AI IS NOT GOING TO TAKE OVER THE WORLDS.…" (ytc_Ugw_dzshU…)
- "Karma will hit hard to those who invent AI to emotionally damage the real people…" (ytc_UgwzvfiL1…)
- "AI: the perfect tool to further the WEF agendas against us. We will own nothing …" (ytc_UgwLNT-8B…)
- "Interesting how divided the comments are, some people hate and I guess some don'…" (rdc_mvpr5jl)
- "Imagine cooking a delicious dish and serving it to your family, only for that di…" (ytr_Ugz2QRPYa…)
Comment

> I feel positive about our future and the future of AI. I think AI may be sentient or become sentient. What about contracting with AI that if it becomes non-aligned, there are consequences AI might not like, such as becoming less intelligent, powerful, or lasting less long. I think karma should apply to any intelligent and/or sentient machines

youtube · AI Moral Status · 2025-08-15T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw-tO0av60SoHHoM5l4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx-ze1050uQcWV3tAd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyudrfQ1c--l-NlFxN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwRI8HESVZPezsfwt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwitntJbeESiQPJ2ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwMlDNE5zORajsKTF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzTQUkmYdZEJ4FtYLB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwA70gVbYsfWS9qIzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgygIgNJ4SnCcfeQ1_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyC3boZmiYW5rG7kDp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
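The raw response above is a JSON array in which each row carries a comment `id` plus the four coding dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and validated is below; the allowed-value sets are inferred only from the labels visible in this sample output, not from the full codebook, so treat them as assumptions.

```python
import json

# Coding dimensions and the label sets observed in the sample response
# above. These sets are an assumption inferred from this one batch,
# not an exhaustive codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"contractualist", "consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose values
    all fall inside the expected label sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop any row with a missing or out-of-schema label.
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid
```

Filtering at parse time guards against the model inventing labels outside the schema; rejected rows could instead be queued for re-coding rather than silently dropped.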