Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
No one ever talks about it after listening to so many meetings that if you displ…
ytc_UgwzuhDVk…
This chick in the year 35 is absolutely loaded, as if we'd have AI for free. We…
ytc_UgzTlZ2Zo…
Lovable now does frontend and backend, with fully wired up db. Only a matter of …
ytr_UgzJ6zhrX…
US is spending $600B in direct aid to people which is roughly $65B if adjusted t…
rdc_fn5pzk1
Matrix seems more likely, but I predict a robot will not force us into the pod. …
ytr_UgxTmqMPj…
They used "AI" to replace what it actually is (algorithms) and scare people. N…
rdc_jkfy6pm
I had not seen [Samantha.ai](https://Samantha.ai) before and it is mind-blowing …
rdc_ji5jzeu
Straight using it. Yes. But using it to assist your art is way different. Exampl…
ytr_UgxbIoyG8…
Comment
I do believe that a complex enough robot would deserve rights. If you think about it, our whole body including our mind is basically mechanical, we are basically "robots" ourselves. A complex robot would be just as deserving of rights as ourselves.
youtube
AI Moral Status
2020-11-19T18:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx-nFWyiUy9ouWq90l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxxFgWyTVsMvWaYufB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy8MhKdh7yhSELlMgV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwaycaKcv0pFQz5JVl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZGfM2kCpfpzs2t3J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhXbB9hO43-9wA0ed4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaDRqykz1UJgDXCqV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwZHW5NyvTqzkFlbPR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1aJjcOly2eQPKXat4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyxmnkfgEH85wywmol4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
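The batch response above can be consumed programmatically. Below is a minimal sketch of a parser that turns one raw response into a lookup of comment ID to codes, validating each dimension against the values seen in this sample. The `ALLOWED` vocabulary is inferred only from the records shown here; the actual codebook may contain more categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (assumption: the real codebook may be larger).
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "regulate", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes}.

    Records with a missing id, an unknown dimension, or an
    out-of-vocabulary value are dropped rather than stored.
    """
    out = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        codes = {k: v for k, v in rec.items() if k != "id"}
        valid = all(v in ALLOWED.get(k, set()) for k, v in codes.items())
        if cid and codes and valid:
            out[cid] = codes
    return out

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_Ugx-nFWyiUy9ouWq90l4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"deontological",'
       '"policy":"none","emotion":"approval"}]')
codes = parse_batch(raw)
```

Dropping malformed records (instead of raising) keeps one bad model output from discarding the rest of the batch, which matters when a single response codes ten comments at once.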