Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Cool bro. Why don't you try letting chat gtp 3,4 and bard carryout a conversatio…" (ytc_UgwGgVg3U…)
- "Best way to go is get a trade and not have to worry about no AI…" (ytc_UgwNeMurd…)
- "We thought Capitalism was just going to eat itself. Turns out it's going to eat …" (ytc_UgxmsAMgX…)
- "I feel like Tech Company Execs you are so big on AI are just betraying their own…" (ytc_UgzeMGSZ-…)
- "I don’t understand—Citigroup fired 20,000 people in the U.S., citing all sorts o…" (ytc_UgxpDydSy…)
- "Would rather be manipulated by somthing intelligent then the idiot controlling o…" (ytc_Ugw8HU8Kw…)
- "I'm totally for it. Imagine if all humans died off they would have a chance to k…" (ytc_UggUkRCpz…)
- "That's a clever take! It’s interesting how the dialogue touches on wisdom and le…" (ytr_Ugz9E9lre…)
Comment

> on this topic i believe once robots/AI achieve all 7 conditions for life they are machines once they fulfill the first 7 than they must evolve to sentience such a term that should be clearly define ASAP and that any sentient organism or form of life synthetic or organic should have equal rights or at least the four pillars are freedom granted to it

youtube · AI Moral Status · 2017-02-23T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgibKKnw0qnP8HgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggiLxFpt8eSvHgCoAEC","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiH_BILS3yl_HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ughk9klhegKuJXgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugh9YkkFUkp7lXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgheAkP5X8Gq5ngCoAEC","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UginAgDYmWof_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgijBDV5-iAE7HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiUsSTwzN6Bl3gCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg_9SJSZWuIo3gCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
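A raw batch response like the one above still needs to be parsed and sanity-checked before the codes are stored. The sketch below is one minimal way to do that in Python; the field names come from the JSON shown, but the allowed-value sets are only the values *observed* in this sample — the project's full codebooks presumably contain more, and the `parse_batch` helper itself is a hypothetical name, not part of the pipeline shown.

```python
import json

# Values seen in the sample response above. Assumption: the real
# codebook for each dimension is a superset of these sets.
OBSERVED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "mixed", "approval", "fear", "indifference", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and flag any record whose
    value falls outside the observed codebook, rather than silently
    accepting a malformed code."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                print(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with a one-record batch (hypothetical id):
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"virtue","policy":"regulate","emotion":"approval"}]')
codes = parse_batch(raw)
```

Flagging out-of-codebook values instead of dropping the record keeps the batch auditable: a single hallucinated label is visible in the log while the rest of the batch still loads.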