Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- Can you imagine commissioning an artist and calling it "your own artistic work" … (ytc_UgwjnMJhe…)
- I didn't recognise Brett Cooper on the thumbnail...I thought that was one of the… (ytc_Ugz46b7vL…)
- A computer cant not out think people, its is filled with data. Memory is the dri… (ytc_Ugz6csiBl…)
- ALL JOKES ASIDE THIS SHOWS HOW SCARY A.I. CAN BE IT DID THIS IN LESS THAN 4 SECO… (ytc_Ugz77ywlf…)
- \>seniors didn't write assembly either, every generation abstracts further T… (rdc_oi3zpar)
- i just started tabling art markets and events last year, and i mostly draw fanar… (ytc_UgzdtjdqU…)
- A deepfake isn't somebody's face - it's like an incredibly realistic drawing. An… (ytc_Ugzeretus…)
- I can’t believe we even have to have this conversation. Have we not seen Termin… (ytc_UgwNrPKY4…)
Comment

> A robot needs a soul and a mind to actually achieve personhood. The robot may have emotions but a soul is a deeper level of emotions(connections, gratefulness, feeling that deep mental pain, etc..) which is definitely not there in robots. And a mind to interpret and analyze those deep emotions. Some people lack a good soul and have an evil one, they have a mind which is also flipped to the negative, so they don't have personhood. Defining whether a person has personhood should be based on whether this person has a good soul and mind or not.

Platform: youtube
Posted: 2017-03-19T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugil32NJpfgSrHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggEw8ymERY9MngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghL4JoTIN8UsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggywlE0q-mGbHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjhmQJbJHbV0HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiW-41Ld8cOoXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjmo31Jk3kpq3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggGBCDglxquWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiFxDfucP3UrHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjgSplThe2MfngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}]
```
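The raw response is a JSON array with one object per comment, keyed by the same comment ID used for lookup. A minimal sketch of parsing such a response and retrieving one coding by ID (the variable names here are illustrative, not part of any real tool; the single-object array is excerpted from the response above):

```python
import json

# A raw LLM response is a JSON array of coding objects, each carrying the
# comment ID plus the four coded dimensions. This excerpt reproduces one
# object from the response shown above.
raw_response = """
[{"id": "ytc_UghL4JoTIN8UsngCoAEC",
  "responsibility": "none",
  "reasoning": "deontological",
  "policy": "none",
  "emotion": "mixed"}]
"""

# Index the codings by comment ID so any coded comment can be inspected
# directly, as the lookup-by-ID view does.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UghL4JoTIN8UsngCoAEC"]
print(coding["reasoning"])  # -> deontological
print(coding["emotion"])    # -> mixed
```

Because each object carries its own `id`, a batch of codings can be joined back to the original comments without relying on response order.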