Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgyE-Pl-G…`: “This is terrifying. I’m 81, used CHATgpt about 3 times. I. Read all the time a…”
- `ytc_UgwZHW5Ny…`: “This topic is touched in the novel turned into a movie “The Bicentennial Man”, w…”
- `ytr_UgzBecmyq…`: “@lamsmiley1944 Every single time I see an AI generated image, even before scruti…”
- `ytc_UgyXcXMLY…`: “I think we need to illegalize a.i movies, this is getting out of hands 💀😭…”
- `ytc_UgwmdbRal…`: “I’ve always held the opinion that if companies and governments wanted to issue s…”
- `ytr_Ugxgt2n9B…`: “Glad you enjoyed it! Sophia definitely has a unique way of sharing her thoughts.…”
- `ytc_UgwOA84Za…`: “FRANKLY GLOBAL INHABITANTS HAVE BEEN MAKING HORRIBLE DECISIONS FROM THE BEGI…”
- `ytr_UgzyRsQEk…`: “No one truly can predict when automation will completely take over trucking. I …”
Comment

> well if it asks it need rights then give them, if not then dont. try to make ai programs only for the benefit of mankind if possible and make them dont care if they “die“ or not, make protect and benefit mankind as first objective in their command line, if they could over ride this line it means they could understand themselevs and we should give them rights.

youtube · AI Moral Status · 2017-02-23T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugi9oKcY5syPlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugipm9QoHHtAZngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjCG-Y5si0xkXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggE2jjroha-C3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgipWevt7j_kCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghGeSiPL9jIjngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugg4wUNlmwDRengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiBkJI_0TVCGHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UggfHiAyN5W04HgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgjouNuW5UDvnHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
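The lookup-by-ID workflow above can be sketched as follows. This is a minimal illustration, not the page's actual implementation: it assumes the raw response is a JSON array of records keyed by `id`, and the per-dimension value sets are inferred only from the examples shown here, so the real codebook may contain more categories.

```python
import json

# A truncated raw batch response, copied from the examples above.
RAW_RESPONSE = """
[
  {"id":"ytc_UggfHiAyN5W04HgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugi9oKcY5syPlngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

# Allowed values per dimension, inferred from this page's samples
# (assumption: the full coding scheme may define additional values).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw batch response and index the records by comment ID,
    rejecting any value outside the expected coding scheme."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

codes = parse_codes(RAW_RESPONSE)
# Looking up the coded comment shown in the detail view above:
print(codes["ytc_UggfHiAyN5W04HgCoAEC"]["policy"])  # regulate
```

Validating against the allowed value sets at parse time catches the common failure mode where the model invents a label outside the codebook, rather than letting it propagate silently into the coded dataset.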