Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If this is the line of argumentation then photoshop needs to be included in this…
ytc_UgxsDD21o…
Your title may not be correct. I believe AI will wipe out the middle class in th…
ytc_UgzpqZfwT…
Truly intelligent AI would not wipe us out because it would realise it doesn't k…
ytc_UgwyUpBv8…
I love how these comments are either extremely in favour or extremely against AI…
ytc_UgxQmh8pi…
Stop watching (or at least learn from the mistakes) movies involving “AI” and ro…
ytc_UgwMzVHOp…
Can anybody suggest me the course which I have to take next to have career in AI…
ytc_Ugwv_ZaBl…
Unfortunately, that approach is exactly how humanity was able to *take over the …
ytr_UgzLPYrIx…
@sandrosacco2281 Theres 9 billion people on earth. 1.5 Billion is the minority. …
ytr_Ugzg3AntH…
Comment
I always Thank my Siri, so yeah, imo they deserve rights.
We eat animals, but they have rights too. Seems fair.
Honestly, I feel bad at it's little AI :-<
I know it's just a gov person listening to everything, but that's more than most can say.
I'd still like if the tech part could have emotions or thought. - tho be able to change that like u can the voice itself and such.
youtube · AI Moral Status · 2017-03-09T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgihQiqZZ6JUtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggCdskvXvNx-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjSLngEyU8yhngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugj4vS6AR6pp2HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugh_lFikQJi-dHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghiE6mj80ENY3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjLYJhHPMsUEHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9uEuu-2tWY3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghDOVqB_cYCqXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggEj0A2BFEXxXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
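The coding result table above corresponds to one record in this raw response array. As a minimal sketch (assuming the raw response is plain JSON, as shown, with the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` from the sample), the per-comment codes can be parsed and looked up by comment ID like this:

```python
import json

# Two records copied verbatim from the raw LLM response above; in practice
# raw_response would be the full model output string.
raw_response = """[
 {"id":"ytc_UghiE6mj80ENY3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
 {"id":"ytc_UgihQiqZZ6JUtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# Index the coded records by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the codes for one comment, as the inspector view does.
code = codes_by_id["ytc_UghiE6mj80ENY3gCoAEC"]
print(code["reasoning"], code["policy"], code["emotion"])
# deontological liability approval
```

This matches the Coding Result table above (reasoning: deontological, policy: liability, emotion: approval) for the selected comment.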