Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- This is way beyond wrong! Enough no more! People need a safe place to disclose. … (ytc_UgyqFUKHV…)
- The fact that "AI" and "hallucinating" are used in the same sentence. Nah shut t… (ytc_UgyF5vsXH…)
- Elon Musk is not an expert in AI is like saying the the frycook manager at Mcdon… (ytc_UgytUudi1…)
- I just started on AI coding, and I can say it is the greatest time to be a softw… (ytc_UgzuSWKOR…)
- AI and Robots will be cheaper and consumer will use the cheapest option 9 times … (ytr_UgxzNGNxB…)
- I’m all for AI writing new material. I haven’t watch a new movie or TV series i… (ytc_UgzMoUbzK…)
- So basically OpenAI's response is you can't make an omelette without breaking a … (ytc_UgyMaFP9k…)
- Teacher : today go home and ask from AI about humans 2 times for human pictures… (ytc_Ugwg9Qd6y…)
Comment
My opinion on this is yes! Of course anything with sentience deserves to have rights akin to any human.
...And that's why we should be very careful to NOT make sentient machines. At least not to work.
It's one thing to make a robot to be your "child" with the knowledge and intent for it to be its own person.
But if it's designed to work, it shouldn't be sentient because that is a civil rights violation of egregious porportions.
youtube · AI Moral Status · 2021-05-12T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
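The "look up by comment ID" flow above amounts to indexing coded records by their `id` field and reading off the four dimensions shown in the table. A minimal sketch of that lookup, assuming records shaped like the coding result (the IDs, field names, and helper are illustrative, not the tool's actual API):

```python
# Index coded records by comment ID for O(1) lookup. Record fields mirror
# the dimensions in the coding-result table above; the IDs are made up.
coded = [
    {"id": "ytc_example1", "responsibility": "developer",
     "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
    {"id": "ytc_example2", "responsibility": "none",
     "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
]

# Build the index once, then each lookup is a single dict access.
index = {rec["id"]: rec for rec in coded}

def lookup(comment_id):
    """Return the coded record for a comment ID, or None if it was not coded."""
    return index.get(comment_id)

print(lookup("ytc_example1")["emotion"])  # outrage
```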
Raw LLM Response
```json
[
{"id":"ytc_UgyiVhpUHGj5CYLtzV94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7-bD7KCzyMK7Gpx94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTD5DYraQZCgVCYQF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHzHLLEavbTt1nvCt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMuF1HAVmiLZL9gYt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxMyhgRzKLITx-vtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwLE_BLxnWOgFNixA54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyCVuy9G2lUE0WQcR54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1sk_7O_rte70vZo94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyKM63xrI94XZJQKvV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
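A batch response like the one above can be parsed with a standard JSON parser and validated against the coding scheme before the records are stored. A minimal sketch, where the allowed value sets are inferred from the responses shown here and may not be the full scheme:

```python
import json

# Allowed values per dimension, inferred from the responses displayed above;
# the actual coding scheme may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

# Illustrative raw model output (same shape as the response above).
raw = '''[
  {"id": "ytc_A", "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_B", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "mixed"}
]'''

def validate(records):
    """Split records into (valid, errors).

    A record is valid when every dimension holds an allowed value and an
    "id" is present; errors pair each bad record's id with its bad fields.
    """
    valid, errors = [], []
    for rec in records:
        bad = [d for d, allowed in ALLOWED.items() if rec.get(d) not in allowed]
        if "id" not in rec:
            bad.append("id")
        if bad:
            errors.append((rec.get("id"), bad))
        else:
            valid.append(rec)
    return valid, errors

valid, errors = validate(json.loads(raw))
print(len(valid), len(errors))  # 2 0
```

Validating before storage catches the common failure modes of LLM-coded batches: a hallucinated category label, a missing dimension, or a record with no ID to join back to the source comment.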