Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Sam Altman is a false prophet. He’s hyping up AI to the moon (because he has no …” (ytc_UgxD76VUt…)
- “And popularize boosting social safety net programs and things like UBI while dec…” (ytr_Ugxd_J1-q…)
- “i think AI is good if used correctly! like i also use ChatGPT to research becaus…” (ytc_UgzFHowOX…)
- “I enjoy AI art and like it. I think it depends on the person. I just dislike the…” (ytc_UgzqGEB9D…)
- “The fact that AI can generate child porn is scary. It shows how much of it there…” (ytc_UgxzTTdL9…)
- “I should give up drawing and use ai instead my drawings suck i will use ai inste…” (ytc_Ugx9t1tjF…)
- “@ghoftamri To say that AI promoting requires no skill is to discredit the work …” (ytr_Ugxc3RXP_…)
- “you think there will be A.I. Customers that won't yell at me behind my register …” (ytc_UgxIrI4vb…)
Comment
You know, people are afraid that robotic entities may one day directly state that they want to destroy humanity, but in all reality, these robots are consciously able to do far less than humans, as humans have destroyed each other for a long time, and still are. In fact, a robot population would perhaps be better then a human population (not hinting the mass eradication of humans) based on the fact that they would have less moral capacity and would have to rely more on knowledge then belief. This is true, as many groups destroy humans based on beliefs of assumption, such as superiority. The basic outline is that we should not fear these robots unless we give them the same ability to fear at a level like us.
youtube
AI Moral Status
2017-04-20T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgiH29RQhVyYo3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjRAdI8CBX503gCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugg3apYuxuw7WHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggFvKK1w8GaCngCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggPDObvrBwGQ3gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgggzjmFyMBpxngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghYctB0_3R8aXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghUCAhI_rysgHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggtE_QTcYfjL3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugj_mgcN0FnABHgCoAEC","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"}]
```
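A raw response like the one above can be turned into the per-comment coding table shown earlier by indexing the JSON records by comment ID. Below is a minimal sketch, assuming only the field names visible in this section (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `index_codes` and the default of `"unclear"` for missing fields are illustrative assumptions, not part of the actual pipeline.

```python
import json

# Two records copied from the raw response above; the structure is the
# same, only the sample is shortened for illustration.
raw = '''[
 {"id": "ytc_UggFvKK1w8GaCngCoAEC", "responsibility": "user",
  "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
 {"id": "ytc_UggPDObvrBwGQ3gCoAEC", "responsibility": "none",
  "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]'''

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Map comment ID -> coded dimensions, defaulting missing keys to 'unclear'."""
    records = json.loads(raw_json)
    return {
        r["id"]: {dim: r.get(dim, "unclear") for dim in DIMENSIONS}
        for r in records
    }

codes = index_codes(raw)
print(codes["ytc_UggFvKK1w8GaCngCoAEC"]["emotion"])  # outrage
```

The `"unclear"` default mirrors how the example table above renders dimensions the model did not code for a given comment.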