Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:
- "Its felt linear, is the most insane fucking thing I have ever heard. What. are. …" (ytc_UgztM6sNO…)
- "We will be the ones who kill ourself because we wanted AI and I have read that A…" (ytc_UgyBUdiYP…)
- "We need more people like you julia ❤ this ai is destroying the real traditional …" (ytc_UgybWMV3n…)
- "Good im a chef, AI will need a hell a lot of time to learn make brilliant food…" (ytc_UgxDfMTul…)
- "This new addition of AI Reality Check is awesome! Please keep them coming. I am …" (ytc_UgyJI377J…)
- "@LL-vg2kd there is a huge difference between getting seen by people in real lif…" (ytr_UgxqBt-1x…)
- "just 50 million taxi drivers , now count all rest of jobs that it should delete …" (ytc_UgxDBnmsB…)
- "They are not joking about AI taking over jobs. It's interesting that back in the…" (ytc_UgzI4wFXx…)
Comment
It occurred to me that the ability to feel pain/pleasure is what made us what we are today. Without those and with only the basic drive to survive there's no reason to evolve as a species. If you don't feel pain and you wouldn't care if you died, then why change? Humanity would never evolved to where we are and would have gone extinct a long time ago. If we applied the same logic to robots, they would never develop consciousness and would never get to the point where they would demand rights or want them. So I think giving robots the ability to feel pain/pleasure is necessary for robot societies to form, otherwise they'll always depend on humans for survival.
youtube · AI Moral Status · 2017-10-19T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
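
A coded record of this shape is easy to represent in code. The sketch below is a hypothetical Python structure mirroring the table above; the class and field names are ours, and the value lists in the comments include only labels visible on this page, not the project's full codebook.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record type mirroring the "Coding Result" table above.
# Only labels observed on this page are listed; the real codebook may differ.
@dataclass
class CodedComment:
    comment_id: str       # e.g. "ytc_..." (top-level comment) or "ytr_..." (reply)
    responsibility: str   # "none", "ai_itself", "developer", "government", ...
    reasoning: str        # "consequentialist", "deontological", "mixed", "unclear", ...
    policy: str           # "none", "regulate", "liability", "industry_self", ...
    emotion: str          # "approval", "indifference", "fear", "mixed", ...
    coded_at: datetime    # timestamp when the coding result was stored
```

A flat record like this keeps the model's codes joinable back to the source comment by ID, which is what the lookup on this page relies on.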
Raw LLM Response
[
{"id":"ytc_UgzPEGu4HGHNUfKVL5p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXDHgqGs3BAdW7QV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWM3z1SFDAhvfggJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRnz8y6arWUxgk3pV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyRqu7OqGqzkVLvCP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAL9THQl5YGNvKej94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwxHLorRKIR9x98dfV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz3SDO2ms_3YSL_DbJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzOkWxCifF6GgfExbt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwzWE98yeVe5AJptm54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
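
The raw response is a JSON array with one object per comment in the batch, keyed by comment ID. A minimal sketch of pulling a single comment's codes out of such a response is shown below, assuming the array has been saved to disk; the function name and file path are hypothetical, not part of the project's code.

```python
import json
from typing import Optional

def find_coded_entry(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM batch response (a JSON array of per-comment codes)
    and return the object whose "id" field matches comment_id, if any."""
    entries = json.loads(raw_response)
    return next((e for e in entries if e.get("id") == comment_id), None)

# Usage, assuming the array above was saved to a file (path is hypothetical):
with open("raw_llm_responses/batch_001.json", encoding="utf-8") as f:
    raw = f.read()

# The ID below is the first one in the batch shown above.
entry = find_coded_entry(raw, "ytc_UgzPEGu4HGHNUfKVL5p4AaABAg")
if entry is not None:
    print(entry["responsibility"], entry["reasoning"], entry["policy"], entry["emotion"])
```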