Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
As someone who's worked in data science and AI, and is researcher in the same, …
ytc_UgxI_egjK…
Ai artists dont exist unless they use ai for inspiration but using ai To pretend…
ytc_UgwX7wJeb…
There is NO COULD take over. It WILL take over and NOT just low class jobs but A…
ytc_UgziV_Zkp…
The real winners in the AI race will be the companies that stay the hell away fr…
ytc_Ugz6_JlFl…
I feel like non-artists overestimate how much artists influence each other to a …
ytr_UgxBe0Osw…
Well I guess it's time to start learning how to decode things. You're going to h…
ytc_Ugxc9KRyY…
There's disadvantage of AI cloud too. If a robot learns some wrong information o…
ytc_UgwqCRPAf…
Comparing traditional art and digital is like a stove and a oven. Different meth…
ytc_Ugw_fzgJ3…
Comment
lol you would never give your toaster that lvl of ai. you give the robot the lvl of ai it needs. now we could make an ai powerful enough to have a consciousness. but why would you put emotions in a machine? it is truly stupid........
you wouldn't need to torture an ai robot. you could just not make it powerfully enough to feel suffering and pain.
I've been thinking this for a while. and the answer is do not make general aims that intelligent. I mean Watson is insanely powerfully with the ability to diagnose diseas. and many other more things. but it doesn't need emotions.......... that would be bloody stupid.
making an ai with emotions isn't a bad thing. but if you do it should have rights. but the question is what is the use of a robot with emotions?
youtube
AI Moral Status
2017-02-23T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugju7aEfGYlMBXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugjeziu4V1EknXgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiLWOcRt89jfHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Uggqw-SfwBxqHngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UghD-anJqaf-jngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugg2MtUBRNtZ9ngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UghG18WWY7H_q3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggNgQ_Hy9w9vngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgjaMZKvJE3S4XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugi411ebTWTvlXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
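The raw LLM response above is a JSON array with one object per comment ID, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and indexed for the "look up by comment ID" view follows; the structure is inferred from the sample response shown above, and the two rows used here are copied from it.

```python
import json

# Example raw LLM response: a JSON array of coded comments.
# (Rows copied from the sample response above; the full batch has ten.)
raw_response = """
[
  {"id": "ytc_UggNgQ_Hy9w9vngCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugg2MtUBRNtZ9ngCoAEC", "responsibility": "developer",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"}
]
"""

# Index the batch by comment ID so any single coded comment can be
# retrieved directly, as in the dashboard's lookup box.
coded = {row["id"]: row for row in json.loads(raw_response)}

result = coded["ytc_UggNgQ_Hy9w9vngCoAEC"]
print(result["policy"])   # industry_self
print(result["emotion"])  # mixed
```

Indexing into a dict keyed by ID makes each lookup O(1), which matters when the same batch response is inspected repeatedly across many comments.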