Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgxvpBSM5…`: I'm just saying, the argument that "they are not programmed to feel pain" is bog…
- `ytc_Ugy6uqmJy…`: Your silly anti-ai argument crushes as soon as someone would not tell you that i…
- `ytc_UgwMjB46P…`: This is 100% reality with the godfather of AI who has predicted the future if we…
- `ytc_UgwIO3EMZ…`: So, AI isn't quite there yet, but the reality is that AI very much can do the jo…
- `ytc_UgyvYB4Mz…`: We just gave AI drones the ability to decide wether or not to kill us. We’re coo…
- `ytc_UgwsD8aIW…`: My goal is to keep the arts AI free in NYC area. I believe we can switch the wor…
- `ytc_UgxQiQ9Eh…`: Can someone explain this to me? I’m not fully caught up on the ai art thing…
- `rdc_llbv92r`: > 11 - Most of what's "factory farming" is a term made up by people who oppos…
Comment

> This is an important issue He’s right-they will be way smarter than us, but they very probably will lack emotional elements that make us human so it is likely that they will act against our will, against our humanity.
> I think AI is a cool idea, but a dangerous one. We could easily end up serving them.

Source: youtube · AI Jobs · 2025-03-25T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwzbDtPgLze7Q2pJkR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxssQJWkklC6iufQ914AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxbfvaj9W07EbOjhpN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyL3EEfAZADzhpCL3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzuvjIN6Wchfj9PSRl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzMU7jUTh95RU7PtpR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxKrayjrSF61izlJq54AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxaNTvh9OcEiVK1dit4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw1a6Q71eVL_3vqnrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwW8YNI-iYn03aju2t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
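A raw response like the one above can be parsed, validated, and indexed for the by-ID lookup this view supports. The sketch below is a minimal illustration, assuming the allowed values for each dimension match the ones visible in the samples here (the project's actual codebook may include more categories, and `parse_coded_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred only from
# the values visible in this page's samples, not the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded records)
    and return a dict keyed by comment ID, rejecting bad labels."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Look up one comment's codes by ID, as the UI does.
raw = ('[{"id":"ytc_UgwzbDtPgLze7Q2pJkR4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
coded = parse_coded_batch(raw)
print(coded["ytc_UgwzbDtPgLze7Q2pJkR4AaABAg"]["emotion"])  # fear
```

Validating at parse time keeps malformed model output from silently entering the coded dataset.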