Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The only reason you buy a chair is because you don't know how to build one, AI k…" (ytc_UgxvztZyt…)
- "The people that own, run, and control this country will continue doing as they p…" (ytc_UgzSfNRd8…)
- "How about a judge coming up with the ruling on a case then put it in a computer …" (ytc_UgzK2Ttkr…)
- "Wait until it zoomes out again and say's "Ai art observing Ai art while Ai art o…" (ytc_Ugy87HPSf…)
- "I wish the head of the biggest AI company had better answers to these big questi…" (ytc_UgxOyYCok…)
- "Ai was predicted in the Bible so I am well aware of where it ends and that is wi…" (ytc_UgwKf1C0t…)
- "These people want money and someone to blame no robot can make someone do anythi…" (ytc_UgzrysvA1…)
- "It would be smarter, instead of trying to achieve singularity like it's some Ape…" (ytc_Ugxev-pyK…)
Comment
Humans are far more dangerous than AI ever will and that's because we're a stupid species willing to kill one another over things as idiotic as skin color, religion, and political views. At least with AI, I don't have to worry about it killing me because I didn't vote for a particular person, and if that is the case, I still blame people because people programmed ai to kill based on that. AI is a tool, a tool that can benefit us all, and it's safer until humans intervene and mess it up like humans do with everything else.
Source: youtube · Topic: AI Responsibility · Posted: 2024-09-11T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
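For downstream analysis, it can help to check each coded record against the label sets observed in this batch. A minimal sketch, assuming records shaped like the raw response below; the allowed sets are inferred from this one batch, not from an authoritative codebook:

```python
# Label sets inferred from the raw LLM response in this batch (assumption:
# the real codebook may allow additional values).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the coded dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"id": "ytc_UgzPS5hMLaNqA1tNEkd4AaABAg", "responsibility": "user",
          "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
print(invalid_fields(record))  # [] — every dimension matches the inferred sets
```

Records flagged here would need manual review before being counted in aggregate statistics.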
Raw LLM Response
```json
[
{"id":"ytc_UgxsmqQfOCgiRS2CH254AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyNc-RrgySkffrF0Fx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHsjUOgfpXB1K2lj14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwICmMDk9n_lRwfcNJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5M8CZpYS8NT687yF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwM7AN62QE85e_sksp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6zjjhlxDo2VVaowF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyeOUBeUlvZ1zFY76h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyPoYvNGXmtdQRfZMd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzPS5hMLaNqA1tNEkd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
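The lookup-by-comment-ID step above can be sketched as follows. This is a minimal illustration, assuming the raw response is a JSON array shaped like the one shown; the `raw_response` string and variable names are illustrative, not part of the tool:

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above
# (shortened here to two records for illustration).
raw_response = '''[
  {"id": "ytc_UgzPS5hMLaNqA1tNEkd4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxsmqQfOCgiRS2CH254AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# Index the codings by comment ID so any coded comment can be
# inspected in O(1) time.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzPS5hMLaNqA1tNEkd4AaABAg"]
print(coding["responsibility"])  # user
print(coding["emotion"])         # indifference
```

Building the dict once and reusing it keeps repeated lookups cheap even when a batch contains many coded comments.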