Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Chat gbt & all public facing so called Ai is a crock! Nothing but signal weight … (ytc_UgxZJYR4Q…)
- If AI doesn't want to control us, why does it keep portraying us like puppets?… (ytc_UgyUdHE5R…)
- also, we are training the ai with every interaction. i prefer to train ai with k… (ytc_Ugw6egpJU…)
- We can influence the outcome by NOT BUYING. Stop feeding AI. Stop feeding the ol… (ytc_UgyjyrPKd…)
- This all sounds great and all but... at he end Ai is just software, meaning it c… (ytc_UgwzY6b0r…)
- So we're all going to lose our jobs, small businesses, and careers to A.I.--that… (ytc_UgwYiMww2…)
- Bro since so many people have access to ai art they are starting to sell art the… (ytc_UgxizSi6c…)
- We are working these big tech companies and building Robots and weapons We … (ytc_Ugxnb3FRQ…)
Comment
Just imagine getting killed by a runaway self driving car like the ones in San Francisco that people there trust as Much as Pelosi and Newsome. Who bears the guilt and liability? Should Voters ? What about people who are forced wrongfully to share the burden of liability and guilt when they did not vote for those responsible for making the decision’s that adversely impact them. The concerns outweigh any benefits from technology.
Source: youtube · AI Responsibility · 2025-02-27T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
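A record like the one above can be sanity-checked against the label sets for each dimension. The following is a minimal sketch; the allowed values in `CODEBOOK` are an assumption inferred from the labels visible on this page, not the project's full codebook.

```python
# Allowed labels per dimension. NOTE: these sets are inferred from the
# values shown on this page and are not an authoritative codebook.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate(record):
    """Return (dimension, value) pairs that fall outside the codebook."""
    return [(dim, record.get(dim)) for dim in CODEBOOK
            if record.get(dim) not in CODEBOOK[dim]]

record = {"responsibility": "distributed", "reasoning": "contractualist",
          "policy": "liability", "emotion": "outrage"}
print(validate(record))  # an empty list means the record is valid
```

A record with a missing or unexpected label would show up in the returned list, which makes malformed model output easy to flag before it reaches the database.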
Raw LLM Response
[
{"id":"ytc_UgxJCnk5X-TAtfcdVYZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwFu3jFBG8hcYIiITx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy0T0Ib3cBVyxG3tHx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugw5fN52xgAUbH5Q46V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzujIgIwghPJLboHTp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxjxbz2JGoivvqYCNZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwE6sLcn6EK9ZWEbcJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_9BonvJDrGjy4Mvl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyEIUYjOjA9FiE0cqR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwoaEvNbOXU27uDraR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
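The raw response above is a JSON array of per-comment code objects, so "look up by comment ID" amounts to parsing the array and indexing it by the `id` field. A minimal sketch (the function name `index_by_id` is illustrative; the two sample records are taken verbatim from the response above):

```python
import json

# Two records from the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_Ugxjxbz2JGoivvqYCNZ4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw5fN52xgAUbH5Q46V4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def index_by_id(response_text):
    """Parse a batch of coded comments and index them by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugxjxbz2JGoivvqYCNZ4AaABAg"]["emotion"])  # outrage
```

Because the model returns one object per input comment, this index also makes it easy to detect comments the model skipped: any submitted ID missing from the returned dictionary was not coded.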