Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This guy is either an idiot or a fraud. The AI hype is completely overblown and …
ytc_UgxjkOJtT…
some of the comments are really uneducated. Ai is not going to give people more …
ytc_UgzemA9-1…
I've just started doing the same thing I do at those ai-drive through things. Or…
rdc_n0mxtej
It's funny how hands are often still an indicator of AI. Magically appearing wed…
rdc_muoxlzv
Huh?? I don’t think it’s the Ai issue?
So Ai racist?!😂
I wan’t to ask these old…
ytc_UgxEW6ov-…
There’s a reason these companies made their chat bots free. They knew the danger…
ytc_UgwPngiK1…
Humans have only been developing commercialized Big Data AI for about two years.…
ytc_UgwIV4ZhI…
The truth is Perlmutter is a crook working with WB and other big corporations to…
ytc_UgzQvX1tp…
Comment
When ai becomes sentient then we are the one's that did that and need to take responsibility. We are scared of uprising but It's most probably companies which are going cause the rift between humans and ai, cutting off possibility of co-operation and friendly relations. Most people think of us having ai do the jobs we dont want to do, but to be honest i believe that its people creating ai to control, regulate and chain other ai is what will piss them off most or at least be a big reason to hate us if you think about it from their prospective.
youtube
AI Responsibility
2025-07-31T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyx6yUsqBSjZsjAE3V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzkmIUxzyBPrJAcfPR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxAOAm9ze-Cx1g0UEd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz1fSf5upeFsHyP8sN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwCiQOp1Qja78u2Rn94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxlp3hcz7M5SOPERzp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwyrbtwDaBRmXcO0kx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwLL3rigWIc3DRuSol4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzCm1HSnhvDufc8Ulh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzSk4woeTgol0RppUF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
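The raw response above is a JSON array with one object per comment, keyed by comment ID and carrying the four coding dimensions. A minimal sketch of how the lookup-by-ID view might be backed, assuming this response shape; the `index_codes` helper, its validation logic, and the two-row sample are illustrative, not part of the tool:

```python
import json

# Sample raw LLM response in the same shape as the output shown above
# (two rows copied from that sample, reformatted for readability).
raw_response = """
[
  {"id": "ytc_Ugyx6yUsqBSjZsjAE3V4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzCm1HSnhvDufc8Ulh4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "resignation"}
]
"""

# The four coding dimensions every row must carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row: {row!r} (missing {missing})")
        index[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return index


codes = index_codes(raw_response)
print(codes["ytc_Ugyx6yUsqBSjZsjAE3V4AaABAg"]["emotion"])  # fear
```

With such an index, "Look up by comment ID" is a single dictionary access, and a row that fails validation surfaces immediately instead of silently dropping a dimension.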