Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugz6GpYdd…`: "What I don't get is why these AI companies are building these human-like robots …"
- `rdc_esqhhpj`: "Because then it’s going to be a very slow process to get a solution both parties…"
- `ytc_UgzcSeWhC…`: "Maybe it's not so bad and real AI, not just current imitators of intelligence, c…"
- `ytc_UgylUicjU…`: "And AI voice very impressive. Female voice close to the real thing. But programe…"
- `ytc_UgwMqprZJ…`: "The irony is it technically isn’t his art, it’s mid journeys if anything and eve…"
- `ytc_Ugwou43ZE…`: "you let AI train on the cesspit of twitter and reddit and you get antisemic raci…"
- `ytc_UgyeR4hEF…`: "I can already see the lawsuits flying. Wait til one of these "driverless trucks"…"
- `ytr_UgyNb4ZIV…`: "@group555_ the internet created thousands of careers availbile to many factory w…"
Comment

> Way more un realistic despite your advancement in knowledge and funding nahh go again boys right idea wrong execution you went to irobot-y should have kept the real look of the boston dynamics also the robot/android wouldn’t be talking to the other robot eapecially not like that doesn’t fit in reality and i think the light reflection felt a little off and felt like an immitation for some reason maybe it just needs more time and you rushed it out just cos you released it doesn’t mean it has to be done theres a drawing board for a reason lol

youtube · AI Harm Incident · 2024-01-14T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz60L4asK0uj_NE39Z4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQ2ocJsmkpwMb-H4h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyJH1YwZebXgru40mt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxo_LVgx0Dz57Qqkf14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwmNvnYdwbBp03UL3t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw1VXFi7i01x68cJaF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzQuGmjA654LvYCBrt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz3mivL4lfi_fB3_2N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxfV1SJPNzb5ARwDW94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyvl0XUwUN4go9FI0x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
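The lookup-by-comment-ID step above can be sketched in a few lines: the model output is a JSON array of per-comment codings, so finding the coding for one comment is just a parse and a scan on the `id` field. This is a minimal illustration, not the dashboard's actual implementation; `lookup_coding` and the two-entry sample payload are hypothetical, with field names taken from the response shown above.

```python
import json

# Abbreviated sample of a raw model response, in the same shape as the
# JSON array above (hypothetical two-entry payload for illustration).
raw_response = """
[
  {"id": "ytc_Ugw1VXFi7i01x68cJaF4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugyvl0XUwUN4go9FI0x4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw model response and return the coding dict for one
    comment ID, or None if that ID was not coded in this batch.
    (Hypothetical helper name; the tool's own lookup may differ.)"""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugw1VXFi7i01x68cJaF4AaABAg")
print(coding["responsibility"])  # company
```

Returning `None` for an unknown ID (rather than raising) makes it easy for the UI to show "not coded" when a comment was dropped from a batch.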