Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Can you inbed in AI ethics and morals, so all of them would obey laws etc???…" (ytc_UgwdS5Gd4…)
- "It is not all AI, The push for profits and Return on Investment has dominated th…" (ytc_UgxsdiBBj…)
- ">If you're one of the billions of people who have posted pictures of themselv…" (rdc_izks94k)
- "@jaywulf a) human learning and AI learning are not equivalent, this is a bad f…" (ytr_UgxbYoUaQ…)
- "AI CEO raises alarm over the safety of AI systems. His solution? Throw more mone…" (ytc_UgyVWND0R…)
- "AI in itself makes no sense / How can man give life to an inanimate object / How can…" (ytr_Ugg4W92XV…)
- "@hippoduck1 Your take is so true! :( I don't understand that thought process ei…" (ytr_UgzZYDa8r…)
- "ai needs to be regulated this is not okay, all this guy needed is someone to tal…" (ytc_Ugxxd5uln…)
Comment
I've heard of this face recognition app it was only tested on white faces it was not tested on black Hispanic or even Asian faces only on white faces so of course it's going to fail when you've never tested it on someone with a darker skin color the software can't identify or differentiate between two dark-skinned people because it's never been tested for that it was only tested by the person who created it which was a white man he used his face to test it so in reality it's not really the police's fault it's the software's fault and the person who sold it to the cops saying it would help them
youtube · AI Harm Incident · 2021-11-24T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzX6Hu42s-t0fbl9bV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwkEpSBoVZ7a_pF36N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgybJNN6IPzoTpnxEXl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxPsPqemGHKPDDNN594AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugxt-j4S8CvLntw1GiV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxO0Uo0fSvKrvVGbmh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugxq8u-uELmJtMhOnqF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwXq4EQPgiP6lxwDaZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_Ugy02uJF4cbSqs4N89F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzCfnoYUL3A58GTCRJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"]}
```

Note that the response is not valid JSON: it closes with `"]}` where a well-formed array would end `"}]`. The text is reproduced exactly as returned.
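The Coding Result above records every dimension as "unclear", which is consistent with a parse failure: the raw response terminates with `"]}"` where valid JSON requires `"}]"`. As a minimal sketch (the function name, signature, and fallback-to-"unclear" behavior are assumptions for illustration, not the tool's actual code), a coding pipeline might parse such a response like this:

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw, comment_ids):
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}.

    If the response is not valid JSON, every requested comment falls
    back to "unclear" on all dimensions (hypothetical fallback policy).
    """
    fallback = {cid: {d: "unclear" for d in DIMENSIONS} for cid in comment_ids}
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return fallback  # malformed output: code everything as unclear
    for rec in records:
        cid = rec.get("id")
        if cid in fallback:
            # Missing keys in a record also default to "unclear".
            fallback[cid] = {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return fallback

# A response whose array is closed with "]}" (as above) fails to parse,
# so the comment is coded "unclear" across the board:
bad = '[{"id":"a","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"]}'
print(parse_coding_response(bad, ["a"]))
```

Under this assumed fallback, a single malformed closing bracket would explain an entire batch being stored as "unclear" even though the per-comment codes in the raw text look reasonable.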