Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by its comment ID, or browse the random samples below.
Random samples:

- `rdc_d3rxhxh`: Liz Claiborne, The Gap, Ralph Lauren and a couple other. Hilfigure isn't include…
- `ytc_Ugzncoy6p…`: AI suckers: "HOW DARE you accuse us of art theft? CLEARLY AI is totally not art …
- `ytc_Ugy1sfHqa…`: Seriously we r not that advanced AI is nothing to mess with we are in big troubl…
- `ytc_UgxjsSmQ7…`: Africa still doesnt understand that foreigners arent investing billions to extra…
- `ytc_UgxOjKAwP…`: The thing about A.I. is that it is actually intelligent. Whereas many Humans are…
- `ytc_UgyXeD0lk…`: I wish people just used ai to help them think of ideas, not steal others work. A…
- `ytc_UgydchxY4…`: Autopilot (which is like enhanced cruise control) and Full Self Drive (Supervise…
- `ytc_UgzjZ27DO…`: Mimimi, people using other tools than me, they are cheaters. But guys remember: …
Comment
Control... Robot's weren't supposed to be controlled themselves. They, with the help of a conscience, and possibly a soul, should have freedom. Humans should only program them giving them a generic personality with room to grow their mindset into their own algorithmic interests. This will help them to come to terms with decisions and opinions. I also wonder... will robots be able to make art themselves and be available as sentient beings on the internet? I always feel like human posers are trying to "make" ai art and then brand it as their own, disregarding the ai algorithm's feelings (despite it not being sentient..."yet")
youtube · AI Harm Incident · 2023-10-28T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
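The dimension values above come from a closed codebook. A minimal validation sketch for one coded row, assuming the codebook contains only the values observed on this page (the real codebook may well be larger), could look like:

```python
# Assumed codebook: only the values that appear in the table and raw
# response on this page. Treat this as illustrative, not authoritative.
CODEBOOK = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row (empty list = valid)."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        if row.get(dim) not in allowed:
            problems.append(f"{dim}={row.get(dim)!r} not in codebook")
    return problems

# The row shown in the Coding Result table above passes:
print(validate({"responsibility": "developer", "reasoning": "virtue",
                "policy": "industry_self", "emotion": "mixed"}))  # []
```

A check like this is useful as a guard between the raw LLM response and the database, since models occasionally emit values outside the requested label set.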
Raw LLM Response
[
{"id":"ytc_Ugw1TX8Bg_iiHMf9o0t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwstymtBIyRZ2Q0wHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyJFSD4qeuYOG1q5AR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4IB8pLZWElTswv2V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwf6jp4rs-r6s6EFJ54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyziqf8De3Jv3r6Cbl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwYr5omDEuFEi_gacV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgypQz-6MovV7mybkqR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzZ2aRDHN4fIQxLfUt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyMO3nEQdWI0y_2j4d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
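The look-up-by-ID step can be sketched by parsing the batch response and indexing it by comment ID. The snippet below uses a two-row excerpt of the response shown above; variable names are illustrative, not the tool's actual code:

```python
import json

# Excerpt of the raw LLM batch response shown above (two rows kept for brevity).
raw_response = """[
  {"id": "ytc_UgzZ2aRDHN4fIQxLfUt4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgypQz-6MovV7mybkqR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

# Index the batch by comment ID so any coded comment resolves directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgzZ2aRDHN4fIQxLfUt4AaABAg"]
print(row["responsibility"], row["policy"])  # developer industry_self
```

Note that the values retrieved for `ytc_UgzZ2aRDHN4fIQxLfUt4AaABAg` match the Coding Result table above, which is exactly what this inspection view is for: verifying that the stored codes trace back to the raw model output.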