Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "People remaking the AI image by hand in their own style reminded me of the times…" (ytc_UgyiidJ01…)
- "Dude: I have AI make art. I will not lie about it like most people and companies…" (ytc_UgyANtXEf…)
- "We appreciate your engagement with the video! Rest assured, Sophia is a sophisti…" (ytr_Ugz6DtRlS…)
- "Not trying to downsize the importance of the matter regarding this huge, unprece…" (ytc_Ugzk5_YKB…)
- "Smart phones were a technological breakthrough; we used them to do stupid things…" (ytc_Ugzmlzloe…)
- "We don't need more smart people. We need more wise people. Wisdom doesn't come f…" (ytc_Ugxr39st3…)
- "Don't make me laugh about A.I. solving even global warming and child suicide. It…" (ytc_UgxLDZgh5…)
- "there are way moe advanced robots than that, u cant tell me humans can fly to th…" (ytc_UgxAnxT42…)
Comment
Any conversation about the failure of AI needs to compare it to humans. And if the system is more accurate then humans then it is not unethical for the system to be used.
Human judges are often prone to error and a Louisiana study showed that actions such as how long it had been since lunch or what football team could change their decision.
While it sounds scary to say ‘oh a computer with determine your guilt and you can’t know why’ remember than right now a human does the exact same thing
Source: youtube · 2022-07-28T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy8FKxiuDxadf93dyx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyerXOPx2E3Xgg1Fw14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1RE4ja05o4SykMjp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzzj6qlt-_pyddTL8Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwdcEn9ijERgFXTYph4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxX5ETT7IvfdI_zvGR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw2lA7a6b9IHrUqTCl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRzaNteHUQtMDnHmZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxV37bQoTFIluNfghZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxf4GmYkuFksFocVAN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
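The raw response is a JSON array with one object per comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed and looked up by comment ID (the field names are taken from the response above; the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Two rows shaped like the raw LLM response shown above.
raw = """[
  {"id": "ytc_Ugy8FKxiuDxadf93dyx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxX5ETT7IvfdI_zvGR4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

# The four coding dimensions present in every row of the response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a raw LLM response and index the codes by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

lookup = index_by_id(raw)
print(lookup["ytc_UgxX5ETT7IvfdI_zvGR4AaABAg"]["policy"])  # ban
```

Indexing by ID mirrors the page's "look up by comment ID" action: once parsed, any coded comment's dimensions are a single dictionary access away.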