Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Program the AI to take out the red hats then give it to them, most world issues …" (`ytc_Ugw7yV2ln…`)
- "What adaptation are they talking about when a whole bunch of the population won'…" (`ytc_Ugy7Ezqb9…`)
- "1:13:32 This is where I believe Dario from Anthropics has a much higher dimensio…" (`ytc_UgxQ3y7mI…`)
- "Bots spamming the chat and liking the op to promote a book. Just proves how craz…" (`ytr_Ugxwf70A8…`)
- "Newscasters read from scripts. So can AI. Those streaming will pick their AI v…" (`ytc_UgwohGeTM…`)
- "Does the tenuously controlled, *abrasive ego* offsetting Tyson's charm and intel…" (`ytc_UgxX6ytP7…`)
- "This. Everyone who uses any sort of LLM should know this. This is basic prompt t…" (`ytr_Ugw70zC0K…`)
- "Hey @iamfarhan989, thanks for your out-of-this-world question! To defeat a robot…" (`ytr_UgwVI-o5n…`)
Comment
It is absolutely crazy that private corporations hold the technology governments abuse. If we can see the dangers of law enforcement using the technology, how can we dismiss the possibilities purely profit-driven entities can use it for? Not selling the technology to governments is just admitting that it is too dangerous for governments to have; yet somehow these private corporations see themselves as fit to carry and develop that tool. And sadly, so does much of the public. It is like making an anti-nuclear deal, but allowing private citizens to produce and carry weapons of mass destruction.
We should trust "good companies with facial recognition" about as much as "lawful citizens with WMDs".
youtube · AI Bias · 2020-07-03T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzb5hhQ9LcCY1LyYFN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNBOZhwsPF0K4Zjcd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxk_GaAi-3ae-cVBoZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmRtI0YG5C8EofDqV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxA4qYhFBWeHR4rBB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzA9WPvZoRw8VG5bFp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy517CmdHq9SAkc4U14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx8-RwCehsEfmVRZ_d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxSJW10azIzZKiqikp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwNUXvw-ZlEtu2EwBV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
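A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is one way to do that, assuming the allowed value sets are those visible in this sample (the real codebook may include additional categories, and `validate_coding` is a hypothetical helper, not part of any tool shown here):

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# sample response (an assumption; the full codebook may be larger).
ALLOWED = {
    "responsibility": {"none", "government", "user", "ai_itself", "company"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"approval", "fear", "resignation", "indifference",
                "outrage", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject entries with out-of-schema codes."""
    entries = json.loads(raw)
    for entry in entries:
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry.get('id')}: bad {dim}={value!r}")
    return entries

# Minimal usage with a single made-up entry:
raw = ('[{"id":"ytc_demo","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = validate_coding(raw)
print(coded[0]["policy"])  # → regulate
```

Validating against a closed set like this catches the common failure mode of the model inventing a category outside the codebook, which would otherwise silently pollute downstream counts.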