Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- People hating on AI is peak "humans are idiots". You all sound like oldaf luddit… (ytc_UgwQ8CraT…)
- This is what Gemini says about Nexus: "The term "Nexus" is not a single, named A… (ytc_UgzDtcjkv…)
- I just Googled Ana Kasparian's name and saw a bunch of fake porn videos of her t… (ytc_UgxJR-LYu…)
- These anti-ai art rants are always incoherent. "You aren’t an artist if you wr… (ytc_Ugz3nyA11…)
- Bro was like "I do ML and AI and finished studies about it" , meanwhile the exte… (ytc_UgwTlql4V…)
- AI does take some work, not a lot, but it does take some work. What does a tool … (ytc_UgyiVP_zJ…)
- As long as there is a human controlling the machine at the time of attack I do n… (ytc_UgzYxYtjK…)
- "Brilliant idea, isn’t it? Let’s trade centuries of hard-won freedom for the cha… (ytc_UgzQw_ioV…)
Comment
i can't wait until these "driverless" trucks hit black ice or just slick conditions and wreck just because they can't adapt. These trucks are worthless in most of the U.S. in the winter. They might get a foot hold in the southern U.S., but when it gets cold and conditions require on the spot realization it will be obvious that these technologies can't compete with a human. Even if you use lane assignment/terrain following/antislip(wheel speed sensors), you can't put enough sensors on a truck to cure the "what's coming" effect that the on board can't expect.
youtube · AI Jobs · 2025-11-25T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxSE9TjPB_oIEJoOgR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTWX6w0E74_zknqal4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzYnJK_dNN9ELwhGU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxddRiMqM17B3j9TnN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCkGCBZQfuy24MHpV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwCxuWxWZ9kAHc7xKp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwYOMbWcUY4dc3uoHx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz3L-zd0KIZC2KBd-N4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwkwVW2ileb8FUe46d4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy8tG8nSTYj1E4-KJF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
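A batch response like the one above needs to be parsed and validated before the codings are stored, since the model can return missing fields or out-of-vocabulary labels. The sketch below is a minimal illustration, not the pipeline's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the allowed-value sets are only the values observed in this one batch and are assumed to approximate the real codebook.

```python
import json

# Values observed in the sample batch above; the full codebook
# likely defines more labels per dimension (assumption).
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "company", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment id.

    Raises ValueError on a missing id or an out-of-vocabulary label,
    so bad batches fail loudly instead of being written to storage.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in OBSERVED_VALUES}
    return coded

raw = ('[{"id":"ytc_UgxSE9TjPB_oIEJoOgR4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
print(validate_batch(raw)["ytc_UgxSE9TjPB_oIEJoOgR4AaABAg"]["emotion"])  # fear
```

Indexing by comment ID is what makes the "look up a coding for any comment" view above cheap: each inspected comment maps directly to its stored dimension values.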