Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples

- `ytc_Ugwl_bVMn…` — "I'm here for the AI art discussion but what I first noticed was your shirt and w…"
- `ytc_Ugz86Mc3s…` — "One of the biggest risks, in my opinion, is far more subtle than weapons. They'r…"
- `ytc_Ugz4lIEMC…` — "I think it is reasonable to say I intensely dislike these people like Sam Altman…"
- `ytc_UgwyxQCiH…` — "AI "Artists" are more like a person walking into a restaurant, stealing three ra…"
- `rdc_jdjx73n` — "Personally, I would be happy to buy and read a book written by an AI if I though…"
- `ytc_UgyCTglNp…` — "For like a year artists keep crying about AI "stealing" their art, but for the s…"
- `ytc_UgyGHo3UV…` — "This is why all of our chats should be guardrailed so that they can only be used…"
- `ytc_Ugwi1VgGj…` — "I keep thinking about the personal robot that was folding clothes and then snapp…"
Comment
Harnessing AI in warfare has a pretty central and powerful focus of having better situational awareness, while limiting as much human error as possible. This looks alot to me like how we view self driving cars. We have a system that we are certain will take the 50,000 motor vehicle fatalities per year down to near zero. It can literally turn 50,000 to a total like 10, and yet we look at those numbers and are far more comfortable with 50,000 human made errors than we are with 10 computer made errors. The fact that humans look at the situation in this way is exactly why we need to stop looking at our flawed logic as acceptable because we have a borderline sentimentality about our own mistakes, and zero tolerance with any others. We aren't that far removed from taking anecdotal information as gospel at this point. This argument is proving our own ineptitude to make good decisions anymore.
Source: youtube · Posted: 2025-02-26T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwJ0tHAquNoztxFBbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzw23yspTWO88WW-sp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweDQ6TJlzNpo5-edF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_LZpkZJGOBin38uZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyQBog6mlPsE2bDl3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwuFo0Z7SwRpsOb6tx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzwyC89jUsTheA3hV14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwaAP54YjBEkTzywsh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw6Amk3ZV18ViAU0JB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugycpws6FNPhCNIfT5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}
]
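As a sketch of how a lookup by comment ID might work against a raw response like the one above (assuming the response always parses as a JSON array of per-comment records; the two records below are copied from the output shown):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
# Truncated here to two records from the array shown above.
raw_response = '''[
  {"id": "ytc_UgwJ0tHAquNoztxFBbp4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwaAP54YjBEkTzywsh4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the records by comment ID for O(1) lookup.
codes = {record["id"]: record for record in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
print(codes["ytc_UgwJ0tHAquNoztxFBbp4AaABAg"]["emotion"])  # approval
print(codes["ytc_UgwaAP54YjBEkTzywsh4AaABAg"]["policy"])   # liability
```

The dictionary index mirrors the "Look up by comment ID" feature: a full comment ID keys directly into that comment's coded dimensions.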