Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Like I wouldn't be as mad at the guy that just did AI backgrounds if he didn't t…" (ytc_UgyjWY-9U…)
- "I get some calls where some customer feel more comfortable with a person talking…" (ytc_UgxEXDT2Y…)
- "I actually just cried watching this. Thank you for speaking up on this topic - …" (ytc_UgyPiZlB4…)
- "I understand that AI chat bots can be easier to interact with than many humans. …" (ytc_Ugx7kANND…)
- "I expected little bit more from Tim, then spreading AI FOMO (Fear of missing out…" (ytc_UgySHraBU…)
- "DUMB. 1. self driving cars wouldn't tailgate, either giving the car enough time…" (ytc_UggCEcSJA…)
- "Now if they'd just replace the CEO with AI you can have an entirely AI company t…" (rdc_m2akpvm)
- "To be honest here. I don't think AI art is bad. But it's when people pretend…" (ytc_UgzRZXM_R…)
Comment

> There is something my dad told me when he was teaching me to drive as a kid. He told me that driving is both the most dangerous and responsible thing we can do as humans. Dangerous because we are in giant death machines with other random strangers as fast speeds. Any mistake could kill people. But it’s responsible because of the human element BEHIND the car, US! We need to be aware of our surroundings and with the strangers around us. WE are what makes driving safe! That’s why I hate self driving cars. It takes the human element out of driving and it’s JUST dangerous. (That’s also why I don’t like roller coasters. I can’t control my own speed. I just have to pray that everything is right and functioning correctly. Any mistake could cause the coaster to fly off its rails and kill me.)

Source: youtube · 2026-02-16T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxx2dDcsXHjg1vbYNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyoB5bIT2sHwEAq6GB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwBHO2diRt1RWkNu-p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzmpFnpvXOiMhq20WR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz30kq62uKf8fNI9VZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyr09Sjf19iJJG8pa54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyemg6m1_wlFXTEOxV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw1ii8uXK_ple6hYcx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwwJGBS6H_CEiNuHI14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjbTDyPAG1UVOhKNZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
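Since the model returns a JSON array of records keyed by comment ID, a downstream pipeline typically wants to reject any record whose labels fall outside the coding vocabulary before storing results. Below is a minimal validation sketch; the allowed value sets are inferred from the responses and table shown above, not from a documented schema, and the record shown in the usage example is hypothetical.

```python
import json

# Allowed labels per coding dimension -- inferred from the sample output
# above; the exact vocabulary is an assumption, not a documented schema.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "approval", "fear", "mixed"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and raise on unknown labels or missing IDs."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}: {rec.get(dim)!r}")
    return records

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
records = validate_coded_batch(raw)
print(len(records))  # 1 valid record
```

Validating at ingest keeps a single malformed or hallucinated label from silently skewing the coded dataset; invalid batches can then be re-queued for re-coding rather than patched by hand.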