Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Earlier this week, I heard someone on NPR mention how the Teamsters union has be…
rdc_dmoyatw
Odly enough, I think inserting WOKE philosophy into AI is the most dangerous thi…
ytc_UgzqkB_oD…
It really grinds my gears the fear mongering that White men implore. Like it a d…
ytc_UgyDzU5mX…
A lot of bosses don't quite understand how to use ai it it's capabilities. They …
ytc_Ugy0asARU…
I dont think AI taking our jobs is a bad thing, if they do the job well enough. …
ytc_UgwVvSN0Y…
@goblincookie5233 Yet that tiny bit that does involve consciousness is what ma…
ytr_UgwAm8CsC…
"but it's numbers and algorithms and it uses that to tell what's going on and us…
ytr_Ugwk2rRCx…
Ok, I watch you're podcast alot, so I've been pondering and being an underdog wi…
ytc_Ugyya0PjB…
Comment
Waymo is driving 17000 miles between "critical disengagements", requiring no person in front, meanwhile tesla is at 250 miles between "critical disengagements", tesla is so far behind it's not even funny. (And yeah tesla what happened with geofencing huh, when you end up doing the same yourself, with even higher geo fencing AND also using HD maps which you said you didn't want to use). And what was the quote from elon, "it's not self-driving if its using geofencing and hd maps"?
youtube
2025-06-25T16:4…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugx3LFVqMga3fgERjCx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfLkks0rBOYiaZW7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBkXCvSvH6e_P4XrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx61uPLeMljxl6Y8dZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxg3PEenEYQyUwsA7J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxdsPpveQhmvPAY19t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgztGtpnl6ShR3V8Kfl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxpjG_spMBCFwi7E1t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzOqBnstOPAMnBqEUt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBU7hU6lRW2CDP3x54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
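A raw response like the one above can be parsed and sanity-checked before the codings are stored. This is a minimal sketch, assuming the allowed value sets inferred from the samples on this page (the real codebook may define more categories); `parse_codings` and `ALLOWED` are illustrative names, not part of the tool.

```python
import json

# Hypothetical allowed values per coding dimension, inferred from the
# rows shown above; the actual codebook may include additional labels.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw batch response and keep only rows whose values
    all fall inside the (assumed) codebook."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_codings(raw))  # the one valid row survives the filter
```

Filtering rather than raising keeps a single malformed row from failing the whole batch; invalid rows can instead be logged and re-queued for coding.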