Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_Ugyq1Kste…`: he's playing you all. he just wants to be the one in government who heads up co…
- `rdc_mva7wo0`: Exactly. Take away the buzzwords and marketing, you're still left with "Non-phy…
- `rdc_gd8ct18`: When it comes to AI, the humans that will have a say are the ones that programme…
- `rdc_ntagyr1`: I would guess that every current available LLM in the world would also agree tha…
- `ytc_UgymWeiHh…`: As an artist, I've noticed the artists most threatened by AI, are people who are…
- `ytr_UgwGm8srW…`: I hate this bs ai! It's not thinking! It's just a large language program. Garbag…
- `ytc_UgxOrXucW…`: The sad part is, it's totally doable ethically and I wish I was a billionaire th…
- `ytc_UgybPGLWO…`: The problem is not the AI, the problems are psych0 business owners who value the…
Comment

> It’s not completely AI with these self driving cars. They have emergency remote drivers that take over if there is an anomaly. Tesla also hires people to put on VR gear and drive cyber cabs. The goal is for it to be able to operate completely autonomous. And they say that they are “learning” from the human drivers. But in reality I’m sure there will always be hundreds of drivers working remote that take over during certain events.

Source: youtube, posted 2026-03-01T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_KFQFEGZceQxkHJV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDKemZ8enkTJs_WbB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDl6iNA733tMQJmQZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgybkwQ3RLCxDLAvjJR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyneUiSC5zdasCkZxZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxsYVPas0_73WLvW0t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugx51qmPU3xVe2mKE2h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxx83yvMIFiLfR2vdF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxNtBbcPiBp14PHUZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwfImnHuTH4-_OPhFp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
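The "look up by comment ID" step above can be sketched as follows: parse the model's JSON array and index the records by their `id` field. This is a minimal sketch, not the tool's actual implementation; the two records below are copied from the response shown above, and the field names (`id`, `responsibility`, etc.) are taken from that response.

```python
import json

# Two coding records reproduced from the raw LLM response above,
# used here as illustrative input.
raw = """
[
  {"id": "ytc_UgxsYVPas0_73WLvW0t4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "resignation"},
  {"id": "ytc_Ugy_KFQFEGZceQxkHJV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability",
   "emotion": "outrage"}
]
"""

# Parse the array, then build an id -> record mapping for O(1) lookup.
records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

# Look up the inspected comment by its ID.
rec = by_id["ytc_UgxsYVPas0_73WLvW0t4AaABAg"]
print(rec["responsibility"], rec["policy"], rec["emotion"])
# -> company industry_self resignation
```

The dictionary comprehension assumes IDs are unique within a response; if the model ever emits duplicate IDs, the last record silently wins, so a real pipeline would want to validate that.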