Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Proof, AI can/will be used to push humans to experiment on themselves. AI can't … (ytc_UgwmP6OPS…)
- The topic missing from this video on why AI definitely will take over health car… (ytc_UgzrAEFzy…)
- Yeah I think we need to actually learn how this stuff works to better refute cla… (ytr_Ugw2HR5dX…)
- fun fact: LLMs are static neural networks. Our brains are changing all the time.… (ytc_Ugz_fuFx_…)
- The officers participating in this "Predictive Policing" BS need to take a step … (ytc_Ugwx0WJ3m…)
- i tried out drawing myself in your style,, if only ai bros realised that there s… (ytc_UgyNo7zeY…)
- I am driver in Phoenix and it's already started! Sky harbor airport in phoenix a… (ytc_UgwmWaJw1…)
- I've gotten to hate AI that's a shame but it really should have a law that when … (ytc_Ugze59smA…)
Comment
I think that Musk's desire to silence the NHTSA is a major reason why he wants to be in the Department of Government Efficiency. If Elon Musk can silence the NHTSA, it will help him get his Full Self-Driving software approved for the road no matter what the results might be of real-world experience or independent testing. If Tesla can get Full Self-Driving approved, its stock price will soar, at least for a while, before the number of people killed by Full Self-Driving creates a pyramid of skulls high enough for people to notice.
youtube · AI Harm Incident · 2024-12-16T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwZ0QMTSLn_IXsWIW14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzlaUE37fErvz1Gfr94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzLBnJ89jmexNedIr54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxKPN83yn6Rg8kGWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtBXCZJavN2pIlLaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2-aGI8lOYbF7rDKh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRqCVrDFG1dfi8HMx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwD-BfjOnjAlnzsQ3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyvFlVbzmeNXX6pOwN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx3s5m15eyDiF4bDqx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
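The raw response above is a JSON array in which each object codes one comment along the four dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload might be parsed and validated, assuming the allowed category values are exactly those observed in the samples here (the real coding scheme may permit more):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# This is an assumption: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, dropping rows that fail schema validation."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row.get("id")
        if not comment_id:
            continue  # skip rows the model emitted without an ID
        # Keep the row only if every dimension holds an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[comment_id] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical example row, mirroring the format shown above.
raw = '[{"id":"ytc_example","responsibility":"company",' \
      '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
print(parse_raw_response(raw)["ytc_example"]["policy"])  # regulate
```

Keying the result by comment ID supports the "look up by comment ID" view above, and dropping malformed rows guards against the model returning an out-of-vocabulary code.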