Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples

- ytc_UgzciM4VU…: Or maybe it entraps humans like by a simulation (I forgot the movie), in which a…
- ytc_UgxZRBibV…: Gang, artists being upset at AI “artists” isnt “gatekeeping art”. It’s drawing …
- ytc_UgyXxmZx0…: If you code the AI yourself from scratch and you create all of your own artwork …
- ytc_Ugzh5jD3y…: Nailed it on most points as usual - with the glaring exception of that "people w…
- ytc_UgwrRdN9S…: 🤖 Why AI Won’t Kill Jobs 1⃣ Jevons Paradox When technology makes something chea…
- ytc_UgzZcrDNm…: In the Bible according to Matthew 24:22 CEV If God doesn't make the time shorte…
- ytc_UgyzoTRuf…: I like to use AI generated art as concept art when I'm trying to put something t…
- ytc_UgzMAFRQe…: In 2 years I will join college but what will I even study , or what can I study …
Comment
No matter how accessible Tesla makes its warnings, that "we only have a human for legal reasons" ad is going to create the assumption that those warnings also only exist due to antiquated laws that don't account for self driving vehicles. At this point the only fix is for Tesla to come out and say "we flat out lied, don't trust Autopilot," which would probably create other problems for Tesla.
youtube · AI Harm Incident · 2025-08-16T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy6jc29xY9Yu2xo1Rd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzi1ZaK1Gdma3Qj8xd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxCN1JAymUM7Ha6oyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgynHWVGjsKdUpzhk5h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxTsi5fn5x6sSux5-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgywHYWkZw-SZpDbkrB4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyQEC0FaXVdGxop7iB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw066AApXNjFtxCJ7J4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyb5ol4Qvmde5uSVCJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxUDZ4Yg-Nglm6wUTh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
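A raw response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical parser, not the tool's actual code: the allowed values per dimension are inferred from the sample output (the real codebook may define more categories), and records with unknown values are skipped rather than indexed.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above; hypothetical, not the project's actual codebook.
DIMENSIONS = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded records) into a
    {comment_id: codes} index, skipping records whose value for any
    dimension is not in the allowed set."""
    index = {}
    for record in json.loads(raw):
        codes = {dim: record.get(dim) for dim in DIMENSIONS}
        if all(codes[dim] in allowed for dim, allowed in DIMENSIONS.items()):
            index[record["id"]] = codes
    return index

raw = (
    '[{"id":"ytc_Ugy6jc29xY9Yu2xo1Rd4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]'
)
coded = parse_response(raw)
print(coded["ytc_Ugy6jc29xY9Yu2xo1Rd4AaABAg"]["emotion"])  # outrage
```

Keying by the comment ID is what makes the "look up by comment ID" view above possible: each coded record can be joined back to the original comment text and platform metadata.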