Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "When everyone's job is automated, who's gonna have money to buy what the robots …" (ytc_UgyerMCKj…)
- "Ai shouldn't exist for platform because it could ruin a person hardwork for crea…" (ytc_UgyukYKSk…)
- "Meh, seems like a narrative any generic ai company is pushing to keep the money …" (ytc_Ugyrff4wu…)
- "What a load of nonsense. They barely have enough energy to run these things, bor…" (ytc_UgzhALN2r…)
- "Generative AI sucks but I'm playing minecraft while watching and your vid had me…" (ytc_UgwxUXJ1X…)
- "There are no native people in America today, stop the scam. Most people who call…" (ytc_UgzS5iBC_…)
- "If you learn art it's more beneficial than ai: For one, it would be morally bett…" (ytr_UgyVwwst8…)
- "Could have told you that for free 5 years ago! Medical degrees are literally cop…" (rdc_jl4yz17)
Comment (youtube, 2018-03-21T04:1…, ♥ 1)

> Yes, we do. It will save far more lives than it will take. The vast *vast* majority of traffic fatalities are due to human error that an autonomous car would not make. And in instances like this, where a party other than the autonomous vehicle was likely "at fault," a human driver would have done no better.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_Ugw-wvkKZc9KIwiNtj14AaABAg.8e1mUHM8HBe8e2Dld8YSR8","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxNSoZT7mB4mHEBUfJ4AaABAg.8e1gkiWiWJD8e1vaOKA7jw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugz5dBIVBv9sS55rQt54AaABAg.8e1cQlxEJlU8e1f-gRY4nP","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugz5dBIVBv9sS55rQt54AaABAg.8e1cQlxEJlU8e1jRVBNEtQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz5dBIVBv9sS55rQt54AaABAg.8e1cQlxEJlU8e1nOKZl3us","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyD1b6mLwQFFskEhzd4AaABAg.8e1cBq7EEz88e1nq_0j6Zm","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyD1b6mLwQFFskEhzd4AaABAg.8e1cBq7EEz88e24J28nNI9","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxb00GdocSAr288C6J4AaABAg.8e1cAXsVQ5T8e1gK5lR_tJ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzpHpTcsqDjgsCglxN4AaABAg.8e1bW4vmhAg8e1tKDxT-C6","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugyyf3r8R8WFb0kWMI54AaABAg.8e1bTumnqMv8e4GT2q8K9i","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
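A raw response like the one above can be checked before its rows are stored as coding results. Below is a minimal Python sketch of that validation step; the allowed value sets per dimension are inferred from the samples shown here and are an assumption, since the full codebook may define more labels.

```python
import json

# Allowed values per coding dimension, inferred from the samples above;
# the real codebook may permit additional labels (assumption).
ALLOWED = {
    "responsibility": {"user", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are in-vocabulary."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Hypothetical usage with a one-row response:
raw = '[{"id":"ytr_x","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
print(len(validate_batch(raw)))  # → 1
```

Dropping out-of-vocabulary rows rather than coercing them keeps the coded dataset clean; rejected rows can then be re-queued for the model to re-code.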