Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- ytr_UgzcIhg-_… — "He has some pretty good AI in the self-driving systems in Teslas, but those aren…"
- ytc_UgxXWftw6… — "Hope you find this free ChatGPT class helpful! If you guys have any thoughts or …"
- ytr_UgwC4n98p… — "people like you are why the world is worse now more than ever. First of all, I t…"
- ytc_UgzYGQbsg… — "People could develop their own AI, structured to dismantle these companies and t…"
- ytc_UgzF_1Zuu… — "How many people are already in jail from this process of facial recognition. May…"
- ytc_Ugz70onkS… — "You know what this ai is voice overed by ai because she never said it 😂😊…"
- ytc_UgyzMCFbX… — "“We need to make more profit by next week! But, we can’t make our products any b…"
- ytc_UgypGkgIx… — ""A human has to be there" (to safeguard AI from a military action that results i…"
Comment
I'm sure most people would be on board with any common-sense, data-grounded regulation. That's all I ever hear the autonomous transport people asking for: to be able to operate their fleets if they are statistically MUCH safer than humans. That doesn't mean no issues. We kill 50k Americans a year with cars/trucks. Cutting that by 90% will still mean 5k people a year, or around 14 people a day killed by autonomous vehicles. That would still be SAVING about 125 people a day.
youtube · AI Jobs · 2025-05-29T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxLt1cRUOqtowMV0UF4AaABAg.AIgFMFRxShxAIiPZIcdk1u","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw94BjwH89uzeMO9Dl4AaABAg.AIgDLvLONsdAIgrSAEtSCM","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgxrOmElW8OVhyeSknp4AaABAg.AIgCba5y0l-AIhw8nhbjAl","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzLPuCyiuZNxlAxgYl4AaABAg.AIgCTJlzqzSAIiZwXQ0MJk","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugyoyb5pn-tYDp1o7yN4AaABAg.AIgBqFSGFxdAIgDDIL5wzP","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyGc6JKCZZpOy-WtSR4AaABAg.AIgBPXLxu1uAIgUHEBaB3W","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxUfEntJYicWvnPzsF4AaABAg.AIgA_KFnbMDAIgEP7WnDJk","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx2WIE59GhpQdH9A0J4AaABAg.AIg9SaPF587AIgKnHidS-y","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxq1CpjqmK5N2xk7RZ4AaABAg.AIg6NEWslK1AIgA1SLR4Bg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyOMEbQi4BgYju_6714AaABAg.AIg60QxhSr9AIgVFoIg47R","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
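A raw response like the one above can be parsed and sanity-checked before the codes are stored. Below is a minimal sketch in Python; the dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the output shown here, but the allowed-label sets are inferred only from these samples and may be incomplete, and `validate_codes` is an illustrative helper, not part of any real pipeline.

```python
import json

# Allowed labels per dimension, inferred from the sample output above.
# Assumption: the real codebook may contain more labels than these.
ALLOWED = {
    "responsibility": {"none", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "mixed", "outrage",
                "resignation", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a string comment ID plus one known
        # label for each coding dimension.
        if not isinstance(rec.get("id"), str):
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

sample = ('[{"id":"ytr_example","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"regulate",'
          '"emotion":"approval"}]')
print(len(validate_codes(sample)))  # → 1
```

Dropping malformed records (rather than raising) keeps a batch usable when the model occasionally emits an off-codebook label; a stricter pipeline could log the rejects for re-coding instead.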