Raw LLM Responses
Inspect the exact model output for any coded comment, or look a response up by comment ID.
Random samples
- ytc_UgzD9Y6zF…: Here me out this is a good thing we all won't just starve you know we will evolv…
- ytc_UgxA-YAE7…: The only way that AI will work is by serving us entirely, so we can focus in our…
- ytc_Ugh9iU4V9…: Yes. Machines MUST be given rights based on there complexity of intellectual sen…
- ytc_Ugz78Fepk…: "Scary Smart" by MO GAWDAT ex Co.Google X . A must read 📚👌aĺl About AI .and ho…
- ytr_UgxDjtma0…: Yes and nothing wrong about both exist. Just that don't try to sell AI art lmao.…
- ytc_UgydOBXZn…: "Ai will be the greatest thing humanity has ever made! The intelligence that the…
- ytc_UgwW2o3-L…: Shouldn’t war crimes be coded into the thinking of the Ai. People would have to…
- ytc_Ugw6Yk7EB…: There is no law in nature that states that human's are defined by special things…
Comment (youtube, 2025-06-15T14:5…)

No, they are not. For all their flaws, humans are still in general much safer drivers than current self-driving tech. The latter still have about 1-2 decades to go before widespread deployment may become feasible.

Don’t believe the hype: none of the companies involved are being honest about how hard an engineering problem self-driving tech is to solve, and none of them are championing the level of government safety standards, inspections, and testing that will be needed to make these vehicles safe. They are playing with fire, and it will probably take some extremely large multi-billion-dollar wrongful death lawsuits before they finally come to grips with the significant flaws in self-driving tech.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugw83CSHuR28PMDLvTB4AaABAg.AT3Gdovxs_6AT3PBzqWs00","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgzKDdfvg_yUFrpHB494AaABAg.AT3GLlYtmT6AT3RLUq_e-z","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwvP-zzvNx9xKn8eot4AaABAg.AT3Fs1XYtRpAT51aZTM9Ju","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugzru8V_S1cD6cnURmZ4AaABAg.AT3FVrD2fuEAT3HBB8IKeM","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgyY0-IT5HiKD8Bp_x14AaABAg.AT3FVbPdcV_AT3U2CioKR5","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyY0-IT5HiKD8Bp_x14AaABAg.AT3FVbPdcV_AT3VpAYoozI","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxrmctC3PRbMTqf6iN4AaABAg.AT3DSrMYoxTAT3DuXEkEcN","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugy0J6cNhJrKMajhKfF4AaABAg.AT3DI3PmhjQAT3DoBaKcUD","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugx74bd-TCwyXHvq1rJ4AaABAg.9zp1veCYZRr9zqBriP378c","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugy2GJCM1xcsxZTkzB54AaABAg.AFpSX3zUpRLAJOk2aGvdZq","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
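Downstream code cannot assume the model always returns valid JSON or stays inside the codebook. A minimal validation sketch in Python: the field names come from the JSON above, but the allowed-value sets are inferred from these samples and `validate_coded_batch` is a hypothetical helper, not part of the tool itself.

```python
import json

# Controlled vocabularies inferred from the sample output above; the real
# codebook may permit additional values (an assumption, not an official schema).
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval"},
}

def validate_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)  # raises ValueError if the model broke JSON syntax
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop entries missing the comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)  # every dimension is present and in-vocabulary
    return valid

# One well-formed record and one with out-of-vocabulary codes (both hypothetical).
raw = (
    '[{"id":"ytr_example","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},'
    '{"id":"ytr_bad","responsibility":"aliens","reasoning":"vibes",'
    '"policy":"shrug","emotion":"glee"}]'
)
print(validate_coded_batch(raw))  # only the first record survives
```

Records that fail validation are silently dropped here; a production pipeline would more likely log them and queue the comment for re-coding.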