Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "This will probably already have been mentioned, but one scenario would be that h…" (ytc_UgwwuENLS…)
- "I felt that getting into these AI self driving cars are an added risk to your li…" (ytc_Ugx0rfjr7…)
- "You look at this you bring in AGI and you get rid of all the jobs and then they …" (ytc_UgxOm-jxq…)
- "Pure fear and malice shall pulse through my veins should my friend ever use the …" (ytc_UgxfNpxr1…)
- "fail safes at a hardware level and other various security levels but also you si…" (ytc_UgzB9uWXo…)
- "Steve Jobs is actually the first proto-techbro visionary I can think of. Part of…" (ytr_Ugx68-h8X…)
- "The issue isnt AI. Its dumb people. No research to verify and no critical thinki…" (ytc_Ugxv6A2Js…)
- "Matt your analogy to the car emissions (or any) within a state boundary is flawe…" (ytc_Ugwdyc6IX…)
Comment (source: youtube, posted 2021-09-21T05:3…):

> @quinnduffy6689 your mind is so limited. You cant fathom progress when our grandparents couldn't even fathom computers, or cures for diseases that used to plague society. The idea that FSD would regulate itself to localities is actually low hanging fruit for the possibility of AI. Its data. FSD needs data, and more and more of it, the more EVs and smart cars you see on the road, the smarter the AI becomes. It just needs mass and quantity.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytr_Ugxcns2nbnDeb7r_w2p4AaABAg.9SPMA9tyCNZ9SSy0mj9tGI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxXGYkCeZB7gXM4x3Z4AaABAg.9SPLKSGjJF99SXFufejfKe","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgxXGYkCeZB7gXM4x3Z4AaABAg.9SPLKSGjJF99SY79XV03wm","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgxyzIAU9mTlY2oDmh94AaABAg.8xA9OiJyj0b8xADPfBPRO1","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgybIYRcJK8CHRC96K54AaABAg.A3X44qXNuKDAFcfBREDjN0","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_UgzvVVS-osep1ufeatF4AaABAg.AIBnFn3H7v3AJ9lrzcpFAX","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgzvVVS-osep1ufeatF4AaABAg.AIBnFn3H7v3AJSsiizPFtE","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgypL-DcA5l9KawRqDR4AaABAg.AHtDREDFbyeAISIHT0Qe7b","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgwJBnN3cVrYwkWUC9x4AaABAg.AHr0cQyNW-rAHs-Cf4I1yQ","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyZr3QwnJPf5q6UOs14AaABAg.ARnfb960E-VARq5RsJky7c","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
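The raw response is a JSON array of per-comment codings, one object per comment ID, with the four dimensions shown in the Coding Result table above. A minimal sketch of how such a response might be parsed and validated is below; `parse_response` and the `ALLOWED` value sets are illustrative assumptions (the sets contain only the values observed in this sample, and the real codebook may define more), not the tool's actual implementation.

```python
import json

# Values observed in the sample response above; treat these sets as
# illustrative only -- the full codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "industry_self", "regulate", "liability", "none"},
    "emotion": {"approval", "mixed", "indifference", "resignation", "fear"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip entries with no comment ID
        # Every dimension must be present and hold an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one valid coding and one with an out-of-codebook value.
raw = ('[{"id":"ytc_example1","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"approval"},'
       '{"id":"ytc_example2","responsibility":"bogus","reasoning":"unclear",'
       '"policy":"unclear","emotion":"mixed"}]')
print(parse_response(raw))  # only the first entry survives validation
```

Validating against a closed set of values at parse time is a cheap guard against the model drifting off the codebook, which is a common failure mode when coding large batches.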