# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- "You're not wrong about stopping it, but from discussions on r/singularity, the i…" (`rdc_kz0t26c`)
- "How can something suffer that isn’t conscious? There is nothing in ai that resem…" (`ytc_Ugxhff7zU…`)
- "@nightigal In an even similarer vain, I watched a show about an AI that killed …" (`ytr_Ugz2UO6O_…`)
- "I never used a robot for homework, I copied my friends like a damn genius…" (`ytc_UgxyZ83Ox…`)
- "You’re assuming these LLMs are as far as AI goes in the next several years.…" (`ytr_UgwgdVT8F…`)
- "yes bring more AI , Hollywood too busy with pushing agendas, wokeness rather tha…" (`ytr_Ugz7W8m5A…`)
- "AI can not do qualitatively more less complex pictures if it is not the head of …" (`ytc_Ugz9PGuad…`)
- "If we only came to an agreement on how AI should work so that everyone is happy.…" (`ytc_UgwZWE3Yo…`)
## Comment
> We don't need Tesla robotaxis that "get confused" during turns. We don't need Tesla robotaxis that regularly speed by 5 to 10 mph because they "get confused" about the speed limit. It's not about whether Tesla FSD can drive better than humans under ideal conditions with teams of guys with their fingers on kill switches. New companies need to drive about as well as or better than existing companies and handle at least most weather conditions. The truth is that Tesla's level 2 self-driving is nowhere close to Waymo's level 4 and Zoox's level 5 self-driving in those regards. Tesla has not satisfied the new Texas law, which requires robotaxi companies to demonstrate level 4 self-driving to operate legally as of 9/1/2025. Texas lawmakers and federal regulators need to act quickly before someone in Austin gets hurt or killed. Tesla simply isn't ready yet.
Source: youtube · 2025-06-24T06:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
```json
[
  {"id":"ytr_UgxyUkpJPYzrdlNwJVd4AaABAg.AJle-6Q9qymAJlpSRTBaEh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwBVVzmsAA1ZAguMT14AaABAg.AJkwVFMK5-IAKAUoMg4UxE","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyhM7CY8wTV8OzMNZZ4AaABAg.AJjhiIIMXy3AJjybfe89hP","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgyhM7CY8wTV8OzMNZZ4AaABAg.AJjhiIIMXy3ASzOsXcmpjc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwawnP5EIWKf4UWqix4AaABAg.AJjGN4JagL7AJjUrvPoOKG","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwNf5APAjpntivXlBp4AaABAg.AJj712LPiRpAJj96S8EGBE","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwjrUGiGDA4Du883K94AaABAg.ATbhbKwG_6eAW1BIcvAc7-","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxLf4HbAsKsDRJLdB94AaABAg.AOWZZNSCnH3AQiSLG7A5A4","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzG2Ldks3AlbEYmNAF4AaABAg.AGrGxChWEt6AOWXz8DN1dy","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytr_Ugwt6TvfRF8doAAZB714AaABAg.AGKpWmNUSPDAGY7D3fJ3pw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
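A batch response like the one above is only usable if every row parses and every dimension takes an expected value. Below is a minimal validation sketch; the allowed-value sets are inferred from the samples on this page (the project's full codebook may define additional categories), and the function name `validate_batch` is illustrative, not part of any pipeline shown here.

```python
import json

# Allowed values per coding dimension, inferred from the rows above;
# the actual codebook may permit more categories.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "indifference", "resignation", "mixed", "disapproval"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and check each coded comment row."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {value!r}")
    return rows

# Example with a single (made-up) row:
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(validate_batch(raw)[0]["policy"])  # regulate
```

Rejecting a whole batch on one bad row is deliberate: a malformed value usually means the model drifted from the codebook, so the batch should be re-coded rather than silently dropped.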