Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or click one of the random samples below to inspect it.
- @Goprof150 If i order doordash can i call myself a cook? Does that make me a che… (ytr_Ugz0IolsT…)
- That's a really insightful point! Sophia's response touches on that idea—while k… (ytr_UgzgmKTIz…)
- LLMs can’t learn. New knowledge, train a new model. LLMs are redundant soon afte… (ytc_UgzFdjlJs…)
- Hey AI. Please develop a therapy and corresponding manufacturing processes for t… (ytc_UgxRvtZJm…)
- I met with a woman named Kavita singh in April 2022 in a party. Within a month'… (ytc_UgwDUXYhS…)
- In my opinion, the fault is with the driver as they weren't paying attention to … (ytc_UgwSEpzix…)
- Why do we have to have this? I saw one video that AI wouldn't read Genesis 2. … (ytc_Ugz79ReUA…)
- I think its funny how they seem to be hiding in replys here and there instead of… (ytc_UgyUbgORz…)
Comment
Full FSD isn’t actually a thing yet. Waymo is so far ahead of Tesla, Tesla will probably never catch up. Both companies are using our public roads as test labs to try to get the tech to work. Reportedly Musk only want camera’s because its far cheaper. The prob is that driving requires more than just sight and the level of processor musk wants to use isn’t capable of handling the job. For what he wants to do he would need a processor akin to what an F-35 uses OR have a back up on a wirelessly connected central server. Oh, yeah earlier this week Tesla said it may open a “monitoring center staffed by humans to provide emergency assistance for FSD vehicles”. That’s for robo taxi support. 1000 serious or fatal crashes is a respectable N for a baseline study and the data screams the Tesla autopilot has serious design flaws. Finally WSJ does real journalism.
youtube · AI Harm Incident · 2024-12-15T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
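
Each coded comment reduces to one label per dimension. As a minimal sketch of how such a record could be represented and checked downstream, assuming the label sets are simply the values visible in this batch, a small dataclass with a validator might look like the following (the `CodingResult` class and `ALLOWED` mapping are illustrative names, not part of the project's code):

```python
from dataclasses import dataclass

# Illustrative label sets, inferred only from the values seen in this batch.
ALLOWED = {
    "responsibility": {"company", "government", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "mixed", "indifference", "approval"},
}

@dataclass
class CodingResult:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Raise if any dimension falls outside the assumed label set.
        for dim, allowed in ALLOWED.items():
            value = getattr(self, dim)
            if value not in allowed:
                raise ValueError(f"{self.id}: unexpected {dim} label {value!r}")

# Usage, with the values from the first record in the batch below.
row = CodingResult(
    id="ytc_UgyXM95yDqdxmY9oK-94AaABAg",
    responsibility="company",
    reasoning="deontological",
    policy="liability",
    emotion="outrage",
)
row.validate()  # passes; an off-codebook label would raise ValueError
```

Validating against a closed label set is what catches coding drift, for instance the model emitting a policy label that was never in the codebook, before those rows reach analysis.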
Raw LLM Response
[
{"id":"ytc_UgyXM95yDqdxmY9oK-94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyRQzE2sS3wFSmDY2B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzcF9nYUR2tlQ6jR9p4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzf0NqEzcVVpmXu0yl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwL0GAp4oz8OlwyKWd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwHCYZkjfQ2oY8dE354AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx6XsqH5tl1KujMBWN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw8NuBFZG6T5bRNTjF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyemoJI3SVCqZA97VB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy7FakZ4fx-BUUQAxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
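
The raw response is a JSON array with one object per comment, keyed by the same comment IDs used for lookup above. A minimal sketch of parsing such a batch and pulling one comment's coding out by ID, assuming the response text is available as a string (`coding_by_id` and `raw_response` are illustrative names; the two inline records are copied from the batch above):

```python
import json

def coding_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index the codings by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

# Two records copied from the batch above, standing in for the full response text.
raw_response = """[
  {"id":"ytc_UgyXM95yDqdxmY9oK-94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy7FakZ4fx-BUUQAxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]"""

index = coding_by_id(raw_response)
coded = index["ytc_UgyXM95yDqdxmY9oK-94AaABAg"]
print(coded["responsibility"], coded["policy"], coded["emotion"])  # company liability outrage
```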