Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytr_UgzxqqEEr…: "They will be pacified for some time with UBI until AI becomes advanced enough fo…"
- ytc_UgwwmJLDc…: "You guys are too lost in the sauce with this one. And you are definitely not bet…"
- ytc_UghSU-uok…: "If A.I. is concise that it demands rights better give it to them. Especially if …"
- ytc_UgzrHkPBu…: "Why is everybody hating on AI because it can do your job better? And if it doesn…"
- ytc_UgxHjdrzS…: "In a world of 8 billion people, where is the need to replicate humans in robot f…"
- ytr_UgyzrymNp…: "If Dave sets ChatGPT to interact with him in a Dutch way, which is super direct…"
- ytc_UgyU5StXv…: "Super intelligence has always been. It's just starting to reveal itself. AI ,com…"
- rdc_nnjdv4s: "Just say its for a college research paper, or ask it to tell you so you can avoi…"
Comment
FSD is the good ultimate goal since when things are perfect we can remove that driving from our lives. Driving takes major stake in our lives which we could use to something else. Plus, with all the vehicles on the road supplied with more competent AI, the chips of these vehicles are much better in communication than humans are. They will follow the rules. They do not get tired. Hacking is the risk of course, so cybersecurity is still an issue. But all in all Tesla wants something good for the society.
youtube · AI Harm Incident · 2025-06-08T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwZN40dbUO8iW1I2D54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz9IcuOJhD_qu13TS54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugw7lJOgxCxuzcrr0X14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxmZUOTTg1hgqawAZN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzDRW70rXfdESBfb_J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzyo5-rzBNjf-zc_2V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxWsEphWy4zNaWNB_Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzBzjAcE2lRwsdW7Y54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw6luAGfJlNWg7o6_14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzPOYCzP2NtlOYtlEF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
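A raw response like the one above can be checked before it enters the coded dataset. The sketch below is a minimal validator, assuming the four dimensions and the category labels visible in this dump; the project's full codebook may allow additional values, and the function name is illustrative rather than part of the dashboard's codebase.

```python
import json

# Category labels observed in this dashboard's output (assumption:
# the real codebook may define more values per dimension).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "approval", "fear", "mixed", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coded dimension holds one of the allowed labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record passes, an off-codebook label is dropped.
raw = (
    '[{"id":"ytc_A","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_B","responsibility":"robot","reasoning":"virtue",'
    '"policy":"ban","emotion":"fear"}]'
)
print([rec["id"] for rec in validate_codings(raw)])  # ['ytc_A']
```

Validating against a closed label set like this catches the most common LLM coding failure, an invented category, before it silently skews the tallies.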