Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
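The lookup above can be reproduced offline against the stored responses. A minimal sketch in Python, assuming the raw model outputs are saved as JSON arrays like the one at the bottom of this page; the `raw_responses/` directory and the helper name are hypothetical, not the tool's actual layout:

```python
import json
from pathlib import Path

def lookup_raw_coding(comment_id: str, responses_dir: Path) -> dict | None:
    """Scan saved raw LLM responses for the entry that codes `comment_id`.

    Assumes each *.json file holds one array of objects shaped like
    {"id": ..., "responsibility": ..., "reasoning": ..., "policy": ...,
     "emotion": ...} -- the storage layout is an assumption, not confirmed.
    """
    for path in responses_dir.glob("*.json"):
        for entry in json.loads(path.read_text()):
            if entry.get("id") == comment_id:
                return entry
    return None

# e.g. one of the full (untruncated) IDs from the raw response below:
print(lookup_raw_coding("ytc_Ugwu4rHzVpWCrLX-PDx4AaABAg", Path("raw_responses")))
```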
Random samples — click to inspect
- `ytc_UgwWiwo6Q…`: Yes, all intelligence is interacting with itself, within One Field of Consciousn…
- `ytc_UgxNe6rOn…`: Didn't they find the cause for AI hallucinations? I read something about it some…
- `rdc_cjoq0a7`: Werman and Woman are the old English terms for the genders. Overall i think man …
- `ytc_UgxMG1YSO…`: Reason why A.I is dangerous? The great example is... Humans "Us" made by god…
- `ytc_UgjXXsfNK…`: I believe that if a robot demands rights, it should get rights. Also, if a robot…
- `ytc_Ugx3nqq7q…`: Actually I'm kind of excited for this robot technology, because maybe future war…
- `ytc_Ugw60veiU…`: Wrong! We don't die! We go back the our pure state and have a complete understan…
- `ytc_UgwQwuXX7…`: This topic is so relevant! I've been diving into AICarma and it's helping me opt…
Comment
This is an old accident that was settled recently. I think the flaw is that supervised self-driving requires the same attentiveness as driving without it. You may have hands on the wheel, but your mind will wander.
And now Tesla is trying to put people in robotaxis, so what gives? How do you go from supervised FSD to unsupervised? Elon acts like it's already here, but I'd say it will need to be rated at autonomy Level 4 before it is considered safe enough, and I'm pretty sure that is not happening yet. What I see is Elon trying to rescue Tesla, because the car business is failing without a good backup plan, so he's overpromoting FSD. I do not agree that vision only is the way to go for a robotaxi that you are paying to drive you from point A to point B. There will be just enough problems to make headlines often, and people may get hurt or killed needlessly.
youtube · AI Harm Incident · 2025-10-30T22:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
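Each coded comment carries the same four dimensions shown above. The value sets below are only those visible in the raw response on this page; the full codebook may define more labels. A minimal validation sketch:

```python
# Allowed values per dimension, as observed in this page's raw response;
# the actual codebook may define additional labels (assumption).
CODEBOOK = {
    "responsibility": {"company", "government", "user", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate_coding(entry: dict) -> list[str]:
    """Return a list of problems with one coded entry; empty means valid."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = entry.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The coding result above passes:
result = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
assert validate_coding(result) == []
```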
Raw LLM Response
```json
[
{"id":"ytc_Ugwu4rHzVpWCrLX-PDx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7O0cKCI7CBztphUh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwBIz3LLF1v3ot_cx54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxLtdt4jgYA00Lpfr14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx7OWanwO7ttJ-l1wJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxKT6NMx2d-xvzoKiJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxKaEF8mBH_TNOXPFJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzyXIx-HILl2sh6ip4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvDSlxL2FHVLLctiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwmFboZijQ43Q3MEeJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
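Before codings like the table above are accepted, a raw response like this one can be parsed and checked. A minimal sketch, assuming the response is available as a string; the fence-stripping step is a guess at how some models wrap their JSON output, not a confirmed behavior of this pipeline:

```python
import json

def parse_llm_response(text: str) -> dict[str, dict]:
    """Parse a raw coding response into {comment_id: dimensions}."""
    body = text.strip()
    if body.startswith("```"):  # some models fence their JSON (assumption)
        body = body.strip("`")
        body = body.removeprefix("json").strip()
    entries = json.loads(body)
    return {e["id"]: {k: v for k, v in e.items() if k != "id"} for e in entries}

# Given raw_text holding the response shown above:
# parse_llm_response(raw_text)["ytc_UgyzyXIx-HILl2sh6ip4AaABAg"]
# -> {'responsibility': 'company', 'reasoning': 'consequentialist',
#     'policy': 'regulate', 'emotion': 'fear'}
```

Entries that fail `validate_coding` from the earlier sketch could then be flagged for re-coding rather than written to the results table.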