Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI is the next milestone for science. It might suck now but in the near future i…" (ytc_UgzwoVZe4…)
- "Ok here's the twist, how do you think ai got so good at art? Because people had …" (ytc_UgwPKUeft…)
- "Look, if you hand over your power to anyone or anything that doesn't have skin i…" (ytc_UgzFTeyHG…)
- "I actually find her talk to be misleading. For example she doesn't compare the t…" (ytr_Ugw4nBsUJ…)
- "How are atonomous driver and equipment checks done???? Does autonomous know wh…" (ytc_UgzAfNrFO…)
- "Just want to say that I doubt AI picture generation will "collapse" from not get…" (ytc_UgzJtTDS8…)
- "Did you not see the paper that just came our that showed the exact opposite? LLM…" (rdc_n7hgbua)
- "So AI is becoming more like humans in real life.....big suprise! We program them…" (ytc_UgxgmMdon…)
Comment
It seems to me that to be a mostly vision-based autonomous system, you'd need REALLY good cameras. Ones with insane dynamic range and resolution. After all, the same camera has to be able to look straight into a late afternoon sun and also be able to process a poorly lit country road on a moonless night. I think Elon's comments about lidar sound to me like a CEO trying to justify a cost-cutting business decision.
youtube
2023-06-29T18:3…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxI0jJJ6v_j1fn1gGt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgwD5YwE0ZKelEaagpx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugyj3tZ89vOXbik8o6t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgzLA3d4ziEIOQ62fhh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgwxRYGBaByHrM7XoK54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugw6TrgIkg8WdOq2qJ14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},{"id":"ytc_UgxsQ_60kDcwwIyt0iF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},{"id":"ytc_UgzjfJxDxVPiewGKXx14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugzs6BiV15-Eh0lwMv54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugw3v970Tfbnh6lLRjl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
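The coding table above is produced by matching a comment's ID against the model's raw batch response; when the ID is missing (or the JSON is malformed), every dimension falls back to "unclear". The actual pipeline is not shown here, so the following is only a minimal sketch of that lookup, assuming the response is a JSON array of records keyed by `id` (the `lookup` helper and the shortened sample IDs are hypothetical):

```python
import json

# Trimmed sample of a raw batch response; "ytc_AAA" is a hypothetical shortened ID.
raw = ('[{"id":"ytc_AAA","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw_response: str, comment_id: str) -> dict:
    """Return the four coded dimensions for one comment ID.

    Falls back to "unclear" on every dimension when the ID is absent
    from the response or the response is not valid JSON, mirroring the
    all-"unclear" rows in the Coding Result table above.
    """
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        records = []  # malformed model output codes as unclear
    by_id = {r.get("id"): r for r in records if isinstance(r, dict)}
    record = by_id.get(comment_id, {})
    return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}

print(lookup(raw, "ytc_AAA"))  # ID present: all four dimensions coded
print(lookup(raw, "ytc_BBB"))  # ID absent: every dimension "unclear"
```

Note that a single truncated or unbalanced bracket in the model's output makes the whole batch unparseable, which is one way an entire batch of comments can end up coded "unclear".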