Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Your accident example is wrong: an object falling from a truck will have a speed and direction similar to the truck it fell from, so you have enough time to slow the car down. And generally, why is everybody asking how a self-driving car will decide who to kill? How often do these kinds of accidents actually happen? Once a week? Once a month? In the entire world?
A self-driving car will have a huge advantage over a human driver: it checks the situation on the road 100 or 1,000 times per second, it knows the position of all the cars around you, it does not get distracted, sleepy, angry, or drunk, and it does not send text messages while driving. It will react instantly, compared to a driver who needs at least half a second (a car can travel 3-4 car lengths in that time).
Platform: youtube
Topic: AI Harm Incident
Posted: 2015-12-08T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UggmIyJ8SloWNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghTisOhXvg2MXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggJ7uf4xwzHrHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughd4nDqmE0otngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghTs3eIZEp4CXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiT1_uxg4Qf93gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgiiYSCGtUOQQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ughs2ea7-kE5XHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugj_gIAyUkWWl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggO5i8Su4Fd-HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
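The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated before loading into a dataset (the allowed value sets below are inferred from this one sample, not the tool's actual codebook, and `parse_codes` is a hypothetical helper):

```python
import json

# Allowed values per coding dimension -- inferred from the sample response
# above; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if not isinstance(entry, dict) or "id" not in entry:
            continue  # skip entries that lack a comment ID
        # Keep the entry only if every dimension has an allowed value.
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(parse_codes(raw))
```

Validating against an explicit value set like this catches the common failure mode where the model invents an off-schema label, rather than silently passing it downstream.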