Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@abdulgill5013 A couple years back. When Waymo 1st started full autonomous drivi…
ytr_Ugx5rbt1m…
I love the hand poster so much, having it be inspired by cave paintings really h…
ytc_UgzuthWPX…
It is totally possible Today. Online shopping with delivery is just few endpoint…
ytc_Ugyvwe3fY…
Another reason not to trust self driving vehicles unless I know where the emerge…
ytc_UgxWRWIVM…
AI will do to the upper middle class what the middle class voters did to the wor…
ytc_Ugy5cd-6N…
There is no way I’m the only one that thinks you have to be beyond stupid to fal…
ytc_UgzWEFtxg…
The fact that Geoffrey Hinton regrets his work on AI is simply not accurate jour…
ytc_UgwEu_CBp…
LLMs are a dead end. They cost a fortune to run making them unsustainable given …
ytc_UgwZlT8gf…
Comment
The concept of a well-defined algorithm being a "targeting mechanism" is definitely an interesting one. But I would rather have an algorithm than silly humans under-reacting. While these discussions are very helpful, we need to understand that the rubber is hitting the road here - what is BETTER: an outcome that can be planned, or an outcome that leaves an emotional, unpredictable human in charge. Seeing the way most people drive these days, I for one welcome our new robotic overlords!
youtube
AI Harm Incident
2015-12-19T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugi_GtOLGK5NvngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UginTf9w9kAfAngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggPMgtotN-UAHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgggUF_2Qb4HPHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UggYd0QeaEjoT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjFNMbehN7Mp3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiV6VqZ6rqWDngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgisBoqYfIeKMHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiL9TsF2gC4CngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiMNk6vckjqtHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
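A batch response like the one above is easy to sanity-check before the codes are stored. The sketch below parses the raw JSON and flags any record whose value falls outside the vocabulary. Note the allowed values here are only those visible in this sample output; the actual codebook may define additional categories, and `validate_batch` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Vocabulary inferred from the sample output above (assumption: the
# real codebook may include values not seen in this batch).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"approval", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and return out-of-vocabulary codes."""
    records = json.loads(raw)
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(
                    {"id": rec.get("id"), "dimension": dim, "value": value}
                )
    return problems

# Hypothetical record with an emotion not in the inferred vocabulary:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"surprise"}]')
print(validate_batch(raw))
```

Running this on the example prints one problem entry for the unexpected `"surprise"` emotion; a clean batch returns an empty list, which is the precondition for writing the codes into the results table.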