Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_mbv3hus` — > AIs solution to most problems will be less humans. / That depends entirely o…
- `ytr_UgxJzoqS0…` — @derekfume8810 Why would megacorpos want people to be anti-AI? For them, AI is …
- `ytc_UgybZ_LYF…` — Tbh the people who explicitly state they use AI I don’t have as much of an issue…
- `ytc_UgzXmY8JP…` — What is the Ki-Hoyer Synapse? Unlike conventional, purely software-based neural…
- `ytc_UgyUjpzWR…` — The customers will be the other AI owners, it is possible to create such an econ…
- `ytc_UgzDERWCx…` — The programming is defective. Who would think to put a 'self driving car on the …
- `ytc_UgwlHNE93…` — Before it was let AI do the work so humans do art, now it humans to the work and…
- `ytr_UgxSstzkj…` — @ probably most or more than any that would come out of any propaganda school…
Comment (youtube · AI Harm Incident · 2022-09-05T05:5… · ♥ 3)

> had to write an essay on the same topic during autonomous driving class last semester; conclusion was that waymo's full sensor fusion approach with lidar, radar, cameras and other sensors is the most realistic, future proof solution while tesla has to make near impossible inferences, too often with no negative ramifications, resulting in a cost over life calculation in favor of cost
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwBpAMugJFs56h7T5F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy8nG_qlMITcCzVnNN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyBlFeeAEBseE5VbuB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzRpRwK9qSerMyCerh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyR26KVaZ16mbw1Zh14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYy20RnDHPVC2cLKd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgynoJQ1rDqEg7CfyrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxGFRO40uyLs743S5l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwnbgRmoLULVYiPPPh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxUlvxor9_DHJI4YHR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
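A minimal sketch of how a raw response like the one above could be parsed and indexed for the "look up by comment ID" use case. The dimension names and their values are taken from the coding table and JSON shown here; the assumption that each set of values is closed, and the function name `index_codings`, are my own and may not match the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the coding-result
# table and raw responses above (assumed to be closed sets).
DIMENSIONS = {
    "responsibility": {"company", "developer", "user", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return a lookup table keyed by comment ID, dropping rows whose
    values fall outside the expected category sets."""
    by_id = {}
    for row in json.loads(raw_response):
        cid = row.get("id")
        if not cid:
            continue  # a row without an ID cannot be looked up
        if all(row.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            by_id[cid] = {dim: row[dim] for dim in DIMENSIONS}
    return by_id
```

With the response above loaded into `raw`, `index_codings(raw)["ytc_UgwBpAMugJFs56h7T5F4AaABAg"]["policy"]` would return `"ban"`; rows with unexpected values are silently skipped rather than raising, which keeps one malformed row from discarding the whole batch.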