Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by ID, or browse the random samples below.
- "Humans ARE robots. Albeit biological ones generated through random optimization.…" (`ytc_UgiVs4F-1…`)
- "AI is already mimicking the lower level of human intellect but i suspect it wil…" (`ytc_UgwYhSLX6…`)
- "Recreational art using ai actually still harms people who create art for a livin…" (`ytr_UgzrMCaHi…`)
- "13:18 Bro, there's no lower cost of entry than an pencil and a paper. Like, you …" (`ytc_UgyJ0hA8A…`)
- "> the most corrupt soulless CEOs are the "victims" in every example > most examp…" (`ytc_UgwMa763r…`)
- "1. These programs do not allow people to express themselves. They are fundamenta…" (`ytr_UgyyKi2k0…`)
- "18:18 i'm 7 years into designing and illustrating my own card game which will pr…" (`ytc_UgzQ219v3…`)
- "How do you regulate software? Remember, the issue is what you do with AI not AI …" (`ytc_Ugy5lcauB…`)
Comment
In the 1970's when I started driving, truck drivers were the safest and most courteous drivers on the road. It has deteriorated a lot since then. As with any new technology, driverless vehicles are imperfect, and will cause a lot of accidents for the first 20 years of their existence, until the bugs are worked out. Just imagine how many people are going to lose their lives until this experimental period is over! And they're no doubt going to use the excuse that driverless trucks are safer than human drivers, but it won't be true for years, and accidents will increase before then because the truck drivers of today aren't as safe as the ones in the 70's.
youtube · AI Jobs · 2025-05-28T23:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzUPji0a3g2jkSPMKx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxFcU_91bILGDA6gdZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxRamGeBIg5BkS_VC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzXoypCT10OVGmbAfx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzXVV3hPI7UfcigJ_J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyZw1fuVm7BnNbhfJx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzgvLOfbDsY9KkDqvZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwtwTuaTGb1uIh7Hkt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyNYErb2ExZP6A66Ml4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugw2IEUIukWXry2TGlp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
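The raw response is a plain JSON array, one object per coded comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before populating the coding table; the allowed label sets below are inferred only from the values visible on this page, not from the full codebook, so treat them as an assumption:

```python
import json

# Controlled vocabularies inferred from the values observed above
# (assumption: the real codebook may define additional labels).
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    lookup table keyed by comment ID, rejecting malformed or off-vocabulary rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        # IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {cid!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzXoypCT10OVGmbAfx4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgzXoypCT10OVGmbAfx4AaABAg"]["policy"])  # liability
```

Validating against a fixed vocabulary at parse time catches the most common LLM coding failure (an invented or misspelled label) before it silently enters the dataset.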