Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.
- "The probability is that automation _is_ going to be an increasing part of our fu…" (ytc_UgiT0tRlB…)
- "Bi-Partisan Adaptation of Proposals to Address AI's Impact on the Working Class …" (ytc_UgxgiXKtl…)
- "All this content spoils young kids as they fear whatever they do gets automated.…" (ytc_Ugww9TxX5…)
- "you let AI train on the cesspit of twitter and reddit and you get antisemic raci…" (ytc_Ugwou43ZE…)
- "I’ve said this for a year now that AI should be nationalized. If it’s the whole …" (ytc_UgxT1-FJn…)
- "his frends are who making AI and he also make AI project more dangerous human th…" (ytc_UgwtU9PH1…)
- "Simple just dont use AI.get off the system. If we all collectively as a group st…" (ytc_UgwKiAIWH…)
- "Its fake, yhe robot with the gun had no recoil. And if you watch closer at the f…" (ytc_UgyGU512l…)
Comment

> Vid: "Suppose you're in a self driving car that is following too closely behind a truck."
> Me: "Well there's your problem."
> But as to the ethic question I'd say there should be a mandate to save as many lives as possible even though it doesn't always work out for the current vehicle. Ethics is determined by motivation whether it's premeditated or not. This would mean targeting the motorcycle with a helmet unfortunately. If that causes people to wear less helmets that would need to be addressed separately.

Source: youtube · AI Harm Incident · 2022-09-07T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxWEUMaUEDYyGIjoQ94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxdK8TU5Di3ES4ElLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxAmzlCFmrZmxMHRI94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQ3BRizlPG8DSnvyx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzefO_F95ohs0NlSdp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"disapproval"},
{"id":"ytc_UgzG4obmJAZGFm4HiGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwFHp8SUlQu3Uj7rex4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwB5QsAllfNxW11pJN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxM8a7Nd1Qit5aSspR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxhzhse5PJvVG9QIF54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
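For downstream analysis, a batch response in this shape can be parsed and indexed by comment ID, so the coded dimensions for any one comment can be pulled out directly. A minimal sketch, assuming the JSON structure shown above; the `index_codings` helper and the short sample IDs are hypothetical, not part of the tool:

```python
import json

# Hypothetical two-row batch response with the same fields as above;
# the IDs here are placeholders for illustration only.
raw = '''
[
 {"id": "ytc_A", "responsibility": "company", "reasoning": "consequentialist",
  "policy": "regulate", "emotion": "approval"},
 {"id": "ytc_B", "responsibility": "unclear", "reasoning": "unclear",
  "policy": "unclear", "emotion": "indifference"}
]
'''

def index_codings(raw_response: str) -> dict:
    """Map comment ID -> coded dimensions from a raw batch response."""
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in json.loads(raw_response)}

codings = index_codings(raw)
print(codings["ytc_A"]["policy"])  # -> regulate
```

Looking up a comment then becomes a single dictionary access, which is what a "look up by comment ID" view needs.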