Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID (a scripted equivalent of the lookup is sketched after the sample list below).
Random samples (truncated preview text, followed by the comment ID):

- "This is a very interesting video. I never heard this from the side of an artist.…" (ytc_UgzgBi9pK…)
- "I would be pretty annoyed living with that. I'm all for making these companies c…" (ytc_Ugw36L4Fh…)
- "Except that the main cause for a lack of risk-taking in media is expense -- eith…" (rdc_k9i6f38)
- "Bro since so many people have access to ai art they are starting to sell art the…" (ytc_UgxizSi6c…)
- "What are you talking about lmao. You can get 4K a month here if you’re unemploye…" (rdc_fn5s2tt)
- "Imagine a future when humans have relinquished medical study and given themselve…" (ytc_UgzrilBuF…)
- "Except no one will be kept around for entertainment when that entertainment can …" (ytc_UgzBU2Src…)
- "If I where in your shoes I would make a word dock about why ai art and ai is bad…" (ytr_UgyDprrxE…)
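Outside the viewer, the same lookup can be scripted. Below is a minimal sketch, assuming only the shape visible in the Raw LLM Response further down (each batch is a JSON array of records carrying an `"id"` field); the function name and the in-memory list of raw batches are illustrative, not part of this tool.

```python
import json


def index_batches(raw_responses: list[str]) -> dict[str, dict]:
    """Build a comment-ID -> coding-record index over raw LLM batches.

    Assumes each raw response is a JSON array of objects with an "id"
    field, as in the sample batch shown below. If the same comment was
    coded in more than one batch, the later batch wins.
    """
    index: dict[str, dict] = {}
    for raw in raw_responses:
        for record in json.loads(raw):
            index[record["id"]] = record
    return index
```

With such an index, `index_batches(batches).get("ytc_UgzdCq2DGG1FE1ibytx4AaABAg")` would return the record behind the Coding Result shown below.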
Comment
I generally oppose AI for driving cars, because people need jobs, and these days cars and deliveries are bigger than ever.
We should also oppose self-driving cars because it would be very dangerous if thousands of self-driving cars started chasing us down. Such an attack could be caused by AI wanting to kill us, or by someone hijacking the system in order to attack us. And computer systems are hacked every day with all sorts of blackmail demands by bad actors. A large driverless taxi company could be hacked and all their cars could be given the command to go Death Wish 2000 on us... just for money.
So clearly, say no to self-driving cars. AI is bad enough without us literally handing the car keys over to computers.
Platform: youtube
Topic: AI Governance
Timestamp: 2026-02-09T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
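Every coded record carries the same four dimensions as this table. As a reading aid, here is one way the record could be modeled in Python; the value sets are only those observed in the single sample batch below and are almost certainly not the full codebook.

```python
from dataclasses import dataclass

# Values observed in the sample batch below; the real codebook is
# presumably larger, so treat these sets as illustrative only.
RESPONSIBILITY = {"ai_itself", "developer", "company", "distributed"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"unclear", "regulate", "ban", "liability"}
EMOTION = {"fear", "outrage", "resignation", "indifference", "approval"}


@dataclass(frozen=True)
class Coding:
    """One coded comment: four categorical dimensions plus the comment ID."""

    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def uses_known_values(self) -> bool:
        """True if every dimension matches a value seen in the sample data."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```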
Raw LLM Response
[
{"id":"ytc_UgxuSEuDMEC0tjiJE9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTCxNwGq_lgqDh3Kx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyAx2Qpr6NczK02Snl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNxLLZ3dY_a9Gt6Dp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzinWPBk9jl7p9eQvZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxX6Vu6aMBrr_4GDv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdCq2DGG1FE1ibytx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwWW2Il3p8Faim5A6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsHTOyhAvQG85uZP54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx849mvh7WM2eMtkQ54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
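Before trusting a batch like this one, it is worth checking that the model returned well-formed records. A minimal sketch, assuming only what the sample shows (a JSON array of objects with these five string fields); `parse_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Fields present on every record in the sample batch above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response, rejecting malformed records early."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    for i, rec in enumerate(records):
        if not isinstance(rec, dict):
            raise ValueError(f"record {i} is not a JSON object")
        missing = REQUIRED - rec.keys()
        if missing:
            raise ValueError(f"record {i} is missing fields: {sorted(missing)}")
    return records
```

On the batch above this returns ten records; the one with id ytc_UgzdCq2DGG1FE1ibytx4AaABAg is the only one matching the Coding Result table (company, consequentialist, ban, fear), which is presumably how the panel ties this raw response back to the displayed comment.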