Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Just made this point about AI "artists" by saying writing a prompt and asking a …" (ytc_Ugxx1FLq4…)
- "No - as of a few weeks ago it is now ruled that AI generated images don't hold c…" (ytr_UgzRGyntK…)
- "Happy to be in the custom automotive paint and body market. I repair and paint e…" (ytc_UgzJQm8_e…)
- "So here, The First 'Law of Life's Conservation', is the law, that morality under…" (ytc_UgzSpcUOU…)
- "I don't understand why artist hates AI, no I don't say we should replace the for…" (ytc_Ugzo0U-dE…)
- "Maybe we'll get lucky and the AI will like us. It'd keep us like an ant colony, …" (ytc_Ugzj0aCwx…)
- "A hundred years from now the A.I. will show these videos and pictures because th…" (ytc_UgzrcQ1UH…)
- "The male robot says he will take over the computer grid and takeover the world y…" (ytc_UgyYnO4w9…)
Comment

> Unmanned doesn't mean what most people think that nobody is controlling them it means that they have no individual controlling the device in person. The military well they used to be not smarter but had one key thing that is not to fully automate anything. There always a person who has finally authority to override. In fact it is the main reason why to launch nuclear weapons you need two or more people to agree and authorize the launch. To keep them employed. The best way to keep military members employed is to make sure that they are needed and the best way to do this when you can fully automated positions is to make sure that it's required a human as backup fully authorized to override the automated system.

youtube · AI Governance · 2023-07-08T19:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyAy3XGCJv98cwIcR14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzvFkk24Kd8ucyX4kN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwBFg1YW2OcWHoQCg54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYYdxVVmrxAJq0Rr54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyB5LYxCJZ87dAFJVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRCTIXQQFGvI1-PQF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxZ7pJvprrdkB4F1Od4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwzg9RR5D5BGWAaJmZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyXNK7xuCCMQRUtcrl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyCY2Z3vClcOIbSkfF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
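A batch response like the one above can be turned into a lookup table keyed by comment ID, which is what the "Look up by comment ID" view needs. The sketch below is illustrative only: `parse_codings` is a hypothetical helper, and the allowed category values are inferred from this one response (the actual codebook may define more).

```python
import json

# Category values inferred from the response shown above (assumption:
# the real codebook may allow additional values per dimension).
SCHEMA = {
    "responsibility": {"developer", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Rows with a missing id or an out-of-schema value are dropped,
    since LLM output is not guaranteed to follow the codebook.
    """
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            out[cid] = {dim: row[dim] for dim in SCHEMA}
    return out

raw = ('[{"id":"ytc_Ugwzg9RR5D5BGWAaJmZ4AaABAg",'
       '"responsibility":"government","reasoning":"contractualist",'
       '"policy":"liability","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugwzg9RR5D5BGWAaJmZ4AaABAg"]["policy"])  # prints: liability
```

Validating against a fixed schema at parse time is what makes the "Coded at" table above trustworthy: a malformed or hallucinated category never reaches the per-comment view.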