Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Most people don't understand computing power doubles every 18 months. That means in 5 years, AI will be 8x as powerful as today. In 10 years, AI will be approximately 128x as powerful as today. This means it could very well perform better than humans, and faster, and cheaper, and the Laws of Capitalism mean owners will replace as many humans as possible with AI. The problem being that unemployed humans do not buy goods or services, so the house of cards of Capitalism will collapse. This worries me.
youtube · AI Harm Incident · 2024-05-31T14:4… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxNhQfo1NtiDl-LhGR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyarB4AxK6J1m1hQvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyqAiCbBshwzt_aDDF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyrrEagJSdnFd5Jr1t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyH-IOV7I31HRjagq54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugws2Wzsk9uqaybumSl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"unclear"},
  {"id":"ytc_Ugw2pOJVIMBGqBECRgl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"unclear"},
  {"id":"ytc_UgzmLoA6q-9svCTlOS14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwUB8ac8wAB7NyiW_V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyPyaqX4xHjPSVH4Bd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
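Since the model returns one JSON array per batch, looking up a single comment's coding means parsing the array and indexing by `id`. The sketch below shows one way to do that, assuming the raw response is available as a string; the `parse_codings` helper and the validation rule (every record must carry all four coding dimensions plus an `id`) are illustrative assumptions, not part of the pipeline shown above.

```python
import json

# A truncated raw batch response (two records from the example above).
raw_response = '''[
{"id":"ytc_UgxNhQfo1NtiDl-LhGR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyarB4AxK6J1m1hQvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# Fields every coded record is expected to carry (assumed schema).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: coding}, skipping malformed records."""
    records = json.loads(raw)
    codings = {}
    for rec in records:
        # Keep only records that are objects and contain the full schema.
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys():
            codings[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_UgxNhQfo1NtiDl-LhGR4AaABAg"]["emotion"])  # fear
```

Indexing by `id` also makes it easy to spot comments the model silently dropped from a batch: any submitted ID missing from the parsed dictionary was never coded.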