Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
There is a problem in the whole premise. While its possible that most jobs can be automated and most humans made jobless but then question is who is buying and who is selling?
Who is the AI working for as there will be no economy. You can produce cheap and efficient goods and services but if i dont have any any money i cannot buy.
Second, with this scale of unemployment, social structure will crumble, governments will topple and we will have war like situation. You cannot have over 90 percent of humanity poor & jobless and still thrive.
Platform: youtube · Topic: AI Governance · Posted: 2025-09-22T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzzTlO0YW0irKjnbQl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzWVpZgnYIb94_JAdx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzUubDtKZFqNOAkMhZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxETznNaZ-84NRqTWN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwRoTUiQb_0ropdmcB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy6QdyXvA4RmtgOkP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgydGpNchjKK8E8vxbB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzlcGhmO6AAasoj7ZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwJVLDysfZGWM2s5Qd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxTZD8_awoqptl4zc14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
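The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). As a minimal sketch, this is how such a response might be parsed and validated before loading it into a results table. The allowed-value sets are inferred only from the values visible in this sample; the real coding scheme likely defines more, and `parse_codes` itself is illustrative, not the tool's actual implementation:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the full coding scheme may include values not present here.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "resignation", "mixed", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records.

    A record is kept only if it has an "id" and every coding dimension
    holds a value from the known scheme; anything else is dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropping (rather than repairing) malformed records keeps the coded dataset trustworthy: a comment whose record fails validation can simply be re-sent to the model.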