Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I just hate ai I litterally, at first (like the very first months) I thought”oh …
ytc_Ugx9rXdIy…
@kennethawesome She asked at the end "Would You trust a Waymo?" and I answered t…
ytr_UgxbApOya…
we would just have to ditch these companies and live like prehistoric era , cuz …
ytc_UgwQN1jWr…
Wow how easy it would be to put them in charge of a robot and then enslave human…
rdc_jp5keyh
This also explains why so many AI companies ‘donated’ to Trump’s campaigns, and …
ytc_Ugzt-4Lqd…
What do you think? Which jobs are safe from AI and which jobs are not? Can AI ta…
ytc_UgzvZu8SO…
Yeah what, it doesn't compute for me. He'd at the youngest started university at…
rdc_dv0llh3
This is what we are lacking in the dev space. People view AI too binary. It's no…
ytr_UgxpJXBgm…
Comment
I think this 99% unemployment thing is ridiculous. That 99% has something AI doesn't: the vote. So, if that 99% wants to be employed, they'll manage to do so because they can create parallel economies where humans employ each other. In fact, it will be simpler because there will be a greater availability of resources. The problem will arise if AI ends up destroying democracy; the concentration of power is the danger.
Other points to consider:
* Only killing a person will be a criminal offense, not destroying a robot/server.
* Charity is only practiced with humans.
* Religions have obligations that can only be fulfilled by humans.
I don't want to elaborate on the relationship between these and employment here to avoid making the post too long.
Platform: youtube
Topic: AI Governance
2025-11-25T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
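A coding result like the one tabulated above can be sanity-checked against the category values that actually occur in the raw responses on this page. A minimal sketch, assuming the value sets below (observed in this page's output, not necessarily the full codebook):

```python
# Value sets observed in the raw LLM responses on this page; the real
# codebook may define more categories than appear here.
OBSERVED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "resignation"},
}

def check_record(rec: dict) -> list:
    """Return (dimension, value) pairs whose value falls outside the
    observed vocabulary; an empty list means the record looks valid."""
    return [(dim, rec.get(dim)) for dim in OBSERVED
            if rec.get(dim) not in OBSERVED[dim]]

# The coding result shown in the table above:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(check_record(record))  # []
```

The helper name `check_record` is illustrative; the tool's own validation logic, if any, is not shown on this page.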
Raw LLM Response
[
{"id":"ytc_UgyYS6plwNwQ14U6aoB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyyIjmPmMuAw9vjkfF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzykSGMAMh-zmDt-Ex4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1w07PR1vc3QuIMgl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw4f_-nGLjD6d1NVJt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwbFbUX7NRzg1W4baR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyUVB1gIuquA_hU1KZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz7SWZ0JIkCYn6n8-h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzrNX20FtuPRfd7_h94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHfmoGZ3fQ00ZPXwl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
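The raw response above is a JSON array with one coding record per comment, which is what makes the page's look-up-by-comment-ID feature possible. A minimal sketch of parsing such an output and indexing it by ID (the helper name `index_codes` and the two-record sample are illustrative, though the field names and values are taken from the response above):

```python
import json

# Abbreviated sample of the model output format shown above.
raw = '''
[
  {"id": "ytc_UgyYS6plwNwQ14U6aoB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw4f_-nGLjD6d1NVJt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Parse the model output and index records by comment ID,
    skipping any record missing one of the four coding dimensions."""
    by_id = {}
    for rec in json.loads(payload):
        if all(dim in rec for dim in DIMENSIONS):
            by_id[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgyYS6plwNwQ14U6aoB4AaABAg"]["emotion"])  # indifference
```

Indexing by ID also makes it easy to spot records the model dropped or duplicated, by comparing the dictionary's keys against the batch of comment IDs that was sent.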