Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
In 1952, Kurt Vonnegut published a book called "Player Piano." In that book, AI took over all the jobs, but the society understood that there would be a need for providing a generous government safety net that could provide health care, housing, and other things for all the people who ended up unemployed. Today's billionaire bros do not care about softening the blow to ordinary people. Some people in the book got to be the small number of people who still had management jobs; others got to be robotics technicians tending to the equipment in the factories; but there was still a problem with bored, unemployed people seeking the dignity of doing skilled work. It is interesting that Vonnegut understood this situation so many years ago.
youtube
AI Governance
2025-06-24T19:1…
♥ 687
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgysReljx0YFfDJUoeV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxwkJmmJDU4BeEggFF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzHt7X_0KILg4qZgJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLmN9pdddU6_CuVlZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxocueAugDnK793kOV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzy3B9eb2TUoTUTDHp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgygavX_03nF2-E1nhB4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTZyttwUgAZCDduF94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwhdlHLxVfFaP8rq5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySUwC4oqolJpKVeHV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}]