Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- “Lmao I can’t believe these guys actually thought about if ai needed rights. Of c…” (ytc_UgxBsqmy7…)
- “After I heard you said AI wrote that story I got curious and asked Chat GPT to t…” (ytc_Ugymuw2sg…)
- “They are working on AI taking over the human brain using existing technology lik…” (ytc_UgyNXyni4…)
- “Yeah i dpnt buy that for a fuckin second. Ive seen the AI, they're end-of-centur…” (ytc_UgyGmwIyt…)
- “The people who are sounding the alarm are the world's leading experts. Top AI sc…” (ytr_UgzrRiJFz…)
- “4:40 he says.."So, What does one do in such a world", Um how about NOT create AI…” (ytc_Ugwcy-Zv6…)
- “ChatGPT is just another tool in the arsenal of an intelligent person. Just like …” (ytr_UgzatduPT…)
- “@event__horizon yeah in 2026 its already gonna be voted on. I will bet on it. A…” (ytr_Ugxkc8jJb…)
Comment

> Automation means one of two things: 1) fewer jobs making/doing the same amount of stuff or 2) fewer jobs, but only until we come up with more kinds of stuff that people want. In the first case, the replaced workers should be given UBI. In the second, we have economic growth, GDP rising and we continue to place ever greater demands on earth resources while the earth is already at its limits under our current tech&resource paradigm. The first is what we really should want, but our life-drive for growth (and our Capuchin-like fairness instinct of same reward for same work...part of our life-drive) forces us to do the second until physical limits finally force us to stop.

Source: youtube · AI Jobs · 2025-05-31T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxSth_AlZpW5N0yc3t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8FfBIZzUAY2Db8Sp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8DEjbbCsvlkH42Lt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzDE3lzgYgBmdg1mxR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxWoVDvW0w2L1ATyox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwcOJbV8YpzQLfg4qh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz0hdU-nQvvS9wsdtd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4JP2iKJrm0u4GaZB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugwnjf1fGXGNXeRBNlt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgzwFAurcEwUYZWyVFp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
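A raw batch response like this can be parsed and sanity-checked before its codes land in the result table. The sketch below is a minimal illustration, not the project's actual pipeline: the function name `parse_raw_response` is hypothetical, and the allowed values in `CODEBOOK` are inferred only from the codes visible on this page; the real codebook may contain more categories.

```python
import json

# Allowed values per dimension, inferred from the codes observed above
# (assumption: the real codebook may define additional categories).
CODEBOOK = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"approval", "fear", "mixed", "resignation"},
}

def parse_raw_response(raw: str) -> list:
    """Parse a raw LLM batch response and keep only records whose
    codes are valid for every dimension in the codebook."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

if __name__ == "__main__":
    raw = ('[{"id":"ytc_example","responsibility":"none",'
           '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
    # The single record uses only codebook values, so it passes validation.
    print(parse_raw_response(raw))
```

Dropping (or flagging) out-of-codebook records at parse time keeps a single malformed model output from silently corrupting the coded dataset.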