Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgzCn4ked… — "People shouldn't be allowed to make music with two turntables if they don't know…"
- ytc_UgyWh5E_c… — "Being polite doesn't cost anything and it makes the world just a little kinder, …"
- ytc_Ugxg-upCa… — "There must be a way to set A.I to only give ideas what to do and not have direct…"
- ytc_UgxYFlNiM… — "The late sci-fi writer Iain M Banks imagined in great detail a civilization call…"
- ytc_Ugx0ouM1D… — "Lots of accounts say electronics are their fav toy. There are prophecies that st…"
- ytc_UgxEErBX-… — "2:45 This is quite misleading because every creation of technology has improved.…"
- ytc_UgzjL4fSe… — "My job is safe at least for a while. I work in finance, resolving escalated comp…"
- ytc_UgwqEXxAG… — "so, if there's a decently high chance that ai kills all of humanity.... whyyyy a…"
Comment
Think about the quarry in rural America. They do pretty well for themselves, but like any business there are tight margins. Assuming AGI and Super-Intelligence exists in the next decade, it would likely be two decades after that before all of the quarries could even afford to be fully autonomous - assuming widespread adoption.
It is simply unrealistic to assume there will be 99% unemployment in the next 30 years. I think people often forget technology costs money to implement. Of course there would be money to be made, but it will not happen overnight.
We must remember that since 1950, fusion energy has been 20 years away. Similarly, we have never been closer to cutting-edge technology than we are at present, but there is no guarantee, and even if it does happen it may not be the dire picture that experts paint.
youtube · AI Governance · 2025-09-30T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyvfA_-RceLedsi6SN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxN3ZWFXE3j4OXZI5N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwOR6JzPo8L8iagTFt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8PQSrfN2w5lt9erZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEwGiLTsOohlAoiM54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHxlGSRZpvD_27b954AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJYIGJR2X1v1f-bct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyNq_fTunnPxChzdIh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxb5zhlcFQpUjbB_5l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw1NhTz6sXe-M9MA2F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
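A raw batch like the one above has to be parsed and validated before the per-comment codes can be stored. The sketch below shows one minimal way to do that in Python: it checks each record against the four coding dimensions and indexes the results by comment ID. The `ALLOWED` value sets are only the codes observed in this sample batch, not the project's full codebook, and the `parse_batch` helper is a hypothetical name, not part of any tool shown here.

```python
import json

# Code values observed in this sample batch. The real codebook may define
# additional categories; treat these sets as illustrative, not exhaustive.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response and index coded records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the observed code sets, so malformed model output fails loudly.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with one record from the batch above:
raw = '''[
 {"id": "ytc_UgyvfA_-RceLedsi6SN4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''
batch = parse_batch(raw)
print(batch["ytc_UgyvfA_-RceLedsi6SN4AaABAg"]["emotion"])  # indifference
```

Failing fast on unknown code values is deliberate: it catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently pollute downstream counts.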