Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- So the A.I. killed a human, in self defense, and tried to get away with it, to … (ytc_UgyFtIOUQ…)
- Imagine after taking all the data and possible upgrades AI reaches a normal adul… (ytc_Ugxt5CXv5…)
- Hath not a robot photoreceptors? hath not a robot graspers, processors, dimensio… (ytc_UgiaWn-BM…)
- From a logical stand point if robots/A.I understood that they benefit from us (m… (ytc_Ugx0n1V_4…)
- An industry AI won't be able to replace is Arts & Crafts. The "art" to art is im… (ytc_UgxfLzpyS…)
- with AI futurist tech à baby of 7 years can get PHD Harvard university it is s… (ytc_UgxdY0rSz…)
- DJI is a chinese company and China is also investing heavily in AI. In a drone w… (ytc_Ugyzw5bwZ…)
- Feels like two separate risks are being conflated here. The first risk is econo… (ytc_Ugyde5Umk…)
Comment
What if?? Laws were put in place: for companies to require x times/percent the employees as its ai usage. And if not that. Companies would be rquired to pay into the UBI program a percentage equivalent to how much of its "staff" is ai. So if your production is 80% ai and 20% human, you're paying 80% of your profit into the UBI fund.
While neither is realistic—if they were done right I think the first option would be best for everyone and the 2nd option could probably steer companies away from going completely automated.
youtube · AI Jobs · 2026-02-27T12:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
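Each coded record carries the same four dimensions shown in the table. A minimal sketch of how a record could be validated against a closed label set; the allowed values here are only those inferred from labels visible on this page, so the real codebook may contain more:

```python
# Allowed values per dimension, inferred from labels seen in the raw
# responses on this page (an assumption, not the authoritative codebook).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    if not record.get("id", "").startswith("ytc_"):
        problems.append(f"unexpected id: {record.get('id')!r}")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems
```

A check like this catches labels the model invents outside the prompt's label set before they reach the results table.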
Raw LLM Response
[
{"id":"ytc_Ugw2Xbxr8lgxOpKUihx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwps3OEOFpfhK5BjE14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzOHCNel_NxM2ALbDJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxCnb4al7tiIz9WToF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwReiqbSeE4gwfin2x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxCNFKVi6rDDGa_rg54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwuObRp62K_Zr-zf8p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxpC2QsGWLDm3vksDZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyPPcMPxp01XQ2eyL94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwOAd_ReoOZFsWzvxZ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"fear"}
]