Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The first serious debates about whether or not killing a robot is murder, in my …" (ytc_Ugz2aGP7D…)
- "Ai doesn’t make art. Ai isn’t smart and it doesn’t think like a human. It just d…" (ytc_UgwMnLyCF…)
- "The Waymo visualization is incredible. It shows a lifelike representation of the…" (ytc_UgxRdXJI3…)
- "I think AI needs to be regulated around the world. There's no laws regarding AI …" (ytc_Ugy3L1Oyc…)
- "Minority report showed us this was going to happen. AI doesn’t discriminate but …" (ytc_UgwMW4b80…)
- "The only thing that happens is masses of people are expected to do nothing profe…" (ytc_UgzkeJPnC…)
- "It’s pretty straightforward and I’ve been saying this for years. If a job requir…" (ytc_Ugxp1Uwc2…)
- "Revenge p*rn is already a huge issue, so the implications of this are terrifying…" (ytc_UgwtKGDZ3…)
Comment
Most of these discussions on how AI will 'take over' never go through the step-by-step process of how this would happen. Are they saying that at some point someone will decide that they want to take orders from AI and will obey it when it orders them to annihilate their fellow humans and destroy ecosystems to create data centres? It seems more likely to me that there will be a select group who decide that they want to cull the population to ensure the smaller group has a higher and long-term sustainable standard of living, using AI to achieve this goal, which, although as primal as the impulses of tribes of fighting apes, is horrific to 21st century sensibilities.
Another point worth considering is that since technology has developed we have placed human control points within the process. In many fields, the number of human control points has increased rather than decreased. There are many identifiable endeavours that would benefit from the insertion of these additional human control points, but this is simply not possible because of economics and the particular form of automation or standardisation which is the prevailing orthodoxy, perhaps this is the window of opportunity and we adopt AI as and when it appears successful, organically.
All in all it seems that technology is all too often being treated as a religion which is either ascribed or not ascribed to. Empiricism needs to be applied to specific scenarios so that we are not relying on helicopter views by people who, whilst highly intelligent, are experts in a limited field, a field which does not automatically translate into the specificities of myriad other fields.
youtube · AI Jobs · 2026-02-18T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
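The coding result above assigns one value per dimension. As a minimal sketch, a coded record can be checked against the value sets that actually appear in this sample (the allowed sets below are inferred from the responses shown here, not an exhaustive codebook; `validate` is a hypothetical helper, not part of the tool):

```python
# Value sets observed in this sample only; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "ai_itself"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "mixed", "fear", "outrage"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the observed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]
```

A record that matches the table above (`distributed` / `consequentialist` / `none` / `mixed`) validates cleanly; an unseen value such as `"government"` would be flagged.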
Raw LLM Response
```json
[
  {"id":"ytc_UgwhCCvdq6JMV0ogiq94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGG4uJVF7QEeWmiUd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwYTr0BGgvhUu2D0m54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNsU1i4npQwI2OXRd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyP1Shl0FobZD06wQB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxGUrWPsPp7VZbxFgB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyMK5wBGO2HLh2fHHJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzw-2_r86V41jdQoEx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxonJCc9XrH6VCunPB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzKZFXFPzAbj8Y6itx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```