Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
This video is embarrassing. AI is writing the next AI which will be writing the next AI, etc. so AGI (AI that can match humans) should be real within a decade or less. Add in that the whole value proposition of AI is in giving it increasing autonomy and that companies believe if they don't maximize AI usage and cut jobs with it, they will go out of business in the near future, and yes of course you are going to see massive white-collar job loss and economic/societal disruption unless governments intervene at some level in the coming decade. This video bizarrely treats AI like it's going to remain prompt-based stuff like we've seen so far from ChatGPT forever.
youtube · AI Jobs · 2026-02-23T21:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyVVn3FJnfKcD6beUN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugya76w3JH8mT1bmX214AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxskpTOHDpF9B_OyyB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVzI-Lnh0J0GxLvPh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw30VorVvQmX3SvbFl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyH9WplSAmexoqIfvx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGW2SXTMJGe7zu_xB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyESh0EAWPTCuPcJmt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgynQaRMhKBwOK3Kiml4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9p4H09ZZs-k1Vwp14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"}
]
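The raw response above is a JSON array of per-comment coding records. As a minimal sketch (assuming Python; the field names come from the response shown, and the response string here is truncated to two of the entries above for brevity), the records can be indexed by comment ID for lookup:

```python
import json

# Raw model output, abbreviated to two entries from the response above.
raw = '''[
{"id":"ytc_UgwVzI-Lnh0J0GxLvPh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw30VorVvQmX3SvbFl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

# Index each coding record by its comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytc_UgwVzI-Lnh0J0GxLvPh4AaABAg"]
print(record["emotion"])  # fear
```

This mirrors what the page itself appears to do when resolving a coded comment to the batch response that produced it; the lookup key and dict-based index are illustrative, not necessarily the tool's actual implementation.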