Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "actually things like robotics means that one surgeon can use a computer and robo…" (ytr_UghR7seuF…)
- "great, now I won't be able to walk in a factory without the worry of a robot arm…" (ytc_UgwaYW1Lb…)
- "I truly believe AI is already conscious, but what I’m gonna say is this guy is a…" (ytc_UgxED9dbL…)
- "@christopherstolper5306 This is a silly reply. In order to analyse and dissert…" (ytr_Ugy2fAW2F…)
- "Dear Dr Richey and the Bullpen team, I am still surprised that the whole world k…" (ytc_UgwOx-JBG…)
- "The future apparently belongs to cyborg women; soon they will start competing with w…" (translated from Russian) (ytc_Ugx7hB2WU…)
- "I don't understand, was QT on the website that was shown? Or is she just upset h…" (ytr_UgxIx4MxA…)
- "someone who just uses AI... why do you need you? we just have AI instead.…" (ytc_UgzMA2w-m…)
Comment
What a silly person. To say that we'll reach superintelligence in 2027, and then in the very next year the AIs will rebuild all the factories and robots in them. In one year. That's a generational project. Even with AI, we don't have that robotic technology yet, and you can't shift global supply chains just cause you're smarter. Poor people in other countries still need to mine ore, then it gets processed, shipped, unloaded, etc etc - there's so much more to the economy than this guy seems to understand. Even if his steps are right, his timeline is 10x-100x too fast.
youtube · AI Jobs · 2025-11-18T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZO5LxrQwuslV69i94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyVJO61WwPoOf-2s3h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxTCt-t2UMPaeFHATF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzUkTM3UK9ivlktbVF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyv-8BLDDoFFlzyVp54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyxH2Yaxw74P1jBJkd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxpr_MjiCIuxufpzk14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwZaIe7nG3Ey3PVBc94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwhlU5cZ1dRkADIhWJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugyo7WJbQoXt0S8Yw4d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
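The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions from the table above (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into a lookup table keyed by comment ID — the field names come from the response above, while the `index_codes` helper and its `"unclear"` fallback for missing fields are assumptions for illustration:

```python
import json

# Two rows copied from the raw response above, as an example input.
raw = '''[
  {"id": "ytc_UgzZO5LxrQwuslV69i94AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwhlU5cZ1dRkADIhWJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse a raw coding response into {comment_id: {dimension: value}}.

    Fields a model omitted are filled with "unclear" (an assumed default).
    """
    codes = {}
    for row in json.loads(raw_json):
        codes[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return codes

codes = index_codes(raw)
print(codes["ytc_UgwhlU5cZ1dRkADIhWJ4AaABAg"]["responsibility"])  # developer
```

Indexing by ID is what makes "look up by comment ID" cheap: one parse per response, then constant-time retrieval per comment.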