Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "If AI in games development is unregulated, then that is the where the AI wedge w…" (ytc_Ugw8wbAAR…)
- "Ai art invokes the same emotion as you would feel looking at cheap hotel art.…" (ytc_UgxwQeXpF…)
- "Plus neural networks don't work like this. You don't program them to *do* specif…" (ytr_UgzxZhqbk…)
- "5:58 oh wow, Krystal making points about AI that Elon also made in their intervi…" (ytc_UgzGKypew…)
- "Had to look up reference image for a revolving shotgun and Looks so much better …" (ytc_UgzxojKOq…)
- "Was your part of the interview AI generated, Stephen? I wonder as some of your f…" (ytc_Ugw9M1uYr…)
- "i think its more along the lines that the US military doesn't feel the current s…" (rdc_ic2t46v)
- "ngl I'd only use AI shart if it's perfected. Yeah maybe now all it can do is mak…" (ytc_UgwcjUAS8…)
Comment
Andrea's comparison of AI to nuclear weapons is missing one key difference; it took the US government years and a significant amount of manufacturing output to develop the first two nukes, and they compartmentalised everything so that no one person had complete knowledge of how they were built. With AI it's not the government developing it in secret, it's huge corporations locked in a public race to be first - and there are no safeguards. Oppenheimer had the option of shutting the Manhattan Project down if the risk of atmospheric ignition was deemed too great. But these AI companies are ploughing ahead regardless of the warnings that many of their own staff and even some of their founders are now voicing. There's no putting this genie back in the bottle.
Source: youtube · AI Jobs · posted 2026-02-17T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzTZo4yTPRCyk5SHf94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFjpknWHdSEuAGD4V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwt2tYbuRRE96K9TkB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_5X-HWjWmqtt6JS14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6pyy3RCtkPlutvHR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwdpizzBoaI0qsA6Zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyr8ZOsDir9YBY1RDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz--JZr0kYkDOTJ4kN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgybiY0pLiRJnnbTWBx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVAulQj8fFiTwH5GB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
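A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, assuming the label sets are limited to the values visible in this response (the actual codebook may define more labels, and `validate_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Labels observed in the raw response above; the full codebook
# may allow additional values (assumption).
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself"},
    "reasoning": {"consequentialist", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows with an id
    and a known label on every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# One well-formed row from the response above passes validation.
raw = ('[{"id":"ytc_UgzTZo4yTPRCyk5SHf94AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1
```

Keeping validation separate from parsing makes it easy to log rejected rows (e.g. an unexpected label) for recoding rather than silently dropping them.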