Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I just got to that part of the video.... Good Lord. If one robot learns a proce…" (ytr_UgwW8rrdc…)
- "You must be stupid to fight against a robot ,robot is set up to know all moves a…" (ytc_Ugxhwk7Ke…)
- "You just give a weapon to a psychopathic AI and wonder what is gonna do with it?…" (ytc_UgzyvEjVP…)
- "80% accuracy isn't good enough. If the LLM SWE is set free on a large codebase, …" (ytc_UgxJ3ikTo…)
- "Just because the topic is AI doesn’t mean you have to write your video script wi…" (ytc_Ugw9C-dJB…)
- "idk if it's the pfp you're using, but frankly I like it. nothing needs to be per…" (ytr_UgwPxnvbR…)
- "I love how he's judging the future of AI by ignoring its potential and speed of …" (ytc_UgzIi6Vpj…)
- "How about just outlawing the "autonomous" and "self driving" features on new car…" (ytc_UgwOFrPXQ…)
Comment
AI was developed to make certain AI tech bros vast sums of money. It was not developed to make the world a better place.
Past social changes of similar significance occurred over decades or centuries; the time to adjust has the potential to be much more than just a "disruption".
The benefits of the industrial revolution came only after a century of widespread suffering. Australia as a penal colony was founded on decades of poverty and social dysfunction in England at this time. AI promises similar changes within a decade. It's a concern!
A lot of "if's", "shoulds" and assumptions in this presentation.
youtube · AI Jobs · 2025-12-28T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwUTu0nezoeDWunmj14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy650hVarRbsm6TvSV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxXLGAvXs4nckVB3Cd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzNh1Q37dLg26TAd0J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzS1yEmdVIUVw3xskh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzzAAF595-zNC76LPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzIrUaQyOSwBpHQ2up4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgynSkj_Jc8Syzcw2bt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzvB4DzJ-QdwRmHKQ54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgypGgI1RoCq7tVjdXp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
```
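The lookup-by-comment-ID step described above can be sketched as follows. This is an illustrative helper, not the tool's actual implementation; it assumes only the record shape visible in the raw response (a JSON array of objects keyed by `id`), and the response string here is truncated to two of the records shown for brevity.

```python
import json

# Raw batch response from the model, as shown above (truncated to two records).
raw_response = '''
[
  {"id": "ytc_UgzvB4DzJ-QdwRmHKQ54AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgypGgI1RoCq7tVjdXp4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]
'''

def index_by_comment_id(raw: str) -> dict:
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
row = codes["ytc_UgzvB4DzJ-QdwRmHKQ54AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# → developer virtue liability outrage
```

The printed row matches the Coding Result table above, which is simply this record rendered dimension by dimension.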