Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Two things: with the exception of medicine and other health related technologies, I would argue that technology, the mechanism of innovation is not inherently good. If you use our modern world as an example, technology and innovation has overall harmed the humans of the world, it’s automated us making us feel insignificant, which is objectively bad. So maybe we want to innovate but should we??
Second thing, can any of these software companies 100% guarantee self driving safety? Once regulations on self driving are imposed, safety will take precedent and we cannot 100% guarantee automated safety unless people are not present around the robot which is exactly the point of a motorway (to move people). If you can’t guarantee safety with a personless truck then you need a human in the truck to keep their hands over the wheels and if that is the end result, then what the fuck is the point???
Source: youtube · AI Jobs · 2025-05-28T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxCGpUOI8omfxkKgXJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw6LGGjIv5Mkq9-szN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwEqzgs_TLWb4mWuqR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxdr1MHOePjge0_oOl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxKfn_2VZJwxaqyFrl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwtkati3k_btN4yh3d4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2VjgQ0NV5bIlE8eR4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzuwiAHYTUPmbW8sAZ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz6XQLEpIRpUkf7uFh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdnUkL419kmuWTdwN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"resignation"}
]
```
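The raw response is a JSON array with one record per coded comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a batch, assuming the vocabularies inferred from the values visible above (the real codebook may define additional categories, and `validate_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the actual codebook may include categories not seen here.
VOCAB = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is valid when it has an "id" and every dimension value
    appears in the corresponding vocabulary.
    """
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in VOCAB.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record batch mirroring the coded values above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"mixed","policy":"regulate","emotion":"resignation"}]')
print(len(validate_batch(raw)))  # prints 1
```

Dropping malformed records instead of raising keeps a batch usable when the model mislabels one comment; a stricter pipeline might log the rejects for manual re-coding.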