Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Why does everything have to be so dramatic? AI isn’t some evil villain waiting t…" (ytc_Ugxemz0CT…)
- "AI as it is now is much more like the latter. It's much more prone to reproduce …" (ytr_Ugwh-zMuC…)
- "True...i only got criticism from my mum after my daughter died. No compassion..n…" (ytc_Ugwg2dTXz…)
- "@retnuhytnuob4068 I love using it for automating tasks that REALLY ARE BRUTE FORC…" (ytr_UgxCX80X9…)
- "AI is a tool. If not local only, it's rife for abuse, misuse, malice, poor moder…" (ytc_Ugw2x__-T…)
- "Is AI dangerous to humans? Depends on who is using it. Nuclear weapons are dan…" (ytc_UgxSwuFHV…)
- "but a lot of people can drive cars so why should they become self driving…" (ytc_Ugis3gL-v…)
- "The working class is almost all dead and dying and all thats left are people who…" (ytc_UgyqGqfQP…)
Comment
The word "could" is doing a LOT of heavy lifting here. We live in a world where about 8% of the world is still not electrified, yet we started electrifying about 150 years ago. He talks about all the drivers being replaced, yet we've had autonomous vehicles operating for over 9 years already and how many drivers have lost their jobs ? Technically we COULD have replaced every car and truck on earth with self-driving electric vehicles already.
This guy lives in a world at the very edge of technology where adoption rates are at their highest, but that's not how the world works. In 20 years there will still be farmers driving tractors in corn fields.
Source: youtube · Topic: AI Governance · Posted: 2025-09-29T11:2… · ♥ 24
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxq-ZzVIMokCTgAECt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyopJbrFg7WCOFFJtZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyTl3NV3ZZ6geBhteN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxRMuT82hQAR6oYfxN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwKrWoTeRMTSoBqB_54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzS0b54dzYUEEVdfXl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz3FWfj7AYCCjAjjJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx4qat2yp44XrfSbeh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyKZbTX6dF4mkNFDnB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwsVA-ajokJJC7P2-J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
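A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the sample output and the Coding Result table shown here, and the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# NOTE: this is an assumption; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage", "resignation"},
}

def validate_records(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Raises ValueError if any record carries a value outside the
    inferred codebook, so bad batches fail loudly instead of being stored.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example with a single (hypothetical) record:
raw = '[{"id":"ytc_X","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}]'
print(validate_records(raw)["ytc_X"]["emotion"])  # fear
```

Indexing by `id` is what makes the "look up by comment ID" view above cheap: each coded record is retrieved in constant time once the batch has been validated.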