Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- Its about time we've started taking this seriously. AI programmed by people with… (ytc_UgwCYZ26K…)
- So all this stuff about 'reducing our carbon footprints ', the COP meetings etc.… (ytc_UgzCbQ0WL…)
- Oh, we'll know - algorithms can only condition us but full blown ai will be pilo… (ytc_UgzWoKGgM…)
- its kind off useless like you spend on it at list couple hours to make 3 poisene… (ytc_UgwTnzNUQ…)
- I, Robot or the background story of The Matrix, especially as explained in The A… (ytc_Ugwu2R9aX…)
- If all art is just taken from other artists, and has been for over a century, wi… (ytc_Ugzcd0F1k…)
- So basically they’re like Let’s build a bomb and light the wick, we’ll be gone b… (ytc_UgwTi5wIX…)
- @Landgraf43, *"everything will be automated eventually"* I certainly hope not. P… (ytr_UgxbkuBhZ…)
Comment
Question. When has any tech company delivered flawless technology? I mean flawless, working as intended 100%, always, without fail. The answer is never. Less critical technology is dealt with by the average consumer when there is an issue. More critical technology has experts to monitor and maintain it. Unless Aurora has a “virtual driver” diligently watching and managing every 80k truck on the road as a backup to the onboard tech (or cloud based monitoring), there are going to be failures and deaths. If the autonomous driving companies gain enough power, they will lobby to encourage no regulation, and no ability to hold them accountable. It’s not just a matter of people losing jobs and companies realizing higher profits. It’s a matter of public safety.
youtube · AI Jobs · 2025-05-29T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugx4s9u7NIEQkHOn90l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyRhipDiO8LDTX82094AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwplnN4kRf8QrHGLfB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxIu6XdzC13XGLux9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzqsvqD1iVWFtPpGJR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz-JIpFrpyM-QA1CTh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwxt-MUovwb7_5EbHB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwIhPBFWhQBys4qKSp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw26Fd3ZKs8ZqiQxqV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw16Y2lQwHmBkLlzTR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
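The look-up-by-comment-ID view can be backed by parsing a raw batch response like the one above into a dictionary keyed on `id`. Below is a minimal sketch under that assumption; `index_by_id` and `EXPECTED_KEYS` are illustrative names, not part of the tool, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown above. The two rows in the sample are copied from the raw response.

```python
import json

# Two rows copied verbatim from the raw batch response shown above.
raw_response = """
[
  {"id": "ytc_Ugx4s9u7NIEQkHOn90l4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgzqsvqD1iVWFtPpGJR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# The coding dimensions every row is expected to carry (hypothetical check).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            # Flag malformed model output instead of silently dropping fields.
            raise ValueError(f"row {row.get('id')!r} is missing {missing}")
    return {row["id"]: row for row in rows}

coded = index_by_id(raw_response)
print(coded["ytc_UgzqsvqD1iVWFtPpGJR4AaABAg"]["emotion"])  # fear
```

Keying on the comment ID makes the lookup O(1) per query and lets a validation pass reject any model output that drops a coding dimension before it reaches the dashboard.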