Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "chatgpt tells you what you tell it to tell you. It is a great role player. It do…" (ytc_UgxNwl3XK…)
- "The airplane didn't replace the car. The battery didn't replace wired electricit…" (ytc_Ugx4yd68J…)
- "well I don't think these people have read literature because in brave New World …" (ytc_Ugx-jESTP…)
- "So mega corporations lay off thousands of workers because they overhired during …" (ytc_UgxaZ-N4F…)
- "1:09:15 Hank your sci-fi quote about the robot wars is remarkably similar to Dun…" (ytc_UgwVsKH3y…)
- "The connection between canopy coverage and cardiovascular health has been discov…" (rdc_m2dbbvj)
- "As an artist myself I don’t really care about people generating AI art for perso…" (ytc_UgyuZD2fU…)
- "I heard people say throughout various videos, that it is the same as when street…" (ytc_UgwRKG-Ve…)
Comment
This has been the case since the start of this madness. Silicon Valley doesn’t give a fuck about you, or anyone else. They care about two things:
1. Money
2. Creating a “worthy successor” for humanity.
Sam Altman, Elon Musk, Richard Sutton, Daniel Faggella, even Eliezer Yudkowsky has expressed the idea that if we make superintelligences that are good, it’s completely okay to flush humanity down the toilet and have us be replaced. These people would put a bullet through your head for another dollar. They’re rotten all the way down to the core.
I would highly recommend reading The Growing Specter of Silicon Valley Pro-Extinctionism and its subsequent essays to get a grasp of just how ingrained pro-extinction rhetoric is in Silicon Valley
youtube · AI Jobs · 2025-12-30T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy2C5k_qqBPrpO7Q-F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzNYDsYl0WR54nFv-N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxU5Z7kM7yAOUwZ_1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwgKIg_hIih-8xsoK94AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxMyjksdPBMLSYRvGJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy7yEMB21vbiRAzrtV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwKgaKN1bHvt6UHMoN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxeUJQQlbX0MolN5SN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyfgsCX9QBS-IxYUKh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxd6f031g-dihBWn2x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
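A raw batch response like the one above is only usable if every row parses and every coded value falls inside the codebook. The sketch below shows one way to validate such a batch. The allowed values are inferred from the samples visible on this page; the actual codebook may define additional categories, so treat the `ALLOWED` sets as an assumption.

```python
import json

# Category values inferred from the coding table and raw response shown
# above; the real codebook may include more options (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def validate_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and reject malformed or off-codebook rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for i, row in enumerate(rows):
        # Every row needs an id plus one value per coding dimension.
        missing = {"id", *ALLOWED} - row.keys()
        if missing:
            raise ValueError(f"row {i}: missing keys {sorted(missing)}")
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"row {i}: {dim}={row[dim]!r} not in codebook")
    return rows

# Minimal usage with a hypothetical comment id.
sample = ('[{"id":"ytc_x","responsibility":"developer",'
          '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
rows = validate_coded_batch(sample)
```

Rejecting the whole batch on any bad row keeps the coded table trustworthy; a softer variant could instead drop bad rows and log them for re-coding.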