Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_Ugx4KY6sb…` — "@rubefink In reality I find nothing with convince a dedicated skeptic. We put up…"
- `ytc_UgygZAHb0…` — "IT WAS NOT AT ALL THE PEDESTRIAN'S FAULT. THE CARS HEADLIGHTS WERE SET ABSURDL…"
- `ytc_UgzRKrn8b…` — "1: Why are we planning to make A.I.s emotional? 2: HAVE NONE OF THESE COMPANIES …"
- `rdc_ohprofz` — "Surely there's a very valid reason for that to get more popular. \"Does AI have …"
- `ytc_Ugx2iQmA4…` — "I don’t know which one I am! I just RP on some of the RP world stuff and flirt w…"
- `rdc_jslo95t` — "Yeah. And here's some of the example prompts for their \"sensitive questions\" che…"
- `ytc_Ugw5srQNe…` — "I don't care if they voted for trump in terms of this, if they don't like AI the…"
- `ytc_UgwbqvzHX…` — "What do you expect? AI has been trained with human data, it mirrors the human mi…"
Comment
> Here are my objections at the start of the video
>
> Most manufacturing companies don't actually know how they work. With piles of legacy code, processes and procedures stored in some old machinists head AI will not be able to fix it.
>
> Super intelligence won't make a lot of difference either because in today's world we do not listen to those with the most experience or who are smartest. Societally, we could be told exactly what we need to do and we still wouldn't do it.
>
> That being said, I do think an AI poses a drastic risk to humanity, the worst ones we haven't even thought of yet.

youtube · AI Governance · 2025-09-05T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
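A coded row like the one above is only useful downstream if its values come from the coding scheme. A minimal validation sketch, assuming the value vocabularies are exactly those visible on this page (the actual codebook may define additional categories):

```python
# Allowed values per dimension, inferred from the codes visible on this page;
# the real codebook may allow more categories (assumption).
SCHEME = {
    "responsibility": {"company", "developer", "user", "system", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"resignation", "outrage", "fear", "approval", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty means valid)."""
    problems = []
    for dim, allowed in SCHEME.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The row rendered in the table above.
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "none", "emotion": "resignation"}
print(validate(row))  # → []
```

Running the same check over every record in a raw response makes malformed model output fail loudly instead of silently polluting the coded dataset.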
Raw LLM Response
```json
[
  {"id":"ytc_UgyKAB6CCBqM4JzNh9t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzk6MSmk-L-IO3N_gN4AaABAg","responsibility":"system","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzTJtOjNAEjDebovk54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1kYOjUDojpnMgWZh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzGXaHKBhjXzNTYrad4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx-NBkKU4pPGXtnX-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxLa7fAQzDseJxwvWh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxETg0IbX9hEoeWnwF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzRihl4Y8guW8bDH2l4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-uCP7HMsYXfQDstF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
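The "look up by comment ID" feature can be sketched as parsing a raw response like the one above and indexing its records by `id`. The field names match the JSON shown; how responses are stored and served in the actual tool is an assumption.

```python
import json

# A fragment of the raw LLM response shown above (a valid JSON array).
raw = """
[
  {"id": "ytc_UgyKAB6CCBqM4JzNh9t4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx-uCP7HMsYXfQDstF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(raw_json: str) -> dict:
    """Parse a raw coding response and index each record by its comment id."""
    return {rec["id"]: rec for rec in json.loads(raw_json)}

codes = index_by_id(raw)
rec = codes["ytc_Ugx-uCP7HMsYXfQDstF4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # → company resignation
```

Because each record carries its own `id`, the same index can be built across many batched responses, which is what makes per-comment inspection cheap.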