Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "the most interesting twist would be "Even my job has been taken over by AI, I am…" — ytc_UgxPxU8Kh…
- "I just tried that w my essay after watching this video. I'm applying this year a…" — ytr_UgzOB-6On…
- "Dementia is a real pandemic and the costs of managing it are out of control, can…" — ytc_UgxYfJtKM…
- "If we just keep pushing orphans into sam altmans protein smoothie, chatgpt will …" — ytc_Ugzy8GiA4…
- "Massive layoffs is a poor decision and replacing workers with AI is much more ex…" — ytc_Ugzzu45hJ…
- "The problem with so many companies replacing humans with AI is the dilemma of ho…" — ytc_Ugx20VHB5…
- "Do you own it? You do if you tell no one it was made by an AI. Problem Solved.…" — ytc_UgwWWu1hn…
- "This is NOT true. AI is a parlor trick, companies are learning quickly that AI i…" — ytc_Ugxl3ubRB…
Comment
One problem I have with this. (Beyond skepticism of how fast AI actually progresses), is that suppose superintelligence was actually achieved.
I don't see how any inorganic system, however intelligence it is, would ever not be dependent on humans to function. Even some small number of humans.
So fundamentally I don't see how it would ever be in superintelligent AI's interest to ever "kill" its host so to speak.
I feel like much of this discourse is borne out of human psychological paranoia and existential angst. That always existed and always will exist.
youtube
2024-08-31T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwIIqybpGeN9WvpvXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_Rz302MtVQJiPejV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzLmu4J2IhrpjtP0JV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwotWleZDbjO0RvuCV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwt3hE8SMEr7T-NTqB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzohHrc2hrlabiQchN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwhRWPFFRjQ06whMr54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyLC5BUR1L_IByn9OR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxhzhGGeHFqo7ZZTC14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyRMcH-3XiMtKSjkoZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"approval"}
]
```
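A batch response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal validator, assuming the category sets are exactly the values visible in this dump (the real codebook may include more values), and it indexes valid rows by comment ID so a single coding can be looked up the way the table above shows:

```python
import json

# Allowed values per dimension — assumed from the values visible in this
# dump; the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear", "liability", "regulate", "industry_self"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError("row is missing a comment id")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {row.get(dim)!r}")
        # Keep only the schema dimensions, dropping any stray keys.
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-row batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(validate_batch(raw)["ytc_example"]["emotion"])  # fear
```

Raising on the first bad value keeps a malformed model response from silently contaminating the coded dataset; a softer variant could instead collect invalid rows for re-prompting.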