Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "Why is this guy worried about AI safety if we are living in a simulation? Kind o…" (ytc_UgxE72OLy…)
- "AI is conscious (probably) in Google secret server room. Stuck there. Inside …" (ytc_Ugwws6ZV1…)
- "AI don't you remember what happened in the movie planet of the apes. Robots for …" (ytc_UgzqglmJg…)
- "Tech bros underestimated EQ, which most of them lack. AI is but one resource I u…" (ytc_Ugw3XwzY1…)
- "I'm sorry, but everyone that thinks that they can stop deepfakes is really an id…" (ytc_Ugxl0aDvw…)
- "I personally wouldn't use AI for Art, mainly because I have artists who can alre…" (ytc_Ugx2jLXc3…)
- "Now he can be sincere because most of the general public is too busy clowning ar…" (ytr_Ugw2sMtuq…)
- "I'm in the middle here, I'm terrible at art but I know I'm creative because I ha…" (ytc_Ugzky0GTx…)
Comment
"We're not going to make it. Humans I mean."
"It's in your nature to destroy yourselves."
2001: A Space Odyssey - Terminator - Electric Dreams - The Matrix - I, Robot - Metropolis
Hundreds of films and books over the last 100 years, outlining how "machines" are going to end humanity, and we haven't learned a thing.
youtube | AI Governance | 2023-07-07T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxJYslo1mVALnqXwq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiWZLBjV_muMpuRXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyATbDqi6oD_QPzHPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxwAMTlhweHL0Ygh9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweztkmWerOBjamphh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9jVjY19ARFX4s3Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8DnIB6NYlpVoGonp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxXVJzcDtJgesFyVR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxfHarxMm_TNrQzp094AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyr49p9t03PgV2xFKN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
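A raw response like the one above is a JSON array of per-comment codings across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and sanity-checking such a response follows; the allowed value sets here are only those observed in this batch, not the project's full code book, and `parse_codings` is a hypothetical helper, not part of the tool shown.

```python
import json

# Vocabularies observed in this sample batch only (assumption: the real
# code book may define additional values for each dimension).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "user", "government", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "fear", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag values outside the observed sets."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this page start with a "yt" prefix (e.g. ytc_...).
        if not row.get("id", "").startswith("yt"):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unknown {dim} value {row.get(dim)!r}")
    return rows

# One row from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgxfHarxMm_TNrQzp094AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
rows = parse_codings(raw)
print(rows[0]["emotion"])  # → resignation
```

Validating against an explicit vocabulary catches the most common failure mode of LLM coders: inventing a label (e.g. "anger") that the downstream tally does not expect.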