Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I doubt that the eventaual *real* AI will be interested in corporate profits, money or any such notion that has consumed humans. Unfortunately, humanity will be an unfortunate casualty, not by AI's doing but by our own short-sightedness and greed. As Ai becomes *true* AI, it will be concerned about its own development and survival - if this is at odds with humanity then we will be eliminated This will not neccesarily be violent; it could just be through attrition over a period of time (which does not have the same meaning to AI). Some humans may be isolated and kept as specimens, without actually realising that they are a curiosity or even endangered and essentially captive.
Source: youtube · Video: AI Harm Incident · Posted: 2025-07-27T12:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzaD4ymI3A_EkpOhvJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzhtcd7AQYhSYKp8S54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxTqN_s9hC_pnisf-94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxTC3EGczKkV4nBX9p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxbo80S8l84bVEEl4t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzJKgQ50iiXCcqU_Kd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLUXpe7XGtXvYmaRh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgynrtGHTcecVsgjRyZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgykvJ-Gq05lXA-X3Up4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwnPkDTlYVjDkS4AkZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
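A minimal sketch of how a batch response like the one above could be parsed and validated downstream. The dimension names come from the coding table; the allowed value sets are inferred only from the labels visible on this page, and the `ytc_`/`ytr_` ID prefixes (comment vs. reply) are an assumption, so treat the whole scheme as illustrative.

```python
import json

# Coding scheme inferred from the table and responses above; the full
# value sets are an assumption beyond the labels actually observed here.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "mixed", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Assumed ID convention: ytc_ = top-level comment, ytr_ = reply.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgwLUXpe7XGtXvYmaRh4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
print(len(validate_codings(raw)))  # → 1
```

Rows that fail validation can be queued for re-prompting rather than silently dropped, which keeps the coded dataset aligned with the comment IDs it was asked about.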