Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We are humans , humans will always win. People will not give a FK about your ia …" (ytc_Ugzlg1-0Q…)
- "Troy is perfect for him. He gets hourly rate for the training data (all his voi…" (rdc_lgtednd)
- "i personally think ai art is nothing to fear because ai art might try their best…" (ytc_UgzihJfsK…)
- "i following this tech since 2022 when midjourney still infant. Backthen you will…" (ytc_UgyIVKqnd…)
- "For every conscious moment of human awareness there is a feeling. There is a con…" (ytc_Ugz1JxtoK…)
- "Smart people code it. So ai will be leftist. Problem is that it learns from eve…" (ytr_Ugx2Uh03q…)
- "while i dont like AI as a software engineer, i think neil degrassy ass tyson als…" (ytc_Ugx5ZD37w…)
- "AI gave you your own reflection of your intention. It's like a mirror. Your inte…" (ytc_UgzBhjh1W…)
Comment
Is there a single technological or scientific development you want? A cure for cancer maybe, or biological immortality? How about energy production that completely solves climate change?
I think you're thinking of current LLM's (which are shitty), whereas people in the debate are talking about real AGI, and pretty much everyone alive wants something that AGI would lead to.
youtube · AI Governance · 2025-10-24T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytr_UgxgtOpmkin3sT4E5bt4AaABAg.AORFQtDE2CPAORPxUWQ_Cr","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxgtOpmkin3sT4E5bt4AaABAg.AORFQtDE2CPAORWdFYAaGl","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxgtOpmkin3sT4E5bt4AaABAg.AORFQtDE2CPAOW25k5XsRw","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugwf6yGy9XbDbxjJEUZ4AaABAg.AOQr-jKdhWtAO_0QaNETGq","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwf6yGy9XbDbxjJEUZ4AaABAg.AOQr-jKdhWtAOe021YsQrl","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwBuWyXbWDR7pMniZt4AaABAg.AOQanGT65sRAORR9ssNHdb","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgziYw-IS1-k18tCdB54AaABAg.AOOeCnzlzUcAORS35gZpCa","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgziYw-IS1-k18tCdB54AaABAg.AOOeCnzlzUcAORdQSlGsAl","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgziYw-IS1-k18tCdB54AaABAg.AOOeCnzlzUcAOThyeGpUvh","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgziYw-IS1-k18tCdB54AaABAg.AOOeCnzlzUcAOVhvaLt-zg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
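A raw batch response like the one above is only usable once each record is parsed and checked against the codebook. The sketch below shows one way to validate and index such a response in Python; the allowed label sets are inferred from the values visible on this page and may be incomplete, and `ytr_example` is a hypothetical ID, so treat this as an assumption-laden illustration rather than the tool's actual pipeline.

```python
import json

# Label sets inferred from the values shown above; the real
# codebook may permit additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"approval", "outrage", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of records) and
    index the coded dimensions by comment ID, rejecting any record
    with a missing ID or an unknown label."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a single hypothetical record:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
coded = validate_batch(raw)
```

Indexing by ID makes the "look up by comment ID" view above a single dictionary access, and failing loudly on unknown labels catches coder drift before it reaches the results table.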