Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "LAW = Light Anti-Tank Weapon / LAWS = Lethal Autonomous Weapon System / No chance o…" — ytc_UgwfXsu0h…
- "Thank you, makes sense. I had no idea that a carbon credit market existed until …" — rdc_ckqc6xf
- "AI can just serves us the content that enrage and isolate our mind to the point …" — ytc_UgwWH9S7O…
- "The image of the Beast is the robot or Artificial Intelligence clone, cyborg, a…" — ytc_UgwowZh1P…
- "Horrible analogy, it makes no sense. The main issue is that it’s illegal to stea…" — ytr_UgzAg4D9r…
- "Making up things if it cannot answer outright is just the thing ChatGPT does. An…" — ytc_Ugzzdo1-v…
- "is it just me who feels Neil could have said some words about Asimov's view on A…" — ytc_UgytSlHac…
- "Glad to see your reaction! Sophia's perspective on wisdom and learning really op…" — ytr_UgyAZbctW…
Comment
(1) Human thinking is not "generalist" in any shape or form. This is a misconception that western thinking just cannot let go of since Locke. It is an amalgamation of deeply specialized systems. (2) Hallucinations are a symptom of a deeper problem: AI systems have no concept of meaning, therefore, they have no concept of truth. Hallucination can only be solved by providing AI with a world model that they can use to make predictions, and an ability to check those predictions against reality. There exists neither meaning nor truth without this. (3) The prompt injection problem is also related to (2).
Source: youtube
Posted: 2025-12-01T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugznn6OMicj8_M-As4l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwo7hVV9tf5KGd_ysJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxu59BPfFkS6ipYBcp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgMENivAwCcwI8G5V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugw916lDKYtLxDOUCGF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVDAPyIuCue_W5gXx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyXLIZbGnyZ9gms7354AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzluJW_tyVlbvWbyVR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwolUxnC56PdaXtLoN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyVVNi6UloVqEGBm4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
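The "look up by comment ID" step above amounts to parsing the raw batch response and indexing it by `id`. A minimal sketch in Python, assuming the response is a JSON array of coding objects with the field names shown above (the `raw` string here is a shortened, illustrative excerpt, and `index_by_id` is a hypothetical helper, not part of any tool shown here):

```python
import json

# Illustrative excerpt of a raw batch response: a JSON array of
# coding objects, one per comment, each keyed by "id".
raw = """[
  {"id": "ytc_Ugznn6OMicj8_M-As4l4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzluJW_tyVlbvWbyVR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "unclear"}
]"""

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    codings = json.loads(raw_response)
    return {row["id"]: row for row in codings}

lookup = index_by_id(raw)
coding = lookup["ytc_UgzluJW_tyVlbvWbyVR4AaABAg"]
print(coding["responsibility"])  # -> developer
```

Indexing once up front makes each subsequent lookup O(1), which matters when inspecting many comments against a large batch response.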