Raw LLM Responses
Inspect the exact model output for any coded comment. Look a comment up by its ID, or click any of the random samples below.

Random samples:
- We should have international law against allowing AI to make a decision whether … (`ytc_UgwYCc-uD…`)
- AI needs to train off of something. The issue isn't really RDJ's likeness-- if … (`rdc_lubezjp`)
- Amazon losing money and suing when they did the exact same thing with their "aut… (`ytc_UgyL3X1UN…`)
- Funnily enough, we also technically steal art, our brain takes a load of informa… (`ytr_UgwUv13Xh…`)
- Robot : Hope i made you proud. / Human : Hell yes! / Robot : Now you make me proud. / … (`ytc_Ugwccth72…`)
- She can't compete with a real woman, she hasn't got a leg to stand on… (`ytc_Ugxkm3CWM…`)
- *AGI may simulate personhood. That doesn't mean it possesses the grounding that … (`ytc_UgxcLw-k4…`)
- It might take 50 years or 1000 years for Ai to become self aware but it could ha… (`ytc_Ugxsb7sLH…`)
Selected Comment

> Hallucination is an analogy, not a literal term. But there's no evidence that consciousness is needed for intelligence.
>
> And humans are constantly and confidently wrong because human memory isn't that different to an LLM's. People think the mind is a database, but human memory is reconstructed every time based on neuronal "weights" and is very sensitive to what "prompt" you receive. It's why eyewitness testimony is so unreliable even when witnesses are convinced of what they experienced.

Source: youtube · 2026-01-25T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
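
The codebook itself is not reproduced on this page, but the table above and the raw response below imply its shape: each comment gets one value per dimension from a closed vocabulary. A minimal sketch in Python, assuming the value sets are exactly those visible in this sample (the real codebook may define more categories), with `CodingResult` and the set names being hypothetical:

```python
from dataclasses import dataclass

# Value sets inferred only from the codes visible on this page;
# the actual codebook may include additional categories.
RESPONSIBILITY = {"none", "company", "ai_itself"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"unclear", "industry_self", "none", "liability"}
EMOTION = {"indifference", "approval", "outrage"}


@dataclass
class CodingResult:
    """One coded comment, mirroring the Coding Result table above."""

    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Reject any value outside the inferred vocabulary.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unexpected code {value!r} for {self.id}")
```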
Raw LLM Response
```json
[
  {"id":"ytr_UgzaFyUx__pwPmBai0x4AaABAg.ASKKm78Jm6NASR8cGQ_8Av","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugw3akF6Z0hYhNST1mh4AaABAg.ASJn2yJApqiASQYy3JSIot","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgzmR1fXcItpfixKs0l4AaABAg.ASJYHOX7H_WASQGA0xZczY","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyYSu47-_ZvBUHwOJp4AaABAg.ASJ1vgVG1pbASQGHkgQRi1","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyYSu47-_ZvBUHwOJp4AaABAg.ASJ1vgVG1pbASQI0Ia7MfW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyTtnbsDrCRj52umzZ4AaABAg.ARlJ-u9p2gKASQIGDdm8OB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxKjiXlPpsmKLuOdGp4AaABAg.ARl5nqxmxd4ASQINw9pt4K","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS7pdydpqJR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS8wjyUw2TG","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgyECy1JrJdYgzogrut4AaABAg.ARFbR5wAQsGAS7pnrRsvAy","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
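
The ID lookup shown at the top of this page can be reproduced offline from a response like the one above. A minimal sketch, assuming the raw response is saved as a JSON file (`raw_response.json` is a hypothetical path) and that every element carries the five fields shown:

```python
import json


def load_codes(path: str) -> dict[str, dict]:
    """Index a raw LLM response (a JSON array of coded comments) by comment ID."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return {record["id"]: record for record in records}


# Hypothetical usage: fetch one coded comment by its ID.
codes = load_codes("raw_response.json")
record = codes["ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS8wjyUw2TG"]
print(record["responsibility"], record["emotion"])  # -> ai_itself outrage
```

Note that duplicate IDs would silently overwrite earlier records in this sketch; a real pipeline would probably want to detect and report them.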