Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyR-orhB… : "Obviously AI. Pay attention to the details folks, the edges are pixelated. Focus…"
- ytr_Ugz8_P9wG… : "Also, I think the relationship between ai art defenders and slugs is that they s…"
- ytc_Ugx40aSB2… : "Let the Robot destroy one person/student first and see what happens to the Robot…"
- ytc_UgyMujM-j… : "7 years after the video [.....] Wait, we're actually grappling with this issue a…"
- ytc_UgyhVNqyR… : "I hope its not too late to answer the 2019 question 🤣😂, the answer is B, robot w…"
- ytc_Ugy77fMJ2… : "A number of foolish arguments made in here. Firstly, it is obvious what AI woul…"
- ytr_UgweJwn6P… : "The last hospital I worked at had AI and its function was to review patient’s ch…"
- ytc_UgwQS_XsG… : "I just trained an AI on your content! Ill be sure to give you credit.…"
Comment
> This is flat out wrong. I mean, a hallucination IS coming up with something new. It's ability to create novel ideas is precisely why it's capable of hallucinating.
> And yes, it can determine if something is true or false. LLMs are effectively world models, which allows it to judge between conflicting information.
Source: youtube, 2026-01-25T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgzaFyUx__pwPmBai0x4AaABAg.ASKKm78Jm6NASR8cGQ_8Av","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw3akF6Z0hYhNST1mh4AaABAg.ASJn2yJApqiASQYy3JSIot","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzmR1fXcItpfixKs0l4AaABAg.ASJYHOX7H_WASQGA0xZczY","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyYSu47-_ZvBUHwOJp4AaABAg.ASJ1vgVG1pbASQGHkgQRi1","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyYSu47-_ZvBUHwOJp4AaABAg.ASJ1vgVG1pbASQI0Ia7MfW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyTtnbsDrCRj52umzZ4AaABAg.ARlJ-u9p2gKASQIGDdm8OB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgxKjiXlPpsmKLuOdGp4AaABAg.ARl5nqxmxd4ASQINw9pt4K","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS7pdydpqJR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS8wjyUw2TG","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyECy1JrJdYgzogrut4AaABAg.ARFbR5wAQsGAS7pnrRsvAy","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
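A batch response like the one above is only usable if every record parses and every dimension holds an expected category. Below is a minimal validation sketch; the allowed value sets are an assumption inferred from the values visible on this page (the real schema may define more categories), and the function name is hypothetical, not part of any existing pipeline.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the values
# observed in the responses above; the actual schema may include more.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "industry_self", "liability"},
    "emotion": {"indifference", "approval", "outrage"},
}

def validate_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"approval"}]')
records = validate_raw_response(raw)
print(len(records))  # 1
```

Rejecting the whole batch on the first bad record keeps the coded table trustworthy; a gentler variant could instead collect failures and re-queue only those comment IDs.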