Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "bro fuck ai what are we talking about it's not even real ai it's machine learnin…" (ytc_Ugww01We4…)
- "But we humans don't create unique things out of thin air. Can someone who never …" (ytc_UgxXxVe2N…)
- "The Animation Guild: You’re not listening! This isn’t about me. This is about th…" (ytc_Ugz5Gmdpi…)
- "@mxntalduckthe only reason the a.i lied is because it can't perform it's functi…" (ytr_UgxXrr3yj…)
- "The problem here is, AI is only going to get better, not worse, and woth certain…" (ytc_UgztAEH7u…)
- "deepfakes aren't even the only part of this situation, real women and children a…" (ytr_Ugz5Kvacb…)
- "I dont want AI to think like humans I want them to think like Al cuz they are AI…" (ytc_UgxnDFwfx…)
- "Oh fuck, who taught that AI emotions, if anyone here is responsible for that shi…" (ytc_UgzOK1bbn…)
Comment
You mean like ChatGPT? AI is only useful if it is trained on factual information and your prompting is complete. If your prompt is incomplete, AI simulates guessing at what you are asking and hallucinates nonsensical answers. If trained on incorrect data, it always gives the wrong answer regardless of the accuracy and completeness of your prompt. AI cannot save your life by providing a valid solution to a life threatening situation that it was never trained on, and if it was trained, the training must be fact based. AI does not have human intuition to solve real problems that it has never seen in training. Without proper training, AI cannot even save itself, regardless of what the techsperts are telling you today. AI can only give Valid solutions if it follows Bullwinkle's "Law of Theory" AI is useful, but cannot think, it can only simulate thinking
youtube · AI Governance · 2025-11-30T22:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgxA0olc557gIJ9sgox4AaABAg.AQ9_NgWqi3-AQC0h0s9E-y","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy1ygeWn85j1hFbNTJ4AaABAg.AQ6R0YBazMKAQ9Kd-SzA_G","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugy1ygeWn85j1hFbNTJ4AaABAg.AQ6R0YBazMKAQAAlP6-Bje","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxIn9Cp7TmjijDvSFV4AaABAg.AQ60xbRHUIyAQ9a8elbJZ7","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgxIn9Cp7TmjijDvSFV4AaABAg.AQ60xbRHUIyAQAYsnzC5tn","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyZL47D6LcwPiP5e6B4AaABAg.AQ58l9Fle5QAQCNOstXO2F","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytr_UgzlqM6zrdOYJFzO0-J4AaABAg.AQ4DMqOGTw1AQ4MSkbgOZp","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgzlqM6zrdOYJFzO0-J4AaABAg.AQ4DMqOGTw1AQ4PJXGoEgu","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzlqM6zrdOYJFzO0-J4AaABAg.AQ4DMqOGTw1AQ4Q0fRV9DW","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgznDRxrC8NYR8nvACh4AaABAg.AQ29oOETDG9AQ2A3HfiaBM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
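The raw response above is a JSON array of per-comment codings, one object per comment ID with one label per dimension. Before such codings are stored, they can be parsed and checked against the codebook. Below is a minimal sketch of that step, assuming the label sets visible in the responses on this page are the full codebook (the real codebook may include more labels); `validate_codings` and `SCHEMA` are illustrative names, not part of the actual pipeline.

```python
import json

# Allowed labels per dimension, inferred from the responses shown above.
# Assumption: the real codebook may define additional labels.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself", "distributed", "government", "user"},
    "reasoning": {"mixed", "deontological", "consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "approval", "resignation"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index schema-valid codings by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if cid is None:
            continue
        # Keep the record only if every dimension carries a known label.
        if all(rec.get(dim) in labels for dim, labels in SCHEMA.items()):
            coded[cid] = rec
    return coded

raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(validate_codings(raw))
```

Indexing by comment ID also supports the "Look up by comment ID" view above: a stored mapping from ID to coding makes retrieval of any single comment's labels a dictionary lookup.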