Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I get the argument that society panicked when the PC and initial robots came out…" (ytc_UgxXypqrY…)
- "At the minute, Tesla is running FSD v12. They've virtually eliminated all the ol…" (ytr_UgwTRzLwo…)
- "@GhastlyCretin I don’t know if they always do their job right. AI is very new …" (ytr_Ugy5PX2wW…)
- "I can't understand why people are so stupid and think this AI would help. What …" (ytc_UgzJS_QOD…)
- "AI psychosis is a blanket term for a host of mental issues that can be caused by…" (ytc_UgxkTxXGl…)
- "😂😂😂😂😂😂😂 AI demonized to serve as a smokescreen for the danger and drift…" (ytc_UgwrksZBI…)
- "AI deployment is not just about the technology or electronics itself, but about …" (ytc_Ugy_WsPbz…)
- "I keep trying to tell dummies not to support AI or use it for obvious reasons. …" (ytc_UgzASRHk6…)
Comment
wouldn't the easiest question to get a direct answer just to ask something irrational to a robot to rational to a human? like the old joke "ghost, host, most, roast what goes into a toaster?" most humans will say toast, because they have been primed to answer that way, but a robot will always say bread.
Platform: youtube
Posted: 2016-08-09T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgjlGx8FR-EgZHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugi-CMHZ6z1IiHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugi7CjBupUbtHngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggnfU6yPgq2B3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggU9g4favmQ-3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgihvWXlqNA6T3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjV46XtY-kr1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggFxLep9Z31AXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggloAGB5WNOMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghR_DYsydJIdHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
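The batch response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID; the `index_codes` helper is illustrative, not part of the pipeline, and records missing a dimension are simply dropped here as an assumed policy.

```python
import json

# Two records copied from the raw response above, as a small sample batch.
raw_response = """
[
 {"id":"ytc_UgjlGx8FR-EgZHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgihvWXlqNA6T3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a batch response and index the codes by comment ID,
    skipping any record that is missing one of the dimensions."""
    codes = {}
    for record in json.loads(raw):
        if all(dim in record for dim in DIMENSIONS):
            codes[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return codes

codes = index_codes(raw_response)
print(codes["ytc_UgihvWXlqNA6T3gCoAEC"]["responsibility"])  # ai_itself
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup above cheap: one parse of the batch, then constant-time access per comment.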