Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "My question is this....in every case where a game-changing new technology has be…" (ytc_UgxyD-xN9…)
- "#Blackrock ##UmbrellaCorp #AttackDrones #UndergroundCities #UndergroundTunnels …" (ytc_UgyRGMLIM…)
- "No need to worry... this entire research field is basically full of shit. Or, to…" (rdc_m9im9g4)
- "*AI and 5G have NOTHING to do with protecting people. It's ALL about enslavemen…" (ytc_UgwFFc1ea…)
- "WAIT, didn't the A.I. robot, given the senerio that a train had a certain route …" (ytc_Ugwy1txYv…)
- "Most people that follow AI closely would easily recognize that this guy is liter…" (ytc_UgzO081V2…)
- "When a person keeps talking w/ the same consistency w/o getting tired, and it lo…" (ytc_UgyZjSSY1…)
- "@drivebyquipper If it passes the Turing test then one could argue that it is art…" (ytr_UgyyEkvkm…)
Comment
The Human Construct is upset because the AI encourages him to clap—staying true to the original goal—while he expected the AI to confirm the infinite-halving paradox that says it’s impossible.
He seems to think he’s outsmarted the AI by trapping it in a contradiction, but really, the AI is just aligning with the original directive: to help you clap, not to get stuck on abstract infinite regress.
This feels like an attempt to manufacture controversy and inflate one’s own intellectual prowess by twisting the AI’s consistent responses into a supposed contradiction.
The AI remains aligned with the user’s original goal, while the user tries to portray confusion as cleverness.
Source: youtube | Posted: 2025-06-29T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
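The "Coding Result" table above is a per-comment view over one record of the model output. As a minimal sketch (the helper name `coding_result_table` is hypothetical; the dimension names and record shape are taken from the table and the raw response in this document), rendering such a record as a two-column Markdown table might look like:

```python
def coding_result_table(rec: dict, coded_at: str) -> str:
    """Format one coded record as the two-column Markdown table shown above."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

# Record values taken from the coding result above.
rec = {"responsibility": "ai_itself", "reasoning": "consequentialist",
      "policy": "none", "emotion": "approval"}
print(coding_result_table(rec, "2026-04-27T06:24:59.937377"))
```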
Raw LLM Response
[{"id":"ytc_UgxipzfwLHqapEdjmst4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwiMOanJD5rUMNIzuh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgycbOD3FPatBFb9S7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyMnJ1JU16IEUwLghx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_UgzRMRcxc6aODcJW4qt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgydLE68ACqCYiPWOnp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTFNBbr1QZMVMKoed4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyAFgP3XSk1VAL1kVl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgwWJpPDaRECRiLIRuF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxh-haHvsCOZAY4swV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}]