Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Dude, my dad was willing to pay $150 for an AI Santa picture from hobby lobby, a…" (ytc_Ugwz49kkG…)
- "The ethical dilemma of human-driven cars, i rather be run over by a robot than a…" (ytc_UggjXP4s7…)
- "Sorry but really the ai going on about come home to me reminds of love bombing a…" (ytc_UgxOwdc45…)
- "Well if you need help searching for non ai stuff, just type it “-ai” when search…" (ytr_UgyzYy3eK…)
- "Can anyone or anything control the future in the far distance? I doubt it. (By…" (ytc_UgxuZKNEV…)
- "Exactly, i dont think i could gamble with my kids future by doing this without s…" (ytc_Ugx_YT6Jo…)
- "For now they are content with CallNet, a revolution in call and scam centers all…" (rdc_jfccysb)
- "Future prediction. They will find out agents will either become rouge or untrus…" (ytc_UgzMjdZn-…)
Comment
> Or anytime someone insists a next-token predictor can never be intelligent, ignoring all the other training and reasoning mechanisms being used. Or the fact that an organic brain's plasticity is just as mechanistic as the transformer algorithm. (LLMs aren't truly intelligent yet, but most arguments here against the possibility are just trash.)

youtube · AI Moral Status · 2025-10-31T01:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxPAlFOZf5P9-yFXq14AaABAg.AOvb0EjLhEBAOvrd0LxVqS","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzvjGjIcomV9nHpuVp4AaABAg.AOvahie80oqAOvhTcD3LNG","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzGbjN8CVd00WSMReB4AaABAg.AOv_zD-WUInAOva25XYIQ0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy2qgHGv3OEnQWYM6x4AaABAg.AOv_qIdZzySAOwP-ztNrwP","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwKdcT5U_wNFMATTTR4AaABAg.AOv_9RxO8JkAOvf9CzDXi9","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytr_UgxxdDNt4r7N-76IYD94AaABAg.AOvZhyouJMzAOxdODzVrzJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxyifxnwY34q0k-lVh4AaABAg.AOvYrL2dXhZAOvZNoR1n4X","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxd4oIylRKGfL8P14N4AaABAg.AOvXWYOgdu0AOvb40fhEtB","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx16JT_uPYdtqD6HCV4AaABAg.AOvXDY1Mjm0AOvXMORTirV","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugyu6z4Pp0svDkQdioV4AaABAg.AOvWlkghdIeAOwCOUTiPnj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
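The raw response is a JSON array with one object per comment, keyed by `id` and the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response might be parsed and validated before populating the coding table; the allowed-value sets below are assumptions extrapolated only from the values visible in this response, not a full codebook:

```python
import json

# Allowed values per dimension. These sets are an assumption based on the
# values visible in the raw response above, not an exhaustive codebook.
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, rejecting rows with unknown values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row response in the same shape as the dump above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # approval
```

Validating against a closed value set catches the common failure mode where the model improvises a label outside the coding scheme, so bad rows fail loudly instead of silently entering the dataset.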