Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Hey guys I just had an idea for this new automated weapons system. What if we ca…
rdc_kasd57g
@gamingphilosopher153 I agree with you on the character to person inference. I t…
ytr_UgwHiTvBD…
We're barely in the "first inning" of AI—what we have today would be considered …
ytc_UgxGcI8Cw…
No one will have jobs in the future. Inhate this 💩. Unless you have a trade. Hv…
ytc_UgzVjM9Hb…
Thank you very much. It would have e even more informative if there was some di…
ytc_UgxnukOLp…
The structure of the economy must change or AI will make most humans unnecessary…
ytc_Ugx7jKD1t…
This video unintentionally shows the connection between people who use AI and pe…
ytc_UgxwPy12C…
Every society that creates AI has to inevitably face off with the Department/min…
ytc_Ugiebb8m3…
Comment
According to Yan LeCun LLMs and symbolic understanding aren’t sufficient to knowing “reality” and LLMs fall behind a child of 4 in that regard. He is not a naysayer but has some interesting shit to say: https://youtu.be/RUnFgu8kH-4?si=gi--ZSMAATKZ0MXr
youtube
AI Moral Status
2025-03-17T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxzVsyMx2ubmfwRmPZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyOwIDeJ5o7etc5Czh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
{"id":"ytc_Ugzap8Gn8h24MuCDHRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw9hZ11V7i1pSRz8ex4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzUqbBJACD7902_hi14AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzGaRixydj-yPm3W2t4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwLZpRcgfvKUtEaEp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxczyGIN8rOQsIepaB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtIqZSDrF-TlrCbK94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
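
The lookup-by-ID workflow above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codings like the one shown) and index it by the `id` field. This is a minimal sketch, not the dashboard's actual implementation; `index_codings` and the abbreviated two-entry sample are illustrative, with IDs taken from the array above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per
# comment ID (abbreviated here to two entries from the array above).
raw_response = '''
[
  {"id": "ytc_UgyOwIDeJ5o7etc5Czh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "sadness"},
  {"id": "ytc_Ugzap8Gn8h24MuCDHRh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
'''

def index_codings(response_text: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID."""
    codings = json.loads(response_text)
    return {entry["id"]: entry for entry in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_UgyOwIDeJ5o7etc5Czh4AaABAg"]
print(coding["policy"])   # liability
print(coding["emotion"])  # sadness
```

With such an index, rendering the per-comment "Coding Result" table is a dictionary lookup on the comment's ID rather than a scan of the whole response.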