Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_Ugx4KY6sb… — "There's a lot of ways to confirm it. One way we can confirm it is that we have m…"
- ytc_UgzhFYehr… — "Don't forget that private corporations are doing this for LEOs now. Oracle has a…"
- ytc_UgyCy34fm… — "This is scary. There should be laws against using facial recognition as evidence…"
- ytc_UgxQSl-Kk… — "Most respectfully and with good wishes, I think you make several good points, bu…"
- ytr_UgxlfsU6h… — "Everyone is busy finding an angle so as not to lose their place m…" (translated from French)
- rdc_jck3v8d — "If anyone has seen the movie 'Her' someone in the world is beginning their relat…"
- ytr_UgyN5-Wyq… — "@Pfromm007 IF AI art is theft because AI learns from other's art, then human art…"
- ytc_UgzelWm4E… — "Reasoning models do NOT make their inner workings external to us, rather they cr…"
Comment
I don't think enough emphasis is placed on the "if" here. LLMs and any "reasoning" models built downstream of LLMs are still just probabilistic text generation at the end of the day. If superintelligence is at all possible, it would require a complete paradigm shift. The randomness of output ("hallucinations") compounds the more rounds of "reasoning" you perform, due to the probabilistic nature.
The real harms of LLMs are in the offloading of learning in school, the theft of writing and art, the loss of jobs, the impact on environment, etc., and that's where people should put the focus, not far-flung doomsday fantasies. Just my opinion.
Source: youtube
Video: AI Moral Status
Posted: 2025-10-31T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx8yxJZKpH46Z1IODN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwiC9VNOssQwvCkF6F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxP8G7IjCGRxAKd5hV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgxUfPUGkzdXuMeGxaR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXpIoRMRZmS0XcixF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxafC012V1408U-Sbl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyNhdlZ1q5UsZ-_csF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz9W74a5AEdYMXNkip4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugwc601WIY7Vd8WGbyJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwdl_LvYMaq-u6O-1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
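A raw response like the one above has to be parsed and validated before the codes are stored. A minimal sketch of that step is shown below; the dimension names come from the coding table, while the sets of allowed values are inferred only from the responses visible here (the full codebook may define more categories), and `validate_batch` is a hypothetical helper, not part of any particular library.

```python
import json

# Allowed values per dimension, inferred from the coding table and the raw
# responses above; the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"none", "fear", "outrage", "indifference", "resignation",
                "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed coded rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # every coded row must reference a comment ID
        # Reject the row if any dimension is missing or has an unknown code.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Validating against a closed value set catches the most common failure mode of LLM coders, namely inventing a plausible-sounding category that the codebook does not contain, so that only rows matching the schema ever reach the results table.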