Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- rdc_jcj816t: "I find this funny because earlier today I asked ChatGPT to give itself a name an…"
- ytc_UgzMKE9zI…: "Heck no specially chatgpt that thing is a life and time saver it dead seriously …"
- ytc_Ugw-gEflU…: "Never. We will never know. Once an AI becomes conscious it will know humans fear…"
- ytr_UgxO1V28R…: "@rustycherkas8229 Ai cannot discover any new scientific or mathematical theory …"
- ytr_Ugy1fC9fv…: "Ikr ai is good. Not everyone can make art let alone hire someone to do for them…"
- ytc_UgyQ7-KN3…: "That's dumb to fight and Robot there mind out of Meadow if you want to win Again…"
- ytr_Ugx07iij1…: "I had a Facebook notification yesterday, talking about the new meta AI that doe…"
- ytc_UgyEsVyKb…: "when you run the AI on ICP the internet computer protocol there are no hacks.…"
Comment
It’s wild how this whole discussion nails the deeper problem: AI isn’t replacing people, it’s amplifying whoever already knows what they’re doing. That’s what makes it feel “weird” instead of smart. The scary part isn’t the tech itself, it’s how fast it widens the gap between people who adapt and those who freeze in place. What helped me wrap my head around that shift was AI Shortcuts for the Lazy Mind by Trent Calloway and it’s all about building mental and technical shortcuts so you stay on the using side of AI, not the used by it side. This video captures that unease perfectly.
Source: youtube · Video: AI Moral Status · Posted: 2025-11-11T16:3… · ♥ 2021
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxdHrYPTPNOhO891Tt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAXf3hrblCWY6Ugph4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9CnVlptMpS3H921N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzr8bee1-IJvveIiIV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxBrSxO5uV-z_MhcFJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0Tt4XKK1CETZR8ER4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyn8OoCn_Tv2tkN4wp4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy9h-a6IYSf_NwImCF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxD1JRMbp4vkOFvQFd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugze6jqOoR7gGknhckp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
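The "look up by comment ID" view above can be reproduced offline: the raw response is a JSON array of per-comment codes, so indexing it by `id` gives direct access to any comment's coded dimensions. A minimal Python sketch follows; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response shown above, but `index_by_id` and the two-row `RAW_RESPONSE` sample are illustrative assumptions, not part of the tool.

```python
import json

# Two entries copied from the raw LLM response above; a real run would
# use the full array returned by the model.
RAW_RESPONSE = """[
  {"id":"ytc_UgxdHrYPTPNOhO891Tt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9h-a6IYSf_NwImCF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and map comment ID -> coded dimensions."""
    rows = json.loads(raw)
    # Drop the "id" key from each row so the value holds only the four dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_Ugy9h-a6IYSf_NwImCF4AaABAg"]["emotion"])  # outrage
```

Keeping the lookup keyed on the model-emitted `id` also makes it easy to spot responses where the model dropped or mangled a comment ID: any requested ID missing from the index signals an incomplete coding batch.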