Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Although I believe everyone should be able to have their ideas expressed in art,…" (ytc_UgwMM-YEq…)
- "If women start doing deepfake gay videos of them, will it help them understand t…" (ytc_UgzJ0nYq_…)
- "It's the year 2025, and I'm still waiting for A.I. to make a hot girlfriend that…" (ytc_UgwcK-fNa…)
- "So how many employees does Artisan have? 50-200. Maybe they should get some AI …" (ytc_UgwESoVi0…)
- "The irony is that, None of the following artwork looks as elegant and clean as t…" (ytc_Ugxpn5B8D…)
- "my career as an escalation specialist for a major canadian telecom of a decade: …" (ytc_UgxPQ3rvT…)
- ""if u hate AI u hate technology!!" and yes as I draw here on my apple iPad while…" (ytc_UgzwEBzcu…)
- "Most of software engineer is repeating the same patterns of development, which A…" (ytr_UgzQ1WE9P…)
Comment
What if Consciousness is a paradox and we'll never be able to replicate it? What if it's just a thing that can appear in the process of evolution? What if...and hear me out on this one...What if consciousness manifests through the million year long process of development? What if consciousness is more a spectrum then an actual yes and no? Maybe it is a blurred line? Like I've seen videos of monkeys looking at themselves in the mirror. Idk man.
I'm sure there's going to be some smart fuckeroo in the next couple centuries that figures it out, but I have a feeling that the definition will be argued against (philosophers have never really agreed on one thing so why would consciousness be any special?)
I think the biggest field of study that will need to rework some shit is ethics. If we arrive at even AI that pretends to be Conscious then ethics has a lot to figure out.
youtube · AI Moral Status · 2023-08-20T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhGVarkQWJf1CkTJR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwCQRVPNch-SzcQR214AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxXD-NOrbyBi6pjIpl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx-N8L7QNKgo4rz_up4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyfdTs3QxSXh8Giwt14AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzIKUC96Ht3a5xNAUR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyuiwS1FhS0folRS7V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwF_U_bq_WnntzQwsB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy_EJuifQUx9YCfRIx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz9eFRvPxlBDPOZP0R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```