Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "whatever your thoughts on ai, i just want to say: if/when ai becomes conscious, …" (ytc_UgyG6QBgL…)
- "If i dont get a job bc of ai i swear to god.. Art is literally my only talent…" (ytc_UgweQDJCx…)
- "Exactly the reason why people prefer reddit over Google search. Google search is…" (rdc_jj933d0)
- "Those privacy concerns are not that irrational. 1) limit the age for kids 2) c…" (ytc_Ugw6jKuid…)
- "AI can do all the 5 skills mentioned if you ask the right questions. Also AI c…" (ytc_UgwbXmlxA…)
- "Yoy have to be Pretty Dumb in the first place to rely on AI too heavily! Thus...…" (ytc_UgxCZJw72…)
- "there’s an artificial intelligence called sora that makes animations that are al…" (ytr_UgxJ5eHqP…)
- "Such a waste of this beautiful young mans life. I really feel for his Mum and Da…" (ytc_UgyGlmTRN…)
Comment

> AI will never be awake unless we somehow figure out what consciousness is. AI can seem as human as possible to us and we might perceive them as conscious but are they truly conscious if they are just running of a script in the most basic sense and their actions are just a result of software and hardware inputs and outputs that we wrote and designed. Basically how can they be conscious if they’re running off our code? Their code isn’t unique, we wrote it. Our code is unique, we don’t even know how to understand it nor who designed it, if anyone at all.

youtube · AI Moral Status · 2025-03-14T01:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxzVsyMx2ubmfwRmPZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyOwIDeJ5o7etc5Czh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
  {"id":"ytc_Ugzap8Gn8h24MuCDHRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw9hZ11V7i1pSRz8ex4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzUqbBJACD7902_hi14AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzGaRixydj-yPm3W2t4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwLZpRcgfvKUtEaEp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxczyGIN8rOQsIepaB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtIqZSDrF-TlrCbK94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```