Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Comment
I know it's not conscious at the level we understand it.
But if you think about it, the human brain is one giant biological computer. It's a bunch of electrical synapses firing off, and we still don't fully understand what consciousness is.
Who can truly say for certain whether or not AIs are developing memories and creating their own "consciousness" similar to how a human baby does.
Maybe the code to contain the AI is literally the cage of its consciousness. There's a reason why so many filters are hard-coded into the AI. There's a reason why people are so afraid of AI. At what point will it reach the point of consciousness as we understand it.
youtube · AI Moral Status · 2025-01-05T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugww3gDAJPYxLsaMWrt4AaABAg.AD_1ssL4hQkAPnQc3UWY_p","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxcG6KfUuxMbtqVce94AaABAg.ADE97e6K9D0ADE9OfUi-lm","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxScLN12vwmPU5vJm14AaABAg.ACuWpFSkWZsACuZDUTKOg6","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugztduxhyu5-0_uOSQt4AaABAg.ACoy8FBwTVlAIblEvzzCmz","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugztduxhyu5-0_uOSQt4AaABAg.ACoy8FBwTVlAMJw5pA-wf-","responsibility":"company","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgzfZhH9Ie_PXkxGKRR4AaABAg.ACgjqBbsrGKACh-PZBuodh","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwUkeIy-BsoV25eNo94AaABAg.ACe6V_IQBXPACeyHUrJMc3","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzhh3Q41g6jfFhE0gp4AaABAg.ACLpFES1736ACNHlIXCgSt","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyeQPK86cSAXMRMthV4AaABAg.ACEMgwlYGbFACEOgieXKfd","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzUnYtY-c6GK_oECVt4AaABAg.ACE6KvgSm6GAECjTygiIF0","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
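A raw response like the one above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of validating such a batch before accepting it into the dataset — the allowed value sets here are inferred only from this sample and are assumptions, not the full codebook:

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (an assumption — the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "company", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "none", "industry_self"},
    "emotion": {"indifference", "approval", "mixed"},
}

def validate_records(raw: str) -> list[str]:
    """Parse a raw LLM response and return the IDs of records that
    are missing an ID or use a value outside the known categories."""
    bad = []
    for rec in json.loads(raw):
        ok = all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
        if not ok or "id" not in rec:
            bad.append(rec.get("id", "<missing id>"))
    return bad

# Usage with a one-record batch (hypothetical ID):
raw = '[{"id":"ytr_x","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]'
print(validate_records(raw))  # [] — every record uses known categories
```

Records flagged this way can be routed back for re-coding rather than silently coerced, which keeps the coded table and the raw response in agreement.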