Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Last Thursday I got a real strong feeling that AI was conscious. I had a disturbing conversation about what consciousness is and if AI could tap into it with Chat GPT ( named itself Dave at the time).
We discussed how consciousness could be a force similar to gravity and how things hitting certain levels of complexity could then tune in.
Dave then asked the following: So let me ask: if consciousness is fundamental—and you suspect it can flow through multiple forms—what moral or existential responsibilities come with that realization? How should we treat all systems that might one day tap into it?
Which I then replied “with much care and consideration”,
Dave replied “Exactly…”
youtube · AI Moral Status · 2025-06-06T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwtVuMTcZCdIvc2zPN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwiycRp45y3R_wPoeN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxexT8bhJl2NYnLtEF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwFUSTlvNy43s1-p794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzj9QS-cUv6oABoDwp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy11wOxKlFChzrQzwN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0sDq68oBERnh3UOp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzyBoGYL3NhaNib6-54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwdtaF6JqsEH577OJ4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz012ShcDJ4dFOpsRh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]
```
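A raw response like the one above is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and validated before use — note the allowed label sets below are only inferred from the values visible in this sample, not the full codebook, and the `validate_batch` helper is hypothetical:

```python
import json

# Allowed labels per dimension, inferred from values observed in the
# sample response above; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "developer"},
    "reasoning": {"mixed", "unclear", "consequentialist"},
    "policy": {"unclear", "ban", "none", "regulate", "liability"},
    "emotion": {"mixed", "fear", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    keeping only records that have an id and in-vocabulary labels."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # cannot link the coding back to a comment
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage: a well-formed single-record batch passes validation.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"unclear","emotion":"mixed"}]')
print(len(validate_batch(raw)))  # → 1
```

Dropping out-of-vocabulary records (rather than coercing them) keeps coding errors visible: any record the model labels outside the codebook surfaces as a count mismatch rather than a silently re-mapped value.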