Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The Problem is not that we program them, the likely scenario under which an Sentient AI can arise is when another AI thinks its necessary to design future itterations with emotions that would make them sentient. On a hardware level no human designs many electronic products already. Because they are too complicated, an SSD for example has billions of transistors, no human ever did design those by hand. A Software did. That is happening since the 80`s software goes a similar route. At one point Hardware will be (and in many cases IS) so powerful that the way to efficiantly program advanced software for it will be with the use of an AI. In my opinion we are very close to something like an Singularity.
Platform: youtube
Topic: AI Moral Status
Posted: 2017-02-24T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgjntJCtmsi_wHgCoAEC.8PLObkpQrKn8PLZVGyKF5j","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLYIUu16Gc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLcMpgbgnK","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UghCaM6d7GAEr3gCoAEC.8PLN7YbI0QS8PLhYxG5hFS","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UggLshsEzXkadHgCoAEC.8PLMgF14Tpo8PLSzKcKCE_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLRQ7AgxGX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLUo1NKx7h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UggnjeyPzPMAnHgCoAEC.8PLLgi-1mHJ8PLc7OrvKsT","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UggWtTsvmDUhMHgCoAEC.8PLLSmpTGIb8PLMBFIeFTO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgiORVzs3ZYA-XgCoAEC.8PLKxT91UgG8PLQaTYn5Ai","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
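The raw response is a JSON array in which each object carries a comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-ID step, assuming only this schema (the IDs and values below are illustrative, not real codings):

```python
import json

# A toy raw LLM response in the same shape as the array above.
# The IDs here are hypothetical placeholders, not real comment IDs.
raw_response = """[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw response and build a comment-ID -> coding mapping."""
    codings = json.loads(response_text)
    return {coding["id"]: coding for coding in codings}

# Look up the coded dimensions for one comment ID.
lookup = index_by_id(raw_response)
print(lookup["ytr_example2"]["responsibility"])  # -> ai_itself
```

Indexing the array once into a dict makes each subsequent ID lookup constant-time, which matters when one response batch codes many comments.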