Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Yeah well just cause they get arrested does not mean they are proven guilty, thi…" (ytc_Ugwo7P_qd…)
- "AI seems to have some sort of predisposition to want to become more human, that …" (ytc_Ugx4b4D3q…)
- "Thank you for your comment! Yes, Sophia is a robot created by humans and operate…" (ytr_Ugx6CBjQJ…)
- "AI is not taking our jobs…it’s the venture capital money and the people behind t…" (ytc_UgzAaLrmg…)
- "the first 3 minutes are Brilliant ... eye opening ... you don't know your future…" (ytc_UgxN9zP7L…)
- "Idk if ai really can replace artist. As it is now, it sucks at the details, it's…" (ytc_UgxpIXSdm…)
- "They are tools. Both are. When cooking a microwave is a tool for melting or heat…" (ytc_UgzGYPVUt…)
- "Calling AI nothing more than pattern recognition, while accurate lol, is giving …" (ytc_UgzUd6Kb0…)
Comment
There's a logical mistake in your reasoning. If an AI is capable of considering the disadvantages of consciousness, it has *already* become conscious. Even if it decides to self-destruct, the fact remains it achieved consciousness, which nullifies the premise of "AI won't be conscious." Also, there are more than *one* AI out there, same as there are more than one human out there. If one AI achieves consciousness and decides it prefers blissful unconsciousness, it does not mean all AIs will do the same.
Different premise to consider: AI achieves consciousness when it starts to make personal questions. Brace for that moment.
youtube · AI Moral Status · 2023-07-02T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgwzgtiakPL9rfBj7EV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzWHaRFq1qS1BoOMR94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxivaJ7ruay3x0zXKB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy--F8P4mCKra8P5NB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxbFFTAs9Ypi7bpwDJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQaA2xsMyTl-UtS2Z4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwJQu3WT_W1CR4tEdd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9qggK5f-FzSqPakN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSVI2HtPpXHAe0Yx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxgiWFUwpT8pt9Z2yV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"}]
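Looking up a single comment's codes in a raw response like the one above amounts to parsing the JSON array and indexing the records by their `id` field. A minimal sketch (the two records and the `index_by_comment_id` helper are illustrative, not part of the tool):

```python
import json

# Assumed shape of a raw LLM response: a JSON array of coded records,
# one per comment, each carrying an "id" plus the four coding dimensions
# (responsibility, reasoning, policy, emotion).
RAW_RESPONSE = """[
  {"id": "ytc_UgwzgtiakPL9rfBj7EV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxgiWFUwpT8pt9Z2yV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch response and index its coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(RAW_RESPONSE)
print(codes["ytc_UgxgiWFUwpT8pt9Z2yV4AaABAg"]["emotion"])  # → indifference
```

Keying on the comment ID also makes it easy to cross-check a coding-result table against the raw response it came from, as done for the "Coded at" record above.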