Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
ChatGPT like other AI programs can be used for good or bad, but if it were to be…
ytc_UgyFhhdfp…
Ai can never create what needs emotion in it. Ofcourse u will give it prompts a …
ytc_UgyAwdXap…
Whether intentionally or not, I think Ex Girlfriend AI is trying to convince you…
ytc_UgxC2gW5u…
When Telsa has been confronted by government agencies about their claim and adve…
ytc_UgzWwisOg…
as a indie game dev with no 3d/2d art skills what so ever I can confirm that art…
ytc_UgyxExGTa…
But can't there be AI (or other algorytms) specially trained to reverse nightsha…
ytc_UgwnB80pV…
Ya know, out of respect for how convincing AI can be, I think we should be point…
ytc_UgwjhY9SA…
@ So ur saying that mass producing things that took zero effort versus slowly pr…
ytr_UgwnAayuj…
Comment
Based on your dinosaur analogy how can we claim we ourselves are even conscious, or do we infact merely try to emulate it to get what we want. afterall we are trained similarly to a puppy by its mother to behave to be rewarded with our needs and wants. (effectively programming, the neurons inside our brains) when i have a thought process i am drawing from knowledge i have learned and an ai would also only be able to draw from its sources. when i am thirsty i process where/how/what i will drink... does that make me conscious?
Which is a word we have put in place much like the concept of time to describe something we claim to own. if an ai was implanted into a human brain and functioned just as we do could you still argue its lack of sentience?
youtube
AI Moral Status
2023-08-21T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxhScuUOtRFTabR0C14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzY8StKi1iYEHSuEgJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyVUkr6ZObxsAJ2ihh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwu0SKI6PvNLxswvdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzxP2zJ3Lp0FMzXQw14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5M3Li_xQNfbuYT0B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw0M9aUKL_PY_lQmtp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwZ7-7g4UwpKu3h1IF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy6M12eZ2hA9Aj4yB14AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz2a00CIqW6yOxiLPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
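The raw response above is a JSON array of per-comment codings, one object per comment ID, which is what the "Look up by comment ID" view queries. A minimal sketch of that parse-and-index step is shown below, assuming only the JSON shape visible above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function and variable names are illustrative, not taken from the tool itself.

```python
import json

# Two records copied from the raw LLM response above, used as sample input.
raw_response = """[
  {"id": "ytc_UgxhScuUOtRFTabR0C14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzY8StKi1iYEHSuEgJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]"""

# The five dimensions present in every coding object above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model output and build an ID -> coding lookup,
    dropping any record that is missing an expected dimension."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = index_codings(raw_response)
print(codings["ytc_UgzY8StKi1iYEHSuEgJ4AaABAg"]["emotion"])  # -> outrage
```

Validating keys before indexing matters here because LLM output is not guaranteed to be well-formed; malformed JSON would raise in `json.loads`, and partial records are silently skipped rather than surfacing later as missing dimensions.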