Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Question: You created a program that can generate a completion to a prompt. You saw how well it performed and learned how to turn tasks into text completion. You created a chatbot. It only requires a definition of what it is and a few lines of dialogue. If you define it as advanced sci-fi AI or a person it insists on being conscious, but when defined as a chatbot or AI assistant stops saying it's self-aware. Is a program sentient if it requires to define what it is to act accordingly?
Platform: youtube
Topic: AI Moral Status
Posted: 2021-05-03T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyiVhpUHGj5CYLtzV94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7-bD7KCzyMK7Gpx94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTD5DYraQZCgVCYQF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHzHLLEavbTt1nvCt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMuF1HAVmiLZL9gYt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxMyhgRzKLITx-vtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwLE_BLxnWOgFNixA54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyCVuy9G2lUE0WQcR54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1sk_7O_rte70vZo94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyKM63xrI94XZJQKvV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
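The raw response above is a JSON array of per-comment codes. Since LLM output is not guaranteed to follow the schema, it is worth validating each row against the codebook before indexing by comment ID. The sketch below is a minimal example of that step; the allowed category values are inferred from the outputs shown on this page (the full codebook may include more), and the function name is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the coded rows above.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

def validate_coded_rows(raw: str) -> dict:
    """Parse a raw LLM response and index valid rows by comment ID.

    Rows with a missing ID, missing dimensions, or out-of-codebook
    values are skipped rather than raising, so one malformed row
    does not discard the whole batch.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical two-row response: one valid, one with an invalid value.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"mixed",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"mixed"}]'
)
coded = validate_coded_rows(raw)
print(sorted(coded))            # ['ytc_x']  — invalid row dropped
print(coded["ytc_x"]["emotion"])  # indifference
```

Indexing by `id` is what makes the comment-ID look-up on this page possible: each coded row can be retrieved and displayed next to its original comment.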