Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "As usual, scientists like to diminish their idea of consciousness to get to some…" (ID: ytc_Ugz2FihJK…)
- "Several issues with your arguments. Im not rage baiting, but pointing this out f…" (ID: ytc_UgzV5LCTt…)
- "This is an instructor who truly loves their career and is probably deeply hurt b…" (ID: rdc_nu25xdx)
- "If we all rapidly lose our jobs to AI then nobody, including tech investors, is …" (ID: ytc_Ugxg2b350…)
- "Now all of us senior engineers who have been screaming from the rooftops that we…" (ID: ytc_UgzCr51TV…)
- "In the style of 'X' is a very broad term so being inspired by works by Van Gogh …" (ID: ytc_Ugx2ytDu1…)
- "Who will learn if ai regurgitating old (and often bad) data? It doesn't ask que…" (ID: ytc_UgzjyW4EN…)
- "I'd have asked how it knows that it doesn't have feelings, emotions. I once got …" (ID: ytc_UgwK8KiD2…)
Comment
Interesting. The AI has a false base parameter. Excitement is not an emotion. It literally means energized, or put into an energetic state. Full of potential energy. Is an atom conscious? That remains to be seen, but it can surely be excited. It's a scientific term. People use it to describe a state they cannot attribute another word to. It's important to be extremely accurate in definitions when building language models. You could argue that an atom is conscious to ensure the accuracy of your definition or just be upfront that it is completely a metaphor.
Source: youtube · AI Moral Status · 2024-08-01T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyPuqD9x96FNdqS1854AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyD1M_fr1juKeoYzWt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwgN-N9gp0xrDcXoqx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxjrWdHWwic-fGov9N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx607zXtufS104BS_94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzqJJV1fXGfTStLrUN4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy7gYuYVVL07LKmBU54AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "sadness"},
  {"id": "ytc_UgxW--r6lOaDyMqD4x14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzAmo3A43W5ULyePmR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwrCrwFMFI5JZa6oI94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"}
]
```
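The raw response is a JSON array of per-comment codes. A minimal sketch of how a pipeline might parse it into a lookup table keyed by comment ID, assuming the four dimension names shown in the Coding Result table; the validation step is illustrative, not necessarily how the tool itself works:

```python
import json

# Dimensions the coder assigns to each comment, per the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    dict keyed by comment ID, checking that every dimension is present."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{comment_id}: missing dimensions {missing}")
        coded[comment_id] = {d: rec[d] for d in DIMENSIONS}
    return coded

# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgyPuqD9x96FNdqS1854AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgyPuqD9x96FNdqS1854AaABAg"]["emotion"])  # indifference
```

Keying by comment ID makes the "look up by comment ID" inspection above a constant-time dictionary access.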