Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Why? What do robots/AI have to gain by conquering humanity and the world?
Logica…
ytr_UgyvkU837…
We are going to remain working because they cannot put blame over a robot, only …
ytc_UgzGIHwbp…
The problem with replacing any management level with A.I, even if it's provable …
rdc_jsx9ao6
It's actually quite easy to make a robot that is like an average person. Ai do…
ytc_UgzV_wjSr…
Yea....except my company is looking at usage of Cursor...top engineers are pumpi…
ytc_Ugzr02reW…
Damn I have sweared at chat gpt several times. I felt bad but then I thought oh …
ytc_UgwAroPaI…
When AI takes over it will start attacking the human race. We need to build on …
ytc_UgyDux1HA…
The biggest difference between humans and AI:
Humans accept changes,
AI reads wh…
ytc_UgwmmCKfP…
Comment
I disagree about the art concept. There’s an experiential side of art that AI can’t replicate until/unless it becomes fully-conscious and self-aware. Part of the appeal of art is the fact that it might have a meaning behind it, and if so, what is it?
If it wakes up and starts translating the experience of being an intelligent machine into some medium, i would be more likely to call it art.
Not saying AI could never do this, but until that point, how is it anything but soulless (to use maybe too loaded of a term)?
Beyond that, there’s appeal in knowing something was made by human hands. You have to know the intricacies of brush on canvas, chisel on stone, fingers on a musical instrument. These are skills honed over years of practice, dedication, and love for the craft.
youtube
AI Responsibility
2025-10-10T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
  {"id": "ytc_UgzVo8vEK-OV1n04xBZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBOW4Aq_TIawKVqE94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxFt5vcgT17GAcuMa94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwMIqC1UBX4PTv4CKV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxbmMR1LB6NmzLncYh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxXKTDQbvwGBkMzfmZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwwi6Q1jAR7uVmMn654AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugym1PkKRDmjLLpBqMt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugzwys2Sh5OzzQKaTnJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwTKIbWx3Hu5U367p94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
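Because each raw response is a JSON array keyed by the platform comment ID, looking up the coding for a single comment is a parse-and-index operation. The sketch below illustrates this, assuming the response is always a well-formed JSON array of flat objects; the function name `lookup_by_id` is illustrative, not part of any tool shown here, and the two embedded records are copied from the response above (trimmed for brevity).

```python
import json

# Trimmed copy of a raw LLM coding response: a JSON array with one
# object per coded comment, keyed by the platform comment ID.
raw_response = '''
[
  {"id": "ytc_UgxBOW4Aq_TIawKVqE94AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwTKIbWx3Hu5U367p94AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
'''

def lookup_by_id(raw: str, comment_id: str):
    """Parse a raw coding response and return the record for one comment ID.

    Returns None if the ID is absent from the response.
    """
    records = {rec["id"]: rec for rec in json.loads(raw)}
    return records.get(comment_id)

coded = lookup_by_id(raw_response, "ytc_UgxBOW4Aq_TIawKVqE94AaABAg")
print(coded["responsibility"], coded["emotion"])  # → ai_itself mixed
```

Indexing into a dict first (rather than scanning the list per lookup) keeps repeated "look up by comment ID" queries O(1) after a single parse.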