Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- As soon as he said Musk has no moral compass I am done listening to him. Musk ac… (ytc_Ugx9pIoXp…)
- I agree with your conclusions and am not trying to argue the dangers of such adv… (ytr_UgzAUgaIp…)
- Like even has someone who severely struggles with art envy and personal jealousy… (ytc_UgyYdCNWE…)
- You think this is funny you just wait AI Will be having the last laugh… (ytc_UgwA83E-c…)
- Here we go again. 50 years from now people will start protesting for robot civil… (ytc_UgwjSiIt-…)
- You should us cynic chatgpt, it's much more direct and more so forms arguments r… (ytc_Ugw9L2LN5…)
- Honestly I feel like if a patient is ok with getting surgery from a robot, a pat… (ytr_UgzX4_vim…)
- Sounds like fake examples of things ai never does. ai is not reusing your storie… (ytc_UgwXLumbm…)
Comment
OMG SOMEONE ELSE MENTIONED THE INTENTIONALITY OF HUMAN ART
I use it when I'm literally talking about intentions though; an AI can't put things in a work on purpose. It doesn't have a mind as we understand it, so it can't do things for a reason, if that makes sense. I'm more familiar with writing, so I'll use that as an example: an author can reuse specific phrases in specific ways as a callback, and they'll be more consistent with their themes and literary voice. You can ask AI to input that, but it can't really capture it the same way.
Source: youtube · Posted: 2024-10-25T11:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyT8-7hJsJmP3-qTvF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwQO8u0_02lImOQHeJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzGvTmFZcaCklI77Gx4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyWuA74snFJAkuNN2B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx9neFc8iCzUIOA4HN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy5jjvajfTuim6H5BV4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugzq4icI2UxiEftW7rB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyGf2YScnVs8ys71XB4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxy40i8LSyrl-e0AZt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwYPbo9yg0mWo1-U9B4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
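The lookup-by-ID view above boils down to parsing this batch response and indexing each coded record by its comment ID. A minimal sketch (the function name `index_by_comment_id` and the shortened three-record sample are illustrative, not part of the tool itself; the field names match the coding schema shown in the table):

```python
import json

# Raw batch response as returned by the LLM, shortened here to three of the
# ten records above for brevity. Field names follow the coding schema:
# responsibility, reasoning, policy, emotion.
raw_response = """[
{"id":"ytc_UgyT8-7hJsJmP3-qTvF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGvTmFZcaCklI77Gx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwYPbo9yg0mWo1-U9B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the JSON array and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)

# Look up the comment shown in the detail view above by its ID.
rec = codes["ytc_UgzGvTmFZcaCklI77Gx4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["emotion"])
# ai_itself deontological approval
```

With the full ten-record response, the same lookup recovers the "Coding Result" table for any inspected comment.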