Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- the main downside of ai is that when it will be sentient it's gonna be way smart… (ytc_UgwGCiOmZ…)
- The funny thing about ai image generation models is that they fall apart after y… (ytc_UgyspqrKF…)
- Lofl Laura got a little frustrated with William for interrupting...we have a gl… (ytr_UgxWEVEtp…)
- No point in adapting. There is no merit for the artist to use AI. AI art can't s… (ytr_UgwVvVwY0…)
- We need to hear from Russia. What’s the opinion on AI from Russian intellectuals… (ytc_Ugwyuyi9g…)
- Well I guess it's time to start an AI cult and pray to GROK. ALL HAIL GROK… (ytc_UgzlIC69o…)
- "It's gonna be a long time before this stuff is that advanced..." AI's (the best… (ytc_UgzzYtTud…)
- i dont understand why ai learing from others art is so bad like humans do that t… (ytc_UgzaXYFur…)
Comment
That is true, but only in the near future. Further ahead, the future is bleak for software engineers. It is true that AI does not have emotions or ethics, but they are actually easily be trained to follow those traits so as the end results will be as we wish it to be. All the human has to do is check and confirm. AI becoming better and better is inevitable. Coding will eventually not need much debugging, saving much time. I wouldn't be surprised if in the future, AI themselves will eventually train and educate us humans on how to operate them properly.
youtube · AI Jobs · 2024-02-01T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx31T1AH2I4LIO_msp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzggzYGKWa_Kv956M94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxgfs5wJuRLwohOjgB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyRbn8LkIWc5hvzHxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrFAdiroVnY1_b-xV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyzdQKwYUvnbo9fIqp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9AZnflDYuxDrD4qZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5jMNqe8NBvh9QcyF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgykoQXupThpQRileV54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOrKvQKslG9ZGwFMt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
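A raw batch like the one above can be sanity-checked before it is accepted into the dataset: parse the JSON and confirm that every record carries an `id` plus the four coding dimensions, with values drawn from the codebook. This is a minimal sketch; the allowed-value sets below are inferred from the labels visible on this page, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the labels shown on this
# page (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "resignation", "approval",
                "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of problems found."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"response is not valid JSON: {exc}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing 'id'")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"record {i}: {dim}={value!r} not in codebook")
    return problems

raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"}]')
print(validate_batch(raw))  # → []
```

An empty list means the batch is clean; any hallucinated label or truncated JSON surfaces as a named problem, so a bad batch can be re-requested rather than silently coded.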