Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
"Yknow you cant code a poem" - .... really? I am the code. I dont need an exteri…
ytc_UgxWcdATn…
Why is everybody so negativ about AI take all our Jobs😂 fg nice, finally we can …
ytc_UgxKZlfRR…
If by using AI, companies are going to layoff more and more people then to who w…
ytc_UgwRcEKAx…
I'll never fully trust AI
It will be the end of Us
God will have to come back…
ytc_Ugzrhe0P7…
I honestly feel that all these talking may be have some kind of idea and even mo…
ytc_UgxCog4IC…
It should be the goal. Why should people be doing jobs that machines, robots, a…
rdc_mvbgp6r
This exactly.
Humanising what isn’t human is a trap. A lot of users have been r…
rdc_nnl4xt4
Yes but a right is something you fight for, it's not about waiting for the powef…
ytc_UgyCQT40l…
Comment
Also it doesn't matter if you don't want it using your books for reference. Anyone can attach any book to any LLM and it will use it as a reference to write in your style. To answer questions. etc. You can have this hardline position. And its fine. It doesn't change anything. LLMs are trained on all the information available in the internet. The only thing you can do is perhaps ask to be paid for the copy used. Other than that. Just like you cannot stop a person reading your book and using that as inspiration to write their own book. You can't stop a person using your book to train an LLM.
youtube
2025-08-21T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwlzou_6MMfX8WIu2J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxTPLcU9wW0mplpHZJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz7HCjtNF7lstseo3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugzoh9lPu2qjgq9WSap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOYk1lhL9hRYFQdzx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgztUqXoGSdl1D779LF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwtJdR5o_yabt7my5d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4Db_MiWnLHkZAaVh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxWgMYJCEaImJBm4HR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZRR3_igzsXMH7Onl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
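The raw response above is a JSON array with one object per comment, each carrying the comment ID and the four coding dimensions. A minimal sketch (plain Python, not tied to any particular tool; the two records are copied from the response shown above) of indexing such a response so that individual codings can be looked up by comment ID:

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# Format and values taken from the response shown above.
raw = '''[
  {"id":"ytc_Ugx4Db_MiWnLHkZAaVh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwtJdR5o_yabt7my5d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Index the rows by comment ID for constant-time lookup.
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_Ugx4Db_MiWnLHkZAaVh4AaABAg"]
print(row["policy"], row["emotion"])
```

In practice the model output would first be validated (e.g. checking that every object has all four dimension keys) before being indexed, since a malformed response would otherwise surface only at lookup time.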