Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
For the “cat out of the bag part” there is probably enough public domain data to train AI to do creative writing and even image generation. No real option to go back.
Also as someone who studied AI before it was cool it is really all or nothing. The way the models learn leaves very little space for tweaks. Curing cancer (while that is mostly done with CNN and other models) and writing a book cannot really be separated.
Unless we decide to take the Dune approach it would be best to really enforce copyright upon the companies producing it. You can build quite sophisticated AI models to stop anyone from getting anything “in the style of” or “inspired by”. etc. If there were heavy fines and motivation for the companies producing it that would be way to go IMO. Otherwise given how the technology works complete ban is the only option. There are so many ways how to get around other regulations (data laundering, destilation etc.).
Source: youtube · Posted: 2025-07-01T16:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwm_xrWLFhbgApJztR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLYNtGQq1RkuFZpSF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwhRu3PUrKIxuf0FaZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8nLxU5WP2lNrwxlF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwI-ITYi8Xy-4y-nrx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwW3OuEdFyf9nK55v94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzPhP9geq2hrOET-xx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxc_If0q9WCp6X3lbZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwrCIGZSupyOOLssTJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygTZdf-SMv2iR-K7R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
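The raw response is a JSON array in which each element carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of looking up one comment's coding from such a response — `index_codings` is a hypothetical helper, not part of the tool, and `raw_response` is abbreviated to two of the rows shown above:

```python
import json

# Two rows copied from the raw LLM response above (abbreviated for illustration).
raw_response = """
[
  {"id":"ytc_UgwhRu3PUrKIxuf0FaZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8nLxU5WP2lNrwxlF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the model's raw JSON output and map comment ID -> coded dimensions."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgwhRu3PUrKIxuf0FaZ4AaABAg"]
print(coding["emotion"])  # indifference
```

In a real pipeline the raw model output may include extra text around the JSON array, so a production version would need to extract the array before calling `json.loads`; the sketch assumes a clean array as shown here.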