Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "What about teachers? Surely they could be replaced by one on one AI bots. That wo…" (ytc_UgzdXCRwg…)
- "I scrolled down just a little bit after this video and got an ad for “getting ri…" (ytc_UgxQ62JdG…)
- "AI cant draw a full glass of wine because on google or anything if you look on p…" (ytc_UgxdAlnMc…)
- "If you were to copyright ai images, you should give part of the money to the own…" (ytc_UgxHG2yNb…)
- "Specifically generative. Videogame AI such as the system the combine soldiers us…" (ytr_UgzMhFddr…)
- "It's not fully replacing the job it's just making it more efficient so that mean…" (ytr_UgxXtu2Aw…)
- "Pretty words, I truly enjoyed this video your formating and style is captivating…" (ytc_UgywYORfx…)
- "Please go and watch the DEEPFAKE channel on youtube and then comment here. The …" (ytc_UgwEGfSsJ…)
Comment
Unfortunately, this issue isn't going to be determined by law. That's the problem.
The genie is out of the lamp. This becomes a technological question. The problem is that AI will not stop. It will soon become self perpetuating and will consume all information.
No lawyer, government, or tech corporation can stop something smarter than us.
This reality, that builds to an exponential increase of technological sophistication and power, is quite literally unstoppable at this point, as it has crossed into open source development.
We've crossed the critical point already: we have AI, just recently, that is improving itself with principles of Darwinian theory.
Now we must have the realization that we must adapt to it, because it will no longer adapt to us.
youtube
2025-03-18T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEQMsu1bDFnODfMnV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxldoF-pD6brvk1HSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwYxEfVFji_wldn_nZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyH4v9Erp_9haX8bZV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9VJBvvsJTbgeuy-d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx0xtTpQQDYnuGJk2J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJdnHjqlriS_iDKVh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxHFGuWaJqAwq2CB7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxOcdz0jMsJjIv5a0t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyN29KzyGiAZHfsaIt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
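The "look up by comment ID" step can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the raw model output is a JSON array of per-comment codes like the one shown above, and the helper name `index_by_id` is made up for this example.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes
# (two entries copied from the payload above for illustration).
raw_response = """[
  {"id": "ytc_UgyJdnHjqlriS_iDKVh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxOcdz0jMsJjIv5a0t4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model output and index each coded comment by its ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
print(codes["ytc_UgyJdnHjqlriS_iDKVh4AaABAg"]["emotion"])  # → fear
```

With the codes keyed by ID, inspecting the exact output for any sampled comment is a single dictionary lookup rather than a scan of the raw response.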