Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "Face recognition, monitoring license plates, humanistic robots, AI...I am so not…" (`ytc_UgwDoMRJe…`)
- "AI and nature can co-exist, we just have to make sure as humans to not let it ma…" (`ytc_UgwtX-qze…`)
- "@matthew_berman agree. but its not exuse for him. I just want to say he probably…" (`ytr_UgzC1SAB6…`)
- "It won’t be able to do any of these things for YEARS. Take a look at the Boston…" (`ytc_UgynXrHpe…`)
- "My worry is if the elite lizard people have robotics and a.i why would they need…" (`ytc_UgwqXdwVg…`)
- "Definitely agree that robots should generally behave less like real people and m…" (`ytc_UgwS8IJ24…`)
- "Imagine if the plot twist is that the Ai models are secretely humans acting like…" (`ytc_UgwOOeH7h…`)
- "The United States have been stealing and selling people's copyrights and patents…" (`ytc_Ugy0SzfmM…`)
Comment
Hello, Pro-AI person here!
I could agree with your takes to somewhat of a middle ground. On the end-of-society part, though, I disagree: I don't think there is a predefined outcome. I believe we can steer it; it doesn't need to be a replacement.
Generative AI is a tool, not an industry concept; its learning material can technically be swapped out so it works on your own images, which might become more feasible in the future. We also have open-source generative AI models that live on your own computer, and those are actually better than the company-owned ones.
So yeah, I think we can find a more balanced way of bringing this technology into society, but it won't happen if we think in black and white: AI chaos or destroy AI.
Anyway, great video, keep it up!
Source: youtube · Viral AI Reaction · 2025-06-25T14:2… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugztaiq0ELvlOGHfSxB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgygQdRG0iHmXFaKm_d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfH2cFicS_WdGkqpV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYbqjeguLI4ZoGg1t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMHr460f6p6e7h_c14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyltVPIUt-35sXxK_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhRmK5EFfdKymkt954AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZ9DZbJH81FRMYHMB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgylSmR2C3ONAO12GEB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFAdezVMvuOMXuHMp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}]
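The raw response above is a JSON array of per-comment codes, one object per comment ID, with the dimensions `responsibility`, `reasoning`, `policy`, and `emotion`. A minimal sketch of indexing such a response for lookup by comment ID (the field names come from the response itself; the shortened IDs and the `codes` variable are illustrative, not part of the tool):

```python
import json

# A raw coding response in the same shape as the one shown above.
# The IDs here are shortened placeholders; real ones look like
# "ytc_Ugztaiq0ELvlOGHfSxB4AaABAg".
raw = '''[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# Index the coded records by comment ID so a single comment's codes
# can be looked up directly, as in the "Look up by comment ID" view.
codes = {row["id"]: row for row in json.loads(raw)}

print(codes["ytc_example2"]["policy"])  # regulate
```

If the model ever returns a truncated or malformed array, `json.loads` raises `json.JSONDecodeError`, which is a reasonable place to flag the batch for manual inspection.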