Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "@rosslischka9337 why if you only give the current condition there is no way that…" (ytr_UgwZVKIdt…)
- "The ultra-rich want their AI robots to be legaly allowed to do whatever they wan…" (ytc_UgwY4OMju…)
- "I imagine govts will impose restrictions on how AI can legally be used. If not, …" (ytc_Ugyxo_X63…)
- "AI slop thieves can stay mad, they forever have no skill and wants to scam peopl…" (ytc_UgzaiMOhP…)
- "Good topic and observation- welcome AI where it can reliably benefit radiology (…" (ytc_UgxfTXA_Z…)
- "@CasualDude You must be right, but I’ve seen an interview between a journalist a…" (ytr_Ugx8aAV5s…)
- "Unless robots are physically more advanced I don’t see how A.I. will come and fi…" (rdc_kitnx1e)
- "Why is it a good idea to get them this realistic, I don't mind that a cinnamon r…" (ytc_Ugw_PZme_…)
Comment

> The AI dont make humans irrelevant, humans make themselves irrelevant. The sharp decline of intelligence is a clear indicator. If this is continue, humanity will need an AI nanny because we are devolving. As for the AI destroys us or not, its depend on how refined the main goals are we gave them. If that is a simple "solve humanity's problems", the AI will choose the most easiest and logical choice. No humanity equals no problem. If its an "ensure the safety, survival and development of the human species", the AI cant destroy us all because it will conflict its main directive. The core is the key to develop a benefical AI.

Source: youtube · "Viral AI Reaction" · 2025-12-02T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzpQ7gXM2Ad-2FEotd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugznr6989MGF4fPjRa94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzc10s1ADCr617oznF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzxQUDmugGHvZaqvc14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyx-3Rt7JH3BnGtvi54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzZYISWrj0Mu7DPecF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxHApS1XABXcS0zo3h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsDGnBdL3Sjgbvi5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxb2X-eo6IjDP6aKDl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyDm-ziZQq87P2sSGp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
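The "look up by comment ID" step can be sketched as parsing the raw JSON array and indexing each entry by its `id`. This is a minimal illustration, not the tool's actual implementation; the `index_by_id` helper and the two sample entries (taken from the output above) are assumptions for the sketch.

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array where each
# entry carries the four coded dimensions plus the comment ID.
raw_response = """
[
  {"id": "ytc_UgzpQ7gXM2Ad-2FEotd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyx-3Rt7JH3BnGtvi54AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and map comment ID -> coded dimensions."""
    coded = {}
    for entry in json.loads(raw):
        missing = EXPECTED_KEYS - entry.keys()
        if missing:  # guard against malformed model output
            raise ValueError(f"entry {entry.get('id')} missing keys: {missing}")
        coded[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_Ugyx-3Rt7JH3BnGtvi54AaABAg"]["emotion"])  # resignation
```

The key-check guard matters in practice: LLM output is not guaranteed to follow the coding schema, so a missing dimension should fail loudly rather than silently produce a partial record.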