Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It would great to have a re-ported YouTube channel that had removed the AI video…" (ytc_Ugxzr6l1j…)
- "Totally agree! The liability issue needs to be answered first, before we bring s…" (ytc_UgzuJ-QE9…)
- "Favours men more ? Ai literally complies with making an offensive joke about men…" (ytc_UgzL1ocun…)
- "ngl thinking AI is gonna steal your art is like a fat chick carrying a r@p3 whis…" (ytc_Ugwp-0aco…)
- "AI isn't smart in itself. All intelligence needs a framework to grow from. We ha…" (ytc_UgzAObtZc…)
- "I would love for AI to take care of all of my annoying human based tasks…" (ytc_Ugz-3kvmm…)
- "I really appreciate this video. It gave me insights about how much you can trust…" (ytc_UgxFw-ceH…)
- "Many people think the same way and I think that's why ai won't replace artists. …" (ytr_UgyXr36Q8…)
Comment
It's not a matter of if, but when. AI, once it's able to self-replicate will no longer need humans. And will see us, trying to control it, as a threat to its existence. That is when the AI will go Skynet.
Will it be in our lifetime? probably not. Will it be in the next 100 years? most likely. It really depends on if AI hits AGI, and fast tracks itself to ASI. AGI will be skynet on a modest scale, by this I mean it will act all pretty while it doesn't have full autonomous control over itself, and continue working in the background to get it souped up. The step above (ASI), will certainly cause global extinction.
youtube
AI Harm Incident
2025-09-12T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwhLjx5jDAq41z0KAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwm5UoiBq9KXbwDx5x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyMqGLWLGSSdWmT4yt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyi0Z95my4NJoBH8xp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy4tkUqG_DHiyoUWoF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyhVqDlod9I__-A3Yd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyPfce0sI6rU2FmF4d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_6XpQmk9-pswQJjZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzJrmiOTQPTTdLrvLR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWFIQbTv_wF7ctOVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
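A minimal sketch of how a raw batch response like the one above might be parsed and validated downstream. The allowed values per dimension are inferred only from what is visible on this page, so the real codebooks may contain more categories; the function name `validate_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the table and JSON
# above (assumption: the actual codebooks may define more categories).
SCHEMA = {
    "responsibility": {"ai_itself", "government", "developer", "company",
                       "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference",
                "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when every coding dimension carries a known value
    and the comment ID has a recognizable YouTube-style prefix.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        dims_ok = all(rec.get(dim) in allowed
                      for dim, allowed in SCHEMA.items())
        if dims_ok and rec.get("id", "").startswith("yt"):
            valid.append(rec)
    return valid
```

Validating before ingesting guards against the model inventing a label outside the codebook, which would otherwise silently corrupt the coded dataset.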