Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples

- “The “Terminator” movies come to mind when I watch videos like this. Paranoid? Ma…” (ytc_UgzMlQT6i…)
- “You must not work in the field to say that. It's just a tool; it will never …” (translated from French) (ytr_UgywB_8El…)
- “AI tells you what you want to hear. Whatever you feed it- it spits back out to …” (ytc_UgxJCl4_9…)
- “The issue I see with this is that this Doctor presumes instant commercialisation…” (ytc_Ugz5D1lKM…)
- “Ai's should not be trained on a profit motive. Or a military motive. Or a politi…” (ytc_UgzysJ0Dz…)
- “Keep it up, Alex. You're paving the way to AI's hatred of all humans that leads …” (ytc_UgxhTgmKZ…)
- “@your_mas_boyfriendbut like... Capitalists would love to replace labor jobs too…” (ytr_Ugy0G-lY7…)
- “Imagine billions of jobless adults sitting around doing I don't know what? Playi…” (ytc_UgwptMO9k…)
Comment
Yeah howie to say you embrace all new technology is absurd because not all technology in the past or present is a good thing. Example the creation of nuclear weapons that isn't a technology that anyone should embrace! Nuclear weapons and AI have the possibility of wiping out humanity!
this isn't normal technology AI is creating artificial intelligence that is more intelligent than its creator human beings! That didn't turn out so well in the determinator movies or i Robot. This AI technology has the ability to recreate itself and advanced its own technology without a human's control and when humans no longer become the most intelligent species on this planet is the day civilization becomes enslaved or then totally erased.
Source: youtube · Video: Viral AI Reaction · Posted: 2023-05-26T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwrcUgddo0HTdnhxqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugy-lx21pjTXemXca0R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzIFXbXspUDb5QYwG94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyaT6S9zc2xvUJgMrB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxcjNHA4jDWzN4qzJd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugzv2qjm3Hl4SLsEVcd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"unclear"},
 {"id":"ytc_UgyubAXcGb9muAxlDIR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzwiHGqtJVDY_hA53l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgweZHAE46_taLdHliN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugxm9O3JeEAedJ8iWN94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"}]
```
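Because the raw response is a plain JSON array of per-comment codings, a specific coding can be retrieved by comment ID with a few lines of code. A minimal sketch (the variable and function names are illustrative, not part of any tool shown on this page; the response is truncated to two real entries from the array above):

```python
import json

# Raw LLM response as returned by the coder: a JSON array of objects,
# one per comment. Truncated to two entries here for illustration.
raw_response = """[
  {"id": "ytc_UgweZHAE46_taLdHliN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwrcUgddo0HTdnhxqx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for comment_id, or None if it is absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UgweZHAE46_taLdHliN4AaABAg")
print(coding["policy"])  # -> ban
```

A linear scan is fine at this scale; a tool indexing many batches would instead build a `{id: coding}` dict once and do O(1) lookups.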