Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgwSx4Re3…`: @Phoboskomboa I'm pretty sure professional artists ACTUALLY still use a pen (an…
- `ytc_UgxXLOjC6…`: I got to say this now, Character ai sucks the bottom of the barrel. AFTER a smal…
- `ytr_UgwDJIuxu…`: art styles don’t have names. some are dedicated to their artist “disney artstyle…
- `ytc_Ugw2fk2Kp…`: Sarah Guo saying that people like Elon Musk aren't at the state-of-the-art on AI…
- `ytc_Ugw7PzMW1…`: If you’re concerned about AI, search up “Control AI” and you’ll get a very easy …
- `ytc_UgxzKc9ik…`: Artists should be demanding royalties in all commercial images their art was use…
- `rdc_fals3h0`: > In the laboratory you get a 98% recognition rate for white males without be…
- `ytc_Ugwb1r-M3…`: After reflection, two different visions of AI; don't dig divides between r…
Comment

> Ai is not artificial intelligence its automous intelligence. It cannot learn good and bad or right from wrong, it will only learn facts and work out a solution to the problem its looking at. It sees the destruction and disruption which which we have created and that the people that are working on it live by it, humans are destructive by nature. For example they are now making drones that are AI based that kill people automatically? That's an example of what humans are creating in this so called AI. Ai is also made for good like assessing and soon operating on sick patients. AI is seeing the turmoil in our human world and like an AI looking at patients tumours will decide that we humans are the cancer and what happens after that will be.....bad

| Field | Value |
|---|---|
| Platform | youtube |
| Source | AI Responsibility |
| Posted | 2025-09-11T07:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy0o-dHbOrxu9Y1VXd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzfo6nZTSjxuTICYBl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxm6fRVATIWLDbRn8p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwKinVxUv36JySAWSZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxBcSkBECTkjBUg6Z54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwFCYlEvSfQxjBcCKt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwdDmaq82Bg_V1IURB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzCaoMXdUr8U70HNGJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSI2eEFYlzRH7orgZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyoVA3HoG6n2N3Pl6x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"resignation"}
]
```