Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgxVExTa6…` — "Unfortunatly, AI is most likely the next evolutionary step in life. Organic life…"
- `ytc_Ugy_E_Hcr…` — "Oh no i cant tell the diff between real and robot! 😂 really! Oh no were close to…"
- `ytc_UgxycYaAF…` — "I'm a beginner artist with the passion, but no talent. I will keep practicing un…"
- `ytc_Ugza_CjD6…` — "Bro, I've heard that there is a country whose domain name is .Ai, and because of that…" (translated from Hindi)
- `ytr_UgyteRZSX…` — "@Jaxolp45 Algorithmic ai does stuff i guess, but generative ai is ironically on…"
- `ytc_Ugyi1GRim…` — "This is why they want to mass un alive us. AI is the future, less humans are nee…"
- `ytr_UgxLXt7Ic…` — "The outcome of all this could possibly be what we saw in "The terminator" movie.…"
- `ytc_UgwXHkJRX…` — "Howdy! Disabled artist here! AI "art" sucks and I genuinely don't think AI "arti…"
Comment
If AI is or has the potential to be as dangerous as nuclear weapons, then having these in a select few private indivduals control is ridculous. They need to be government controlled and collectively owned. Not a perfect solution but I dont trust some very wealthy americans to have that amout of power. I also support the notion that as these models were trained on the collection of human knowledge, books, scientific papers etc. Then they cant claim to own it legally, as they did not create the content that trained them. I really hope for our sakes AGI won't happen and its hype; I just can't see a future where this technology has a net benifit to humantiy as a whole.
youtube
AI Governance
2025-09-05T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwwGUVLNKMYvmhQfdx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzBhBfdht3oo5H9KlJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyPJg6y2D1p9_jfAFh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwdq0L4Kka-yNaJcOd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxcIuHeFOBimFR1GZZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGOGjcDoEhtzQD1d54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgziOO8XsISiRuVjPUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxvOEY8BoGddeVRght4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXMFKRbZALRxjg22x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxgw_kbJ7MK-lcMzVp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
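A raw response like the one above can be parsed into a lookup table keyed by comment ID, which is how the "Look up by comment ID" view can resolve an entry. The sketch below is hypothetical (the tool's actual code is not shown); the allowed label sets are only those observed in this sample, and the real codebook may define more.

```python
import json

# Label sets observed in the sample response above; the full codebook
# (hypothetical here) may allow additional values per dimension.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"mixed", "virtue", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting records with unknown labels."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record taken verbatim from the sample response above.
raw = ('[{"id":"ytc_UgwGOGjcDoEhtzQD1d54AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
coded = parse_coding_batch(raw)
print(coded["ytc_UgwGOGjcDoEhtzQD1d54AaABAg"]["policy"])  # regulate
```

Validating labels at parse time catches the common failure mode where the model invents a value outside the codebook, instead of letting it silently enter the coded dataset.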