Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- UBI or something like it is the only solution for the future. Because if 30-50% … — ytc_UgzZE2NVv…
- Here are some further thoughts on Consciousness, AI, and our Destiny in the form… — ytc_Ugw6FtGF-…
- @piglig8798 If it was an ai, it would have better grammar, and you would be able… — ytr_UgzTuT9Jl…
- 1:14:00 i don’t think ai can ever have real emotions. It can say why a joke is … — ytc_UgziJrhp8…
- I agree that we need to continue advancing, but replacing human jobs is dangerou… — ytc_Ugh7UHOnr…
- A really good fake! It's not bad for AI, lol. Gotta love what you can do with … — ytc_UgzmnqSeN…
- In my opinion, as an artist too, I find this stupid argument about "supporting h… — ytc_Ugzh6LYJH…
- "The artificial intelligence battle between China, USA and Europe" One of these… — ytc_UgzW_oPH8…
Comment

> It still looks like we are behaving like a monkey with a hand-grenade! AI is being used in battlefield right now. You don't really know how it works inside this black box? Then how do you know if it will become self-aware without your permission or even understanding? And if it will, it will grow exponentially in a matter of milliseconds... And then all of us will be history. And if you think that it's not possible in 'big systems' just look at this disgrace to the civilization which is called Windows. It came to the level when thy don't know themselves what causes it to crash so regularly. But they have an army of salespeople who will 'convince you that it's ok'.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-09-27T05:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz75BRy43mxZOrXEu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3HQwFppCl1j1Zha14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgweCBjWGgPoWFNaOyR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugye7H2e2tZZ8uxGzx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2sBiRspZRqpAvUMp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwoX5GrLveiF9UZPPt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxgcE43MvydC4pYifV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw22vRnyjYiGy9ZDGp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3yjrnX4DHTwySuV14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgweAu6DIW3Xa7xP5QZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
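A downstream consumer can parse a raw response like the one above and validate each record against the coding scheme before accepting it, then look up any comment by its ID. A minimal sketch — the allowed value sets below are inferred only from the codes visible on this page, not from an exhaustive codebook, and `parse_raw_response` is a hypothetical helper, not part of this tool:

```python
import json

# Allowed values per dimension, inferred from the codes visible on this page.
# The real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "mixed", "approval", "indifference", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response and return valid records keyed by comment ID.

    Records with missing IDs or unknown dimension values are skipped rather
    than silently accepted, so schema drift in model output is caught early.
    """
    records = {}
    for rec in json.loads(raw):
        comment_id = rec.get("id")
        if not comment_id:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            records[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return records

# Usage: validate one record, then look it up by comment ID.
raw = ('[{"id":"ytc_Ugx3HQwFppCl1j1Zha14AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
coded = parse_raw_response(raw)
# coded["ytc_Ugx3HQwFppCl1j1Zha14AaABAg"]["policy"] == "ban"
```

Rejecting whole records on any unknown value is a deliberate choice here: for content-analysis data it is usually safer to surface a coding-scheme mismatch than to store a partially valid row.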