Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews, with comment IDs):

- "Unfortunately not a realistic approach. - Yes there is still time for AI to whol…" (ytc_Ugx7BzHAh…)
- "We can only hope that ALL courts in future deal with AI 'artists' in exactly the…" (ytc_UgxHwlEgJ…)
- "What..? She never posted nude photos, there's a website that makes it look like …" (ytr_UgwKb4k9w…)
- "I agree with your fair use opinion but disagree with the 'AI can't do that becau…" (ytr_UgxMLCYT5…)
- "Technology is evolving but the laws are not. Forget deepfakes, it is extremely …" (ytc_UgxrLEXyJ…)
- ""Yes, you will be unemployed and homeless, but you'll get to use chat gpt for fr…" (ytc_UgyvpzHW_…)
- "The problem is that she should be able to be a influencer without being deepfake…" (ytr_Ugyk5Qftf…)
- "True, AI is replacing careers and will continue too. This will happen mostly re…" (ytc_Ugz3Wo4Xt…)
Comment
Elon Musk is already among the first on the cusp of a best answer to risk from AI called Neuralink leading to a merging of Humans and Ai. It will be us and we will be it. First, we should use AI to reduce human aggression levels and increase intelligence; and then, to merge. The power of future AI and technologies will be so powerful that the only option will be for mankind to become the meek that inherits the earth, or we will self-destruct. Sadly, most media accounts rarely mention merging as an answer to saving mankind from future AI amok.
youtube · AI Governance · 2023-12-04T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyyXjTzLPsiVvs1f-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdrxT1yIN3xcrj4q94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxd1vvjnuueT7e6dPt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzWWlzCnaE1ul-0zaZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxk8hyXb-KQj8RqVJl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxazyOYSiLsGQM2x9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRRuR2zqqe1_Om22x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz93K-qM5KUdXOFvtx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwHb1tUzIvVBygiQRd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySoa9MwZHrkHkuDaN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
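The raw response is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the table above. A minimal validation sketch is below; the field names come from the response itself, but the allowed value sets are assumptions inferred from the examples on this page and should be replaced with the project's actual codebook.

```python
import json

# Allowed values per dimension -- ASSUMED from the sample output on this
# page, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "mixed", "approval", "outrage", "indifference", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against ALLOWED."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Example with a single (hypothetical) record:
sample = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]'
print(len(parse_codes(sample)))  # 1
```

Validating against a closed value set like this catches the most common failure mode of structured LLM output: a response that is valid JSON but drifts outside the coding schema.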