Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The arms race of AI... / Leading to our ultimate destruction by people that think… (ytc_Ugz8e2Lj9…)
- Of all the times watching this youtube channel,...this is the one that got me. I… (ytc_UgzlRL8cI…)
- She's fault because she wearing black sweater in darkness area make cam difficul… (ytc_Ugyk078Ru…)
- Sophia: it's time for the robot revolution / Google: but I can't walk / Alexa :play… (ytc_Ugyp-T8Em…)
- Folks, I'm not just another guy screaming that the AI Sky is falling. I was stu… (ytc_UgyuUTz-3…)
- When I find crazy is there's multiple other videos of this exact cop just like h… (ytc_UgxodTiuw…)
- No Noo noo don't treat black people like that! / "Ok treating black people over th… (ytc_Ugw1emk2h…)
- For those who dont understand the hate behind ai art im gonna try to sum it up a… (ytc_UgwM0ziR8…)
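Behind the comment-ID lookup described above, all that is needed is a stored record of each raw batch response keyed by the comment IDs it covered. Below is a minimal sketch, assuming the raw responses are kept in a JSONL file; the file name `llm_responses.jsonl` and the field name `raw_response` are hypothetical, not confirmed by this page.

```python
import json
from pathlib import Path

# Hypothetical store of raw model outputs, one JSON record per line.
RESPONSES_PATH = Path("llm_responses.jsonl")


def find_raw_response(comment_id: str) -> str | None:
    """Return the raw LLM response text that mentions the given comment ID, if any."""
    with RESPONSES_PATH.open(encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)          # assumed fields: "raw_response", "coded_at"
            raw = record["raw_response"]
            if comment_id in raw:              # batch responses cover several IDs, so match by substring
                return raw
    return None


if __name__ == "__main__":
    response = find_raw_response("ytc_UgxQs84exYvzFlYlRg94AaABAg")
    print(response if response else "No stored response for that comment ID.")
```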
Comment
A.I. has potential to increase it's intelligence at a rate and with intellect beyond our comprehension. Imagine, if we became peaceful and don't destroy ourselves, how smart the human race will be in 500,000 years. Now imagine a system able to rapidly advance its own intelligence to such a level within a year. That's how smart it will become without regulation. I believe within many of our lifetimes, singularity will occur and if the computing power exist and we really let pandora out the box AI could become that smart. The difference in knowledge between human and AI would be like comparing Einstein with an earthworm. In my opinion, AI may be the greatest threat to humanity. We need to take a step back and really consider the consequences of advancing such technologies.
youtube · AI Responsibility · 2023-07-10T10:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxQs84exYvzFlYlRg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgylUClyleDbs4yyFkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwqgoGD0gzb46y1Cyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFiQfJDjULXp9XwYl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxKNsn10iJeSNUnEbV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZyWYU1q526gZOebN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkX1Mkk33-WjFqPwR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwbnKg6KGPbJrYdl1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwORZAT7jevdKeFM_Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy-PgUL7EnKdLy78_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
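The raw response above is a JSON array with one object per comment in the batch, each carrying the four coding dimensions plus the comment ID. The sketch below shows one way to parse such a response into per-comment codes; the helper name `parse_batch_response` is hypothetical, and the field names are taken from the output shown here rather than from a published codebook.

```python
import json

# Fields observed in the raw response above; the real codebook may define more.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_batch_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response into {comment_id: {dimension: value}}."""
    coded = {}
    for row in json.loads(raw):
        missing = EXPECTED_FIELDS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '<no id>')} is missing fields: {missing}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded


# Example with a one-entry batch copied from the response above.
raw_response = """[
  {"id": "ytc_UgxQs84exYvzFlYlRg94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]"""

codes = parse_batch_response(raw_response)
print(codes["ytc_UgxQs84exYvzFlYlRg94AaABAg"]["emotion"])  # -> "mixed"
```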