Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytr_UgzlKeePX…`: "Because progress. I am in art and design space for more than 20 years, doing bot…"
- `ytc_UgzgOnRC_…`: "Black people start creating A.I. as well. Because A.I. is seeing whtt the media…"
- `ytc_Ugzz_fsaD…`: "When an LLM is confronted with a HAL–class dilemma, then it would probably act l…"
- `ytc_UgxGmmYgd…`: "Sophia simply responds to the statistics of how we feel about Ai and robots stop…"
- `ytc_UgzPNDq3v…`: "Can’t wait to see someone use AI to Ghibli-fy this video. It’s gonna be awesome…"
- `ytc_Ugy-TdbLY…`: "I'm starting to believe that AI might be the antichrist....but I'm definitely no…"
- `ytr_UgxbdwNQq…`: "@AzureWolf He means that from now on you should consider cp a possible casualty …"
- `ytc_UgyHRBBSt…`: "Just saying, but I think most AI bros don't care because some of them don't have…"
Comment
> Having knowledge doesn't automatically make a person smart. Newton, Tesla, and Einstein mostly had the proper knowledge of science and physics and were able to create, inovate, and calculate. That's why they could be considered to be smart or geniuses. Nowadays, we have quite a few famous people who do have a lot of knowledge, but no smarts.
> Because the knowledge that they accept is highly questionable. So, if you use bad data, then your results and conclusions will always be incorrect.
> But heah, if it supports your idiotic narrative, well then, I guess it's ok, right?
> I see us sliding down a slippery slope, heading for a soupy concoction of stupidity.
> We live in a society that shuns the smart people and praises the dumb ones.
> Since AI has no feelings or a conscience, where does its motovations come from to do anything beyond its programming?
youtube · AI Governance · 2025-09-10T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQgX1fy86QB4Rmbyp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwbtsOP-2bk96utyFV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwQn1s1_dAtld4AcRV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyHle1ukcaqzMOkMqh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgztToBUc5VVNHIgNh14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugywy59jwYrGAARp-2V4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwNanLtri5pVUDEBCZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzzLBu5TcIXQ9cm5EV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzpdFFUkIMXZwuQduh4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyJQFF4QyvVcsMl4OB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
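A raw response like the one above has to be parsed and validated before its codes can populate a "Coding Result" table: each record is matched to its comment ID, and any value outside the expected code set is demoted to `unclear` (which is likely how the mismatched table above arose from the batch response). Here is a minimal sketch of that step; the four dimension names come from the output above, but the allowed value sets are an assumption inferred from the values visible in this sample, not a confirmed schema.

```python
import json

# Assumed codebook: dimension names are taken from the raw response above;
# the allowed values are inferred from this sample and may be incomplete.
ALLOWED = {
    "responsibility": {"government", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "resignation", "mixed", "unclear"},
}

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response (JSON array of records) into
    {comment_id: {dimension: value}}, coercing missing or
    out-of-codebook values to 'unclear'."""
    coded = {}
    for rec in json.loads(raw):
        coded[rec["id"]] = {
            dim: rec.get(dim) if rec.get(dim) in allowed else "unclear"
            for dim, allowed in ALLOWED.items()
        }
    return coded

# Usage with a shortened, hypothetical record:
raw = '[{"id":"ytc_x","responsibility":"government","reasoning":"virtue","policy":"ban","emotion":"fear"}]'
print(parse_codes(raw)["ytc_x"]["policy"])  # ban
```

Keying the result by comment ID makes the per-comment lookup shown in the "Coding Result" table a single dictionary access, and the coercion to `unclear` keeps one malformed value from invalidating the whole batch.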