Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- yeah but if you poison the well enough that growth goes into the direction that … (ytr_UgzjfXXKL…)
- My husband's workplace replaced all of their devs and PMs with AI. They were ha… (ytc_UgyMq2fS7…)
- Isn't US doing the same thing that China is being accused of - or people are bei… (ytc_UgweOmR95…)
- We'll here in Los Angeles, CA. That technology has already been implemented. Thi… (ytc_UgxPsPqem…)
- Yesterday I watched a news report from Australia where the Australian government… (ytc_Ugw59QPzI…)
- As a camera technician relying solely on cameras at this day and age is simply m… (ytc_Ugymxs7I-…)
- I think this is where the "AI is going to take our programming jobs" argument fa… (ytc_UgyLVqCn0…)
- AI is not the culprit here its the person behind it with a personal agenda! Besi… (ytc_UgzCOal3w…)
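If you need the same lookup-by-ID outside this inspector, a minimal sketch is shown below. It assumes coded batches are stored one JSON object per line with the batch's comment IDs and the raw response attached; the file name and field names are hypothetical placeholders, not the pipeline's actual schema.

```python
import json

def find_raw_response(comment_id: str, path: str = "raw_responses.jsonl") -> str | None:
    """Return the stored raw LLM response covering a given comment ID.

    Assumes one record per line, each with a "comment_ids" list (the batch
    the comment was coded in) and a "raw_response" string; both field names
    are placeholders for whatever the real store uses.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if comment_id in record.get("comment_ids", []):
                return record["raw_response"]
    return None
```

A call such as `find_raw_response("ytc_UgzCOal3w…")` would then return the raw text for that comment's batch, i.e. the kind of output shown under Raw LLM Response below.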
Comment
The real danger with Artificial Intelligence is the possibility that we (humanity) end up creating something that we cannot control. There is a difference between “smart” A.I. and “dumb” A.I. The latter can only behave within a preset of specific functions and does not have the capacity to deviate from those functions while the former is essentially a digital version of a human mind; capable of developing it’s own thoughts and making its own decisions, regardless of what its creators attempt to do to stop it. So long as we are only creating “dumb” A.I., we will be fine. However the second we try to create a digital recreation of the human consciousness(one that has intellectual capabilities and mathematical foresight well beyond our own understanding), we will have effective started the clock of human extinction and there will be no need of nuclear weapons to achieve it.
youtube · AI Governance · 2023-04-18T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
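A coding result like the one above can be carried in a small record type. The sketch below is purely illustrative (the class and field names are assumptions), with "unclear" as the default used when no label could be recovered for a dimension.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CodingResult:
    """One coded comment across the four dimensions shown above."""
    responsibility: str = "unclear"
    reasoning: str = "unclear"
    policy: str = "unclear"
    emotion: str = "unclear"
    coded_at: datetime = field(default_factory=datetime.utcnow)
```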
Raw LLM Response
[{"id":"ytc_Ugx74Br9oydJd3ps6XZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvY1hfnjeLm3r_HD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyDlZ-cCnS2naov5mt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz4GPMFlbo3Fc8AfF54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySZhUpf8EwLIgoFhF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWWUrULDl-FwpXK-p4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwoplpX5w0C9GdTDuB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwAd6pLoxsRbfoAEEl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwNur3NILoLnNetvup4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwJsrmRtJMBzwN0Xut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"})