Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "art is MEANT to be hard. it is MEANT take days, months, or even years of practic…" (ytc_UgxKQZFAB…)
- "It's a sad moment for today, but as IBM already learned. They'll have to rehire …" (ytc_Ugw2OA8s7…)
- "As a writer I feel ya, those books are nothing but a jumble of words tryna sound…" (ytc_UgzMuMcNj…)
- "my first impression while is saw that thumbnail was: wow he is THAT dude.!😮 in g…" (ytc_UgyWGnIee…)
- "If we do not get a handle on the 1% absolute drive to impoverish, de value, and …" (ytc_Ugx8DULjJ…)
- "I really like having a conversation with ChatGPT, as it takes on a fictional per…" (rdc_jhu2m9y)
- "AI (in its current state) should NOT have personhood. TLDR; Humans are the cau…" (ytc_Ugz566HDV…)
- "Thanks. The AI safety expert is very knowledgeable on AI. However he doesn’t un…" (ytc_UgxnURh1G…)
Comment

> Evan Conrad, or any other person who preaches AGI, are just believers. We don't understand our own minds and instead of getting there step by step we try to create tools that look like us, but are vastly different and never will be truly intelligent. AGI is just as far away as 60 years ago where the first ML and RML algorithms start appearing. We just throw more power at it and slightly improved it.
>
> I don't deny that it is a cool tool, but don't make it bigger than it is.

Source: youtube, 2024-11-10T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxxKcCVSDIVn7iGSXp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzJDubiGLE2ddMN-Gl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugys0VDEHgsQ9IplOTh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzMgToZpE1G9z0lAGx4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOqaSFPObUmcnFzMt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy59mC-HSL94FY3Ph14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzvBwCNB57qeibbnTl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgytC9KFBQYDkux0I7p4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgystNKEq5LAmRARgtF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw67Exsh8uGF67uxFt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
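A raw response like the one above can be parsed and indexed by comment ID before it feeds the coding-result view. The sketch below is a minimal illustration, not the tool's actual code: the allowed values per dimension are only those observed in this page, and the full codebook may permit more.

```python
import json

# Dimension values observed in the responses above (assumption: the real
# codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "mixed", "indifference", "fear",
                "resignation", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so a malformed batch fails loudly instead of
    silently polluting the coded dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Look up one coded comment by ID, mirroring the "Coding Result" table.
raw = ('[{"id":"ytc_UgytC9KFBQYDkux0I7p4AaABAg","responsibility":"developer",'
       '"reasoning":"mixed","policy":"unclear","emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgytC9KFBQYDkux0I7p4AaABAg"]["emotion"])  # resignation
```

Validating against a closed set at parse time is what makes a "Coded at" timestamp trustworthy: any hallucinated category from the model is rejected before it is stored.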