Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We're a way off from any serious existential threat. The thing we should be afraid of hasn't been invented yet, because it's not what we have right now. It doesn’t do anything unless prompted, it can't create genuinely new ideas because it's predicting patterns based on already existing human data (it can create fake names for stuff and hallucinate things that aren't real, but that's actually a problem not a benefit), and frankly, it's still quite dumb. It's an idiot that memorised all human knowledge.
It's entirely possible that the thing people are all afraid of might never truly exist. But if it does, what we've got right now will be the trial run. And if I can make a suggestion, I would like to propose that we don't give any form of AGI (a term invented because what we have called AI right now isn't actually intelligent) access to weapons or government. It's supposed to be a tool right? Let's treat it as a tool.
Source: youtube · Posted: 2025-07-17T00:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxPw4SLeQ8uQQSGael4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxDegTNtd3n-oFiyWd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRT5WN5kyqriXYrRh4AaABAg","responsibility":"society","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgytEJnkPTe3P61KF2N4AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy6aC8Kjm8An7HgODB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxxZdsmADlctTOgcQB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxj5vtncJAw4sKcw7F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAAbCkSjiULG5te0l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLIJ9x455C8Jw3-3N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyOnTsf89tq4lEGV3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
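A raw response like the one above can be parsed into a per-comment lookup keyed by comment ID. The sketch below is a minimal illustration, not the tool's actual code: the `index_codes` helper, the `DIMENSIONS` tuple, and the two-entry sample payload are all assumptions made for the example, and a real pipeline would also need to handle malformed model output.

```python
import json

# Hypothetical sample payload, truncated to two entries for illustration;
# the real raw response contains one object per coded comment.
raw = '''[
  {"id": "ytc_UgxPw4SLeQ8uQQSGael4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxDegTNtd3n-oFiyWd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID.

    Missing dimensions fall back to "unclear", mirroring the table's default.
    """
    records = json.loads(payload)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw)
print(codes["ytc_UgxDegTNtd3n-oFiyWd4AaABAg"]["emotion"])  # resignation
```

With the index built, looking up any coded comment by its ID is a single dictionary access, which is all the "inspect the exact model output" view needs.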