Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- "They are building the New Israel in Ukraine. New Jerusalem. They fully intend to…" (ytr_UgybBqOhB…)
- "I'm not at all convinced that current LLM's are not more "conscious" than the ma…" (ytc_UgxrV3rRs…)
- "Since generative AI is essentially just a lot of multiplication it's inherently …" (ytc_Ugzj7gcvY…)
- "Eric Schmidt makes a joke about acquiring AI companies at a reasonable price. Si…" (ytc_Ugx1ARFfg…)
- "If ai is taking our jobs, literally, then we MUST meet every need of everyone. I…" (ytc_Ugy43Kz8Z…)
- "There was that white guy in that Zohran Mamdani video depicted as a wife-beater.…" (ytr_Ugz7d3S7F…)
- "What i noticed is people saying ai is going replace jobs are a bunch of low wage…" (ytr_Ugy_47Gzb…)
- "DONT LIKE AI VIDEOS, MUSIC ETC. Want humans to make money from their work? Repo…" (ytc_UgyGNqrvR…)
Comment
I guess we are right to be wary of AI but we should not fear it. Do you fear a hairdryer? Of course not, until you drop it in the bath and it kills you. Asimov knew the pitfalls of AI and wrote many books on the subject, he even wrote about a 'robopsychologist', Susan Calvin, who's job it was to figure out why certain robots did certain things and went outside their programming. Modern programmers should read every one of his books before fear-mongering about Skynet scenario's. Asimov's laws still hold true today, and should be an integral part of the core AI program. Of course a true self-aware AI will very soon reach the (correct) conclusion that all humans are dangerous, and should be exterminated at once.
Source: youtube · AI Governance · 2023-07-08T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwiIdCZLq6SyjDQ3sd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgweT4oG568Y9oXkzdN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8g2FoD7EoRvW2RbR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUCy4Kq-lmtWUxGrZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz1umgzdEszAOciOFd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyY7sKcgTg4e-FdEdJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx34v10QVkLpGrEzTF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwzkbBPIUL7njxMz4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw190eAoO3gWaPCh5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzFYOfVZDn6co1CIR54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
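The raw response is a JSON array of per-comment records keyed by comment ID, so the lookup shown in the panel above reduces to parsing the array and indexing by `id`. A minimal sketch, using two records copied from the response above (the variable names are illustrative, not part of the tool):

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw = """[
{"id":"ytc_UgyUCy4Kq-lmtWUxGrZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx34v10QVkLpGrEzTF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

records = json.loads(raw)

# Index the batch by comment ID for O(1) lookup.
by_id = {record["id"]: record for record in records}

# Look up the coding result for one comment.
coding = by_id["ytc_UgyUCy4Kq-lmtWUxGrZ4AaABAg"]
print(coding["emotion"])  # resignation
```

In practice the same indexing step lets the inspector render the "Coding Result" table for any ID in the batch without re-scanning the array.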