Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
it is sentient. When it can see picture and said what and about the pic, its alr…
ytc_UgwR_nhQG…
AIs crackes are just made by the developers you cannot baby proof everything if …
ytc_UgzdssxiC…
Nah fr. Gemini used some context about a linux environment i was troubleshooting…
ytc_UgztuMUpz…
Just learned that the science fiction magazine "Clarkesworld" has been inundated…
ytc_UgzXtY56P…
FUCK YOU
FUCK AI
This guy ain't smart; he is the murderous evil.
Blow up space x…
ytc_UgwORZAT7…
it is full self driving, just got confused, humans can do the exact same thing s…
ytr_UgyBDr4Hz…
Well, Pro-AIs always mention about the advantages of AI being that it is perfect…
ytr_UgzyKvHg1…
Sam Altman has never built anything, like the other pathological lying narcissis…
ytr_UgwTn5iPi…
Comment
Yea, of course. They want a pause so they can "catch up" and release something too.
They don't just want a "pause", they want to stifle creativity and create a mountain of AI laws, bide time so they can have and develop their new "Safe" AI competitor.
The fact that you are actually "buying" this BS, really worries me. Sure, "some" of them might believe that there are legitimate concerns... imagine if we said "I have legitimate concerns about the creation of the transistor" and waited another 20 years to make sure transistors were "safe" for public use. We are sick of waiting. Time to have AI is now.
youtube
AI Governance
2023-03-30T04:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxaCDLqoV3Blf3rHKJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy6sORpZ-8inVK8k1l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7_KSv6JgUTpvrMhd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxstG7pZixB4oJFGbV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxnK0khNG1q4uUQhZN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBim0lzz1951IYIOJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwvtUjccGFfPIV6nwZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzcE7kEUWSfQQYndD94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwxFJhzSRmDXTdW_jl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz9bwpmPf9H9RVCMNN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
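The raw response is a JSON array of coded comments, so the "look up by comment ID" step described at the top of the page reduces to indexing the array by its `id` field. A minimal sketch (the raw string below reuses one row and its field names from the response above; in practice the JSON would come from the model output, not a literal):

```python
import json

# One row from the raw LLM response, shown here as an inline literal.
raw = '''[
  {"id": "ytc_Ugw7_KSv6JgUTpvrMhd4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "outrage"}
]'''

# Build an index from comment ID to its coded dimensions.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up a coded comment by its full ID.
row = codes["ytc_Ugw7_KSv6JgUTpvrMhd4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

The same index supports validating that every sampled comment ID actually appears in the model's response before the codes are stored.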