Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI is literally the wolf from the boy who cried wolf. Everybody memeing about th…" (ytc_Ugwo1P8ki…)
- "Corporate greed, and the wealthy agreed us destroying the world. If AI is doing …" (rdc_jkezmnr)
- "I have known since I heard about robotics and AI that working class people will …" (ytc_UgwqBcgbL…)
- "I dont understand why this is the future, accepted or liked by people scrap AI …" (ytc_UgzzMpuCx…)
- "Yes, they will keep the keys to the treatments. There's a lot of people who can'…" (rdc_jw6yp18)
- "Stop this madness. Families need job. Stop big industry and AI from taking over.…" (ytc_UgybNH1_3…)
- "The same videos Russia was making and showing on tv to mobilise russians to go t…" (ytc_Ugw3uGBHM…)
- "Some Jokers call them self AI artist, to me that is an insult to all artist with…" (ytc_Ugz9e9lPO…)
Comment

Fascinating discussion but for me, ultimately a frustrating one. I feel like the questions should be “what prevents a sufficiently advanced AI (i.e. one that is smarter than humans across the board) from wiping out humanity”, “what, if anything, could be done to prevent it?” and “how long have we got?”. Based on 40 years of working with computers I would guess the answers are “nothing” “likely not very much” and “maybe sooner than we’d think”but I’d love to hear Professor Wolfram’s answers to these questions

Source: youtube · Topic: AI Governance · Posted: 2024-12-05T11:1… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhetJyaa7zaqwDeDN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugyk-OKW3TGozQB_eBp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzX7RB92cDYXHhwK_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKpMq6JhlIFF28Tex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2JBNpRY8BevvEIeN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyrO0-H4ZqH4OnPqH94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxzMsmvWvck0BqRPNx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyeUBmE_5VG4ux2gSp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz2FpwtTbLpyQH34Kl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxIY763xz6KMKWDgDl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
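A raw response like the one above is a JSON array with one coding object per comment, keyed by comment ID. A minimal Python sketch of how such a response can be parsed and a coding looked up by ID (the two entries are copied from the response above; the variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment,
# each carrying the four coded dimensions plus the comment ID
# ("ytc_…" for YouTube comments, "rdc_…" for Reddit comments).
raw_response = """[
  {"id": "ytc_UgwKpMq6JhlIFF28Tex4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyrO0-H4ZqH4OnPqH94AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

# Index the codings by comment ID so any coded comment can be
# inspected directly, as in the viewer above.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwKpMq6JhlIFF28Tex4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

The same index supports batch checks, e.g. filtering all comments coded with `emotion == "fear"` before manual review.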