Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "@GharacDurac. Unlikely. Early 80s assembly was easier like on an intel 8086. But…" (ytr_UgyRNM7S-…)
- "I think this is a bad take. If AI actually takes all of our jobs, some sort of U…" (ytc_UgyPuOmpi…)
- "One major issue with this line of reasoning is that rich people spend a smaller …" (ytc_UgwBqftVD…)
- "AI is like expanding foam. Once you let it out, there is no taking it back. We’r…" (ytc_Ugy_49QCH…)
- "If it's this is the case, then maybe, the main reason that past advanced civili…" (ytc_UgznyaFt-…)
- "My question to the supporters of generative ai “”art”” is: what’s the longevity …" (ytc_Ugx_Gbruf…)
- "Some day soon someone is going to hack into the Waymo system and then we will se…" (ytc_Ugzx3utB6…)
- "I can't see UBI as being implemented as anything but a temporary transition whic…" (ytc_UgxGqVJRv…)
Comment
Having worked on AI in the 1990s (IBM), I would add hardware is as important as software. Attempts to put theory in to practice could fail because the hardware was too slow, owing the software. Guess we saw something similar with computer bandwidth and computation power over the last two decades.
Transformative ideas might have been there in 2000, but the hardware simply did not support the idea or overlying software. Had Covid happened at the turn of the century, we couldn’t have supported digital classrooms, even if, programmers could write the code.
Today, we can do, what we can do, both hardware and software, in tandem, have increased in efficiency at an astonishing rate. Now imagine, success in the development of QM computing with networks driven by a million quad bits.
youtube · AI Governance · 2025-06-17T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwYjZ6x9huB-rhc5kB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgznBYQPjMZnwho7-YR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz73Uw60G17e3IVmE14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzIQJDVSafc2Ggs9PN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzc-XzTkYysdOT4w9V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzG3e3o534v9sjO7V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw4eRshe3Rul-HYavd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyfetc2Z0RWB0SVr1V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzyRkmBI4IjxQaDl8J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxag7w5tMTlrTZVcPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
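The raw response above is a JSON array of coding records keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for per-comment lookup (the `index_by_id` function name is illustrative, not part of the tool; the two sample records are copied from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """[
  {"id":"ytc_UgwYjZ6x9huB-rhc5kB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxag7w5tMTlrTZVcPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the JSON array and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_Ugxag7w5tMTlrTZVcPt4AaABAg"]["emotion"])  # fear
```

A lookup by comment ID then reduces to a single dictionary access, which is how a detail view like the "Coding Result" table above could be populated.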