Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Good luck to AI in scrum meetings where non programmers tell programmers how lon…
ytc_UgyG3D36i…
AGI in 2026 is a given, these guys don't get it, they are STILL thinking in huma…
ytc_Ugw8GdCb4…
I'm a freelance artist and an author... decided to start going to school for vid…
ytc_UgzGpwf5m…
Who knows l think not cause u don't have the means or money to afford ai
The fi…
ytc_Ugx1wrFAh…
The world needs Jesus! "For God so loved the world, that he gave his only begott…
ytc_UgyxNOJsn…
Imagine hating technology.
AI art is different, not inferior.
Artists will alw…
ytc_UgzNa5Atr…
That is right, except you forgot the Open Source
-------------
We are today, e…
rdc_d3xzwri
Don't think for a second that the WGA is against the use of AI...they just want …
ytc_Ugya0hkpu…
Comment
In 50:17 Pope returns to the same position, his conviction that the ASI will be bound by the training. Even though we know we have seen models which had exceeded their training, like models solving ARC-AGI tests or Sydney learning Farsi. So the question I have now isn't for pope, it's for Liron: Why just 50%?.. I mean, this discussion shows that the software engineers fix themselves into a worldview that will allow themselves to continue business as usual, even though we know better (and in some cases, they also know better). They will continue to accelerate. AI 2027 which assumes they will do just that, predicts the AI will find a way to escape the lab in 2027. It's currently 91% on track. So why 50% by 2040? Why not 91% by 2031? Liron, is the only reason, your wish not to upset the viewers?
youtube
2026-03-25T16:0…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx8CgiHPtR_PcujKzJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyL3gRwSi8GiYrlhU14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxoAKbFXaFNKheaIWZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-OQdbloj83oKvmI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzumermdBNf4qTtoHZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy65NnMH35v_0m6yqZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvVxyjyNLIEeXpHht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyM84xeyznV3P1Wn4h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxPkdUi-FguJl50ZMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3bSc6GOdDKVbDIJd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
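The raw LLM response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed and indexed for the "look up by comment ID" feature (the field names follow the sample above; the shortened IDs here are illustrative only):

```python
import json

# A two-row excerpt in the same shape as the raw response above
# (hypothetical short IDs stand in for the full ytc_... identifiers).
raw = '''[
  {"id": "ytc_AAA", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_BBB", "responsibility": "developer", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"}
]'''

# Parse the batch and build an index keyed by comment ID,
# so any coded comment can be inspected in O(1).
codes = json.loads(raw)
by_id = {row["id"]: row for row in codes}

print(by_id["ytc_BBB"]["emotion"])  # mixed
```

Since the model returns all dimensions for every comment in one array, a single pass like this recovers the full coding result (Responsibility, Reasoning, Policy, Emotion) for any ID.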