Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Please consider doing a story about the shaky foundation that AI is on because w… (ytc_Ugw_kTm1L…)
- I'm sorry, but I'm unable to generate the requested response as it goes against … (ytr_Ugy-96bff…)
- @h7productions286 living in a 3rd world country doesn't automatically prohibit y… (ytr_UgziTpDi5…)
- I think we end up somewhere between Terminator and Matrix. Ai destroys mankind… (ytc_UgyMvFTR6…)
- Just have AI take over the work force and let us have hobbies. This could absolu… (ytc_UgyRUrUUl…)
- AGI into robots = GG They already have robot bricklayers. Everyone will be une… (ytr_UgxZgwswc…)
- " it prefers men over women and white people over people of color" Based robot… (ytc_UgxJkxPsW…)
- @Justyn219 are you brain dead? the reason she's mad is because it isn't real. it… (ytc_UgxleLhvZ…)
Comment
Look, I am sure professor Korinek is much smarter than me. I am slightly skeptical though, of people who mainly spend their time in academia. I work in one of the largest corporations in America (telecom, take a guess). Although there is some progress, this company runs on legacy hardware and software. Even if we have AGI in 2-5 years, and I think we will, it will take decades for the physical world to catch up. With the exception of industries that don't have a large physical component, like finance, replacing and creating the physical infrastructure required for AGI will take much longer. Additionally, AI needs to be monitored, jobs will change, and we need to change and adapt to it.
| Platform | Video | Posted |
|---|---|---|
| youtube | AI Jobs | 2025-06-13T15:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
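Each coded comment carries the same four dimensions shown above, each drawn from a small closed vocabulary. As a minimal sketch of how a record might be checked downstream, the Python snippet below validates one record; the allowed values are inferred only from the records visible on this page and are an assumption, not the full codebook.

```python
# Hypothetical downstream check of one coding record. The allowed values
# below are inferred from the records visible on this page; they are an
# assumption, not the project's full codebook.
ALLOWED_VALUES = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "contractualist", "deontological"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "resignation", "outrage", "approval", "fear"},
}

def is_valid_coding(record: dict) -> bool:
    """Return True if every coded dimension holds an allowed value."""
    return all(record.get(dim) in values for dim, values in ALLOWED_VALUES.items())

print(is_valid_coding({"responsibility": "none", "reasoning": "unclear",
                       "policy": "none", "emotion": "indifference"}))  # True
```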
Raw LLM Response
[
{"id":"ytc_UgzQ1n5_87I3ZaPIY7d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-d40ep_9vIVjowF14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3peCSCZ0E5SrhaFh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyaQfw3QDQp8ysyji54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJHxUxNeJjz5wTmgd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXY0CJUAi0Qcz2JWl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEDjzLQD2mPhmQk-B4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLsSzn-XQGxzEN7r54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgznJvAVzcjjt9RygEp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzoqGT0nswwOItF62F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
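The batch response is a plain JSON array, so the "Look up by comment ID" action above amounts to parsing the array and indexing it by the `id` field. A minimal Python sketch, with one record reproduced inline so it runs on its own (the variable and function names are illustrative, not part of the pipeline):

```python
import json

# The raw batch response is a JSON array of per-comment coding records,
# like the array shown above; one record is reproduced here so the
# snippet is self-contained.
raw_llm_response = """[
  {"id": "ytc_UgyXY0CJUAi0Qcz2JWl4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]"""

def index_codings(raw_response: str) -> dict:
    """Parse the JSON array and index its coding records by comment ID."""
    return {record["id"]: record for record in json.loads(raw_response)}

codings = index_codings(raw_llm_response)
print(codings["ytc_UgyXY0CJUAi0Qcz2JWl4AaABAg"]["emotion"])  # prints: indifference
```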