Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
...as I write this comment and click the like button I'm promoting the use of AI…
ytc_UgyBOZuX5…
If AI is a possible danger, then turn off the computers !!! The REAL danger is n…
ytc_Ugyip9cpn…
I'm doing very OK with ChatGPT. I don't let it decide architecture. I lay out my…
ytc_UgzoXZJ3m…
AI threatens millions of jobs, is unethical, incredibly energy-hungry …
ytc_UgxQRntgt…
Actually AI will slow down with less people in the workforce. Because their data…
ytc_UgwU81YUr…
My concern about AI is the data they are training with. They are deleting giant …
ytc_UgxztNG-V…
@jamesderr1344 You do realize there is such thing now as "teaching" AI, not nece…
ytr_UgzqmvvM4…
My ChatGPT response with respect to Geoffrey: Geoffrey Hinton is one of the fou…
ytc_UgziPsjiQ…
Comment
How long would it take for mini AI agents to be infused into every part of a system's design, tracking users' anticipated features and building and deploying them to match as fast as possible? Also, context window sizes can be increased, right? This is a major limitation that stops it from doing what it can. And a single prompt -> final design is a lame way to propose the use case of AI; if we could actually show the entire repository, the contexts, and the much larger vision of what we aspire to include in what we are building, wouldn't that be a better starting point for expecting better outputs from it?
youtube
AI Jobs
2025-12-20T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
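Each comment is coded along the four dimensions shown above. A minimal validation sketch in Python, assuming the label sets are limited to the values visible in this view (the real codebook may define additional labels):

```python
# Allowed labels per coding dimension. These sets are an assumption
# inferred from the values visible in this view, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user"},
    "reasoning": {"unclear", "consequentialist", "mixed"},
    "policy": {"none"},
    "emotion": {"approval", "outrage", "indifference", "fear",
                "resignation", "mixed"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty if valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"{dim}: unexpected label {value!r}")
    return problems

# The record shown in the Coding Result table above:
print(validate_coding({
    "responsibility": "none",
    "reasoning": "unclear",
    "policy": "none",
    "emotion": "approval",
}))  # -> []
```

A check like this catches the common failure mode of LLM coders inventing labels outside the codebook before the record reaches the results table.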
Raw LLM Response
[
{"id":"ytc_UgxDdhFVYvUf8A4Sxw14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyrFpnat3XXPb39LJB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyT0tWKm9CEVtiH3iF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxVWQxDnI-wnlh8PuZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmjsCMTw9lPRKBaGh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnXYj0A4jNaHb4Q7Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwYA56_A67iAswNNy14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhIslT_8bXlB98iLt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOBzxkFxcMl89X_KB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzuhJUTeSJnmabZ1uV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
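A raw batch response like the array above can be parsed and indexed by comment ID to drive the "look up by comment ID" view. A minimal sketch, assuming the model returns valid JSON (two records copied from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''[
 {"id":"ytc_UgxDdhFVYvUf8A4Sxw14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxVWQxDnI-wnlh8PuZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# Parse the batch and index each coded record by its comment ID.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up one comment's coding by ID, as the view above does.
print(codings["ytc_UgxVWQxDnI-wnlh8PuZ4AaABAg"]["emotion"])  # -> approval
```

In practice the parse should be wrapped in a try/except, since a model response is not guaranteed to be well-formed JSON.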