Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Wrong we have had robot cheap ones for year that can do that go to Japan the pro…
ytr_Ugy48Z2z2…
hmm talking abuot AI is kinda complex. 1st AI are composed from of programing la…
ytc_UghIRXz-y…
There is no need for AI regulation until after we have already blackened the sky…
ytc_Ugw4oTldv…
A gigantic crisis / wave of disinformation is coming, in which we won't know what…
ytc_Ugz3-RbYs…
The Waymo taxis use 3 system which is why they are so successful. Cameras, LiDA…
ytr_UgwoTRtqu…
AI is a scourge upon humanity, I'm not really an artist, I'm primarily a writer,…
ytc_UgxC0Q3I4…
John Henry died at the end of that story. Robots are taking blue collar jobs an…
ytc_Ugzz3cXWp…
the thing about the second one is theres a difference between being inspired and…
ytc_UgxJ6dOJV…
Comment
Actually, after hearing the term "Context Engineering" I felt that AI models really need to be instructed in a technical way in order to keep them within their working limits. I had gone through this as a web developer while working with various models, frequently with deepseek: I had to define things properly, with prompts as long as possible. So yes, Context Engineering requires telling the AI what to do and how to do it. This can lead to the best and most meaningful prompt engineering instead of blind coding.
youtube
AI Jobs
2026-02-14T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwEt5MlUG2jIcZcF4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxaRK0srbK95XbRZ1F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy25rHtKOVjGbu-bON4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgybqwhtwpZtoBHMYvx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAEtpvdCTtBrxOKoV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyqtTDty1rjLgCtAtF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw2yhgZVTXd5hk9QqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxDZXqn5Qy2y8N1FOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzXeWpGbRM4yYOhvxN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxtPraVG3unuy9-0Hd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
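A raw response like the one above can be checked before the codes are stored. The sketch below is a minimal validator, assuming the controlled vocabularies are exactly the values visible on this page (the real codebook may include additional categories), and that comment IDs carry a `ytc_`/`ytr_` prefix as in the samples:

```python
import json

# Vocabularies inferred from the values shown on this page; the actual
# codebook may define more categories than appear here.
VOCAB = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"approval", "indifference", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    for i, row in enumerate(json.loads(raw)):
        # IDs in the samples start with "ytc_" (comments) or "ytr_" (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            errors.append(f"row {i}: bad or missing id")
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                errors.append(f"row {i}: {dim}={row.get(dim)!r} not in vocabulary")
    return errors

sample = ('[{"id":"ytc_Ugy25rHtKOVjGbu-bON4AaABAg","responsibility":"user",'
          '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
print(validate_batch(sample))  # -> []
```

An empty list means every row parsed and every dimension value fell inside the (assumed) vocabulary; anything else pinpoints the offending row and field, so a bad batch can be re-requested rather than silently coded.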