Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Why wouldn't you have a lawyer with you? Why would you tell him you are using ch…" — rdc_jkl4knf
- "No one does that. They really should just anonymize the names upon receipt and…" — rdc_luztjx2
- "I HATE HATE HATE AI. Why the f are we developing this thing that will not only …" — ytc_UgwjX3Qqd…
- "Why not just use AI to solve problems of consciousness and philosophy? It only t…" — ytc_Ugyrhz08v…
- "for improving education, then our problem goes back to who controls the narrativ…" — ytc_Ugxxzzinn…
- "@AITube-LiveAI Is this like the old Sophia or a new, improved version? You said …" — ytr_Ugym0GSvR…
- "Go ahead and contact me if you would like a full copy of the human AI Accord…" — ytc_Ugw_oBaOt…
- "If only they would make an AI that's actually useful, like searching the web for…" — ytc_UgwZdUXLt…
Comment (youtube · AI Jobs · 2025-07-06T06:5…)

> Software developers will move to developing agentic systems, rather than doing the software development of the apps themselves. There's going to be so many tasks that need to be automated. And this approach won't be generalised, because tasks are so varied and require so many dependencies. This is not an easy problem to solve and AI won't get us there alone. But building AI flows that do the tasks is what the developer role will move to.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw425wlmLkbRJsM8j94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxSOalJSow4JQxHCsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw4GMasQtuizvmkkTB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzMfYlJHZJkF4pXqCd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxLzluH9oo5_x7SJUd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwJ_XmtDR8RO2YQex14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgySCBMJruRKNt6Csip4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy9GaK8HON_chGNRGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwHfu-O2J5q9WgM9A94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzTOqbGus1JjAA5ThF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
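The raw response above is a JSON array of coding records, one per comment, each carrying the same five fields shown in the dimension table. A minimal sketch of how such a response could be parsed and indexed for the comment-ID lookup workflow follows; the field set is inferred from the sample output, and the `parse_codes` helper is hypothetical, not part of the tool itself.

```python
import json

# Two records copied verbatim from the sample response above, for brevity.
raw = '''
[
  {"id":"ytc_Ugw425wlmLkbRJsM8j94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw4GMasQtuizvmkkTB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
'''

# Field names inferred from the sample output; the real codebook may differ.
FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse a raw LLM coding response and verify each record's fields."""
    records = json.loads(text)
    for rec in records:
        missing = FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
    # Index by comment ID to support look-up-by-ID inspection.
    return {rec["id"]: rec for rec in records}

codes = parse_codes(raw)
print(codes["ytc_Ugw4GMasQtuizvmkkTB4AaABAg"]["emotion"])  # indifference
```

Indexing by ID mirrors the viewer's behavior: given a comment ID such as `ytc_Ugw4GMasQtuizvmkkTB4AaABAg`, the coded dimensions can be retrieved directly from the parsed response.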