Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Not programmers, AI cant make a complex code, its like if humans are creating li…" (ytr_UgwC9dbhd…)
- "The palantiri weren't inherently evil as they are innate. Didn't Aragorn and pip…" (rdc_ji4e9jj)
- "Humans are better because we created ai and the power goes out and so does ai…" (ytc_UgzoQbnO6…)
- "OpenAI also has it's policies. But when you know how, you can still extract answ…" (ytc_Ugymuaw00…)
- "Oh god here we go again. Huge eye roll Mark. Tell your video editors to stop ste…" (ytc_UgysBlszr…)
- "Time to found an AI church! Hail the holy circuits! \"In the beginning, there was…" (ytc_UgyBKNMh4…)
- "LOL the Computer Chronicles had an episode almost 40 years ago about AI with the…" (ytc_UgyOOymSl…)
- "AI has no intelligence of its own. Wanting to control the world requires hav…" (ytr_UgyJdo0QM…)
Comment
I am pretty sure the last path will happen. A.I. already has many uses, and it is being pushed hard by companies. Outside of niche cases, it is insanely difficult to automate an entire function. That being said, the current economics of A.I. is not sustainable, and they are not charging nearly enough for it based on the cost.
youtube
AI Jobs
2026-02-05T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyKAZjAGQp-cR0Lhh54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxp5bFyMMwNqfQGJUN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxY9DJ6aj2ISYXWOK14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwba-9Llf9M__ATMvx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw_USF-ALM2DKxaP3N4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxDnm_4k0JvyYDbVMt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxvLsw734-8M9o2Wxd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwHfEROaVr0-TGq1MF4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyGluGoYwHEfpviGBl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwqWVCuORJf0SQvrr54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
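The raw response is a JSON array of coding records, one per comment, each keyed by its comment ID. A minimal sketch of the "look up by comment ID" step (the record fields and IDs are taken from the response shown; the parsing code itself is illustrative, not the tool's actual implementation):

```python
import json

# Two records copied from the raw LLM response above; a real run would
# load the full array returned by the model.
raw_response = """
[
  {"id": "ytc_UgyKAZjAGQp-cR0Lhh54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxvLsw734-8M9o2Wxd4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the records by comment ID so any coded comment can be fetched in O(1).
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codings["ytc_UgyKAZjAGQp-cR0Lhh54AaABAg"]
print(record["responsibility"], record["emotion"])  # company indifference
```

The same dictionary would back both the ID-lookup box and the random-sample inspector: sampling just draws keys at random from `codings` instead of taking a specific ID.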