Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
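The ID lookup can be sketched as a simple prefix match, since the IDs shown in the interface are truncated. This is an illustrative sketch only; the function and data names are assumptions, not the app's actual API.

```python
# Illustrative sketch: resolve a truncated display ID (e.g. "ytc_UgyukIFiE…")
# to its full comment ID(s) by prefix match. Names here are hypothetical.
def lookup_by_id(display_id: str, comments: dict[str, str]) -> list[str]:
    """Return all full IDs whose prefix matches the (possibly truncated) display ID."""
    prefix = display_id.rstrip("…")  # drop the truncation ellipsis, if present
    return [cid for cid in comments if cid.startswith(prefix)]
```

A truncated prefix may match more than one full ID, so the sketch returns a list rather than assuming uniqueness.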
Random samples
- "Reality is out delusion is in. Avoid academia that has political or corporate in…" (ytc_UgyukIFiE…)
- "Seems like there will be a growth in Hackers, if more developers are replaced by…" (ytc_UgwVwgb19…)
- "automation and AI 'taking peoples jobs' is not a bad thing, and is in fact a ste…" (ytc_Ugzt_ePgu…)
- "Stopping support for cloud computing is a proposed, though drastic, approach to …" (ytc_UgzXYqE5p…)
- "@chezburgerE ok then.....if i have to start all the way back at factory workers…" (ytr_UgyRO4ANd…)
- "Its not a bad thing if youre willing to spend it. Im not / And 2 weeks for just o…" (ytr_UgyhotVE1…)
- "@group555_ But it's still not really expressing themselves if it's a machine doi…" (ytr_Ugzv-jdi4…)
- "> but investors have not been doing enough / Understatement of all time. / >…" (rdc_et8b2b5)
Comment
Is that true though? The major considerations regarding AI aren't all that technical. The major questions are this kind of nature: If we create something far smarter than us, can we control it? Can we avoid doomsday? Should we build a machine god anyway? It would be a higher life form after all, maybe that is the purpose of humanity? If there is large uncertainty over whether it will remain benign, do we take the risk? If we can control it, and all jobs are automated, how can we ensure society doesn't collapse? Will the ones in charge of AI even want to avoid it? That seems dubious, they are investing heavily into doomsday bunkers... Will they have any incentive to provide any goods for the rest of us? Current AI companies are already clearly not aligned. These are more political and philosophical in nature.
youtube · AI Jobs · 2025-06-13T08:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgzSu1HcZk6p67hBdb54AaABAg.AJIdOFUna9lAJIq0GuuF5J","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwroUIfAmKaOwiHMtV4AaABAg.AJIXX91ziLyAJJPggqu_Qn","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgycFpCROyoKVhBC9E14AaABAg.AJIRoZ6hE4oAJJOewo0AaC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgycFpCROyoKVhBC9E14AaABAg.AJIRoZ6hE4oAJJlHAUJS0I","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgycFpCROyoKVhBC9E14AaABAg.AJIRoZ6hE4oAJJvSHdzojG","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgzU65YUZGLOWMPEyO94AaABAg.AJIR2iJMVofAJIxdbFzabe","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugyn5ps2Uq02zycHfOl4AaABAg.AJHy6Aax32AJIryefAGlA","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugy4Ow0AGzWAHF7Jr_d4AaABAg.AJHfLfZBe_FAJJOclqdcRz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugw13-szwsZNSx992H54AaABAg.AJHW95ClHJFAJIGllL4rGF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw13-szwsZNSx992H54AaABAg.AJHW95ClHJFAJIrp1J8wWc","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"outrage"}
]
```
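A raw response like the one above can be checked before the codes are stored: parse the JSON and verify that every record carries an ID and a recognized value on each of the four dimensions. A minimal sketch follows; the allowed values are only those observed in this sample, so the real codebook may define more, and the function name is an assumption.

```python
import json

# Allowed codes per dimension, inferred from the sample response above only;
# the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"none", "unclear", "government", "ai_itself",
                       "distributed", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "mixed", "fear", "outrage"},
}

def validate_response(raw: str) -> list[str]:
    """Return a list of problems found; an empty list means every record passed."""
    problems = []
    for i, rec in enumerate(json.loads(raw)):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append(f"record {i}: bad {dim}={rec.get(dim)!r}")
    return problems
```

Running this over the sample response returns an empty list, since every record uses values from the sets above; a record with an unknown code or a missing dimension would be flagged instead of silently ingested.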