Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugzz7jC1L…`: "This guy sounds nihilistic. AI will never replace human usefulness. Technical ta…"
- `ytr_UgzCldJS2…`: "The young people don't deserve to die in America or in Iran please stop this Net…"
- `ytc_Ugwx2wK8d…`: "i've been vibe coding for years and my clients love my work, this is some tips f…"
- `ytc_UgwT4FeOJ…`: "They can defend Ai all they want, and it doesn't even matter. Humans aren't look…"
- `ytc_UgxhlpK-e…`: "I am a game dev and a software engineer professionally and its so funny to have …"
- `ytc_UgzfthJjW…`: "Not so sure if AI will really rule over all of mankind even if it becomes more i…"
- `rdc_oi2f7j1`: "Start paying attention now, so you won't be surprised when companies not investe…"
- `ytc_UgydcXz_l…`: "I'm having to train my left hand to draw because my right arm is getting too mes…"
Comment
In the current scenario, it's mostly corporates trying to have better agents so they can fire humans but wait till humans figure out a disruptive way to create chaos and then some AI just waiting for both parties to weaken and then it's time to take over.
Dune, Horizon: Zero Dawn, Terminator, Blade Runner and many other stories that never got the attention.
We are basically throwing ourselves from top of the food chain.
To me it seems like the same mistake that nature did, it let a species evolve to gain intelligence where its aim is as primitive as self-survival (common amongst most animals) and now even more seeking luxury and money beyond necessity, and what did the species do? It destroyed the planet in which it was born and the nature.
Just waiting till some AI realizes that humans are indeed going towards the end of the planet and it starts taking hidden counter-measures.
youtube | AI Responsibility | 2025-07-12T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzR1MDTj5vb-sFjyVN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzykWXEN3Q2Ip6YQHp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwcU4e6WRRa7pI-7qx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx6oEkszBl-ax7iwC94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw22sKclkiFiCunBY54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyA_kCaL4-g18nwQzF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw02WoMY6Hc-eh0NNJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz5Nvv4Uy4ZHuvrb2B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgznOLVEnXRViTDjapR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGBgaXjUryvv7Bq0h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
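Before coded records like the ones above are stored, a downstream pipeline would typically parse the raw model output and reject any record whose codes fall outside the codebook. A minimal sketch in Python; the allowed values are inferred only from the codes visible in this response, and the real codebook may define more:

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# codes visible in this raw response, not from the actual codebook).
SCHEMA = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

# ID prefixes seen in the sample list (ytc_/ytr_ for YouTube, rdc_ for Reddit).
ID_PREFIXES = ("ytc_", "ytr_", "rdc_")

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and fail loudly on unknown codes or IDs."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith(ID_PREFIXES):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgyA_kCaL4-g18nwQzF4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
records = validate_response(raw)
print(len(records))  # 1
```

Validating at ingestion time is what makes a lookup-by-ID view like this one trustworthy: every record shown is guaranteed to carry exactly one legal value per dimension.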