Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxwPIfQN…: "I write dystopian fiction. I'm about to have a novel 'The Experience' released (…"
- ytc_UgwrYnuWn…: "Honestly AI is tiring, this to me is a sign that AI is still in its early stages…"
- ytc_UgxcBWss1…: "Government oversight of AI... Like government oversight over society where they …"
- ytc_UgxWWG54G…: "How about copyrights fees of ideas ? Having ai in new age have such huge power o…"
- rdc_n00mhqk: "So I'm in a little bit of a different camp. Firmly understand it's all code, no …"
- ytc_UgxLU1huy…: "Took a picture of the calendar with ChatGPT Plus without any comment: "The imag…"
- ytc_Ugz8TGPLr…: "good thing they don't enjoy cartoons (also, the smudge tool commenter had an ai …"
- ytc_UgzmlK9rE…: "Yea im tryin to figure out if I use it I'll be selling my soul. A lot of things …"
Comment
AI should be a machine first and foremost, and it should not imitate human behaviour.
Actually, AI should not happen as we know it, it should more like a Programmed Intelligence, a massive software with millions of actions, variables and conditions working together.
If we take a PI for some combat drone it should be like this (Example)
Action
1) Fire weapons
Condition
1.1) Target is hostile
1.2) Target poses a threat to other drones or Humans
1.3) Firing weapons poses no collateral damage to allies
Variables (If we could call it like that)
1.1.2) Condition can be ignored if target is non direct combat vehicle (Jammer)
1.1.3) Weapon can be fired even if it poses a risk to allies if the no action would cause more serious damage
Source: youtube, "AI Moral Status", 2017-02-23T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgjCkbW8HzWknngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiEPKpkQpLBvXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UggS6u_4h0pTJ3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggR_H-guI1ov3gCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgisRaHAbPZkRHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghDTSkKguh_eXgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugjb307Mr6aT_XgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UggBAqOIJtgnCHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghBJWyJQzHrOHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjYadM9MhFjhngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
```
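Before loading responses like the one above into analysis, it helps to check each record against the codebook, since an LLM can emit out-of-vocabulary labels. A minimal validation sketch, assuming the category sets inferred from the responses shown here (the actual codebook may define more categories):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw = '''[
  {"id": "ytc_UgjCkbW8HzWknngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Allowed values per dimension, inferred from the responses above;
# treat these as assumptions, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def validate(codings):
    """Return (comment_id, dimension, value) tuples for out-of-vocabulary codes."""
    errors = []
    for row in codings:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append((row.get("id"), dim, value))
    return errors

codings = json.loads(raw)
print(validate(codings))  # prints [] when every code is in vocabulary
```

Records that fail validation can then be re-coded or flagged for manual review rather than silently skewing the category counts.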