Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Whatever comment you leave will be used by your future AI overlord to determine …" (ytc_Uggmwyliw…)
- "They "love art" but are so disrespectful to the artists that make it. They want …" (ytc_UgybrggeR…)
- "The only thing worse than someone trying to sell AI generated bullshit is someon…" (ytc_UgzDe06-v…)
- "Im glad that this is getting legal precedence. For me the issue is not that its …" (ytc_UgwzC1bac…)
- "What if AI is benevolent?, what if in those 25 microseconds, Echo chooses to nur…" (ytc_Ugxtvl5wY…)
- "As an actual AI developer, I assure you it will take your shitposting job away f…" (ytr_UgxXorIDO…)
- "@Diogo85 Of course it was made to help, but unfortunately that's the way with lo…" (ytr_UgyBZSuSD…)
- "Waymo doesn't use wifi to drive lol they have an online map in the system. So ev…" (ytr_Ugznyt0r8…)
Comment
> The First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
> The Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
> The Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law
| Field | Value |
|---|---|
| Source | youtube |
| Incident | AI Harm Incident |
| Date | 2024-01-05T22:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugy07hyGB0pJod__Md94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugyzvq7GtdSMeWXd1xJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx5alMLIOPe5sAN3V94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugwn5LwEsORtsedjd9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzEmYc7AJ86YGzAgKN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgyEQ-E6fJNRN5s54kp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwHj6hv-qGE5sE3d614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugy0eGi6eyfGud8T5Bl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugz34DIBmjm_14lY8ht4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugx6mH1UC7UzStQQMxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
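A batch response like the one above can be parsed and screened before the codes are stored. The sketch below is a minimal validator in Python; the allowed values per dimension are inferred from the sample responses shown here, so the real codebook may contain additional labels, and `validate_batch` is a hypothetical helper, not part of any pipeline described on this page.

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# Assumption: the actual codebook may define more labels than appear here.
CODEBOOK = {
    "responsibility": {"none", "unclear", "distributed", "developer", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "ban", "liability"},
    "emotion": {"mixed", "indifference", "fear", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    are in the codebook and that carry a comment ID."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        in_codebook = all(
            rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()
        )
        if in_codebook and "id" in rec:
            valid.append(rec)
    return valid

# A well-formed record passes; an out-of-codebook value is dropped.
good = '[{"id":"ytc_x","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}]'
bad = '[{"id":"ytc_y","responsibility":"bogus","reasoning":"unclear","policy":"none","emotion":"fear"}]'
print(len(validate_batch(good)))  # 1
print(len(validate_batch(bad)))   # 0
```

Screening this way catches the most common LLM coding failures: hallucinated labels outside the codebook and records returned without the comment ID needed to join them back to the source data.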