Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "ok I want from a normal person asking ChatGPT give me a pic or whatever to getin…" (ytc_UgyBjY69G…)
- "I just don't understand the need to develop an AI which 'creates' 'art'. I feel …" (ytc_UgwXjxLTb…)
- "This is the 2nd time apologies have been made with ai if you know you know…" (ytc_Ugwhsa-6p…)
- "I understand why people would consider it silly to copyright an AI generated pho…" (ytc_UgyU-NsAA…)
- "This is great and all but all but it still prevents me from speaking moistly... …" (rdc_fn5mw2m)
- "AI's: Primary Objectives: a. How to survive literally throughout all of futur…" (ytc_Ugw6Fc_-c…)
- "I was recently in an interview for a graphic design position. My first impressi…" (ytc_Ugy8FDNrc…)
- "I mean, Trump recommended drinking bleach to cure CoViD. Is the orange man an AI…" (ytc_Ugyz3KSJg…)
Comment (youtube, video: "AI Harm Incident", posted 2025-07-29T12:4…, ♥ 76)

> Isaac Asimov's "Three Laws of Robotics"
> 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
> 2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
> 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy6Wstd_6Y9SS78h1t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugx15K1cZowNuIyjfiR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzOZ8-di15Nhx3Zkk54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz7bdQaU177bWxdpB14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwlx12ure6Aq6lXXT94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwBl2t2haYv8AEYoct4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxQP1kaz1d8fTVVAal4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwuKG5OyDpCKFQWsxB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw1r6Isf8897AJwM654AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxBC5Qstgo3iB3dg7p4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
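The raw response is a JSON array with one row per comment ID, coded on four dimensions. A minimal sketch of how such output could be parsed and validated, assuming controlled vocabularies inferred from the values shown above (the `VOCAB` sets here are an assumption, not the project's actual codebook):

```python
import json

# Assumed controlled vocabularies, inferred from values visible on this page;
# the real codebook may include additional categories.
VOCAB = {
    "responsibility": {"developer", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "disapproval", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose value for
    every coding dimension falls inside the controlled vocabulary."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in VOCAB.items())
    ]

# Example: one valid row and one row with an out-of-vocabulary value.
sample = json.dumps([
    {"id": "ytc_a", "responsibility": "developer", "reasoning": "mixed",
     "policy": "none", "emotion": "fear"},
    {"id": "ytc_b", "responsibility": "everyone", "reasoning": "mixed",
     "policy": "none", "emotion": "fear"},
])
print([row["id"] for row in validate_codings(sample)])  # ['ytc_a']
```

Filtering rather than raising keeps one malformed row from discarding an otherwise usable batch; rejected IDs can then be re-queued for recoding.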