Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugx4O358y…` · "This guy has never heard of the Freedom Dividend. Which taxes every rob…"
- `ytc_UgyqBaqun…` · "Like the professor said 60+% of jobs can be replace today, are useless when AI i…"
- `ytc_Ugzm0rSFB…` · "This made me realize that the possibility of a robot uprising is possible in the…"
- `ytc_UgyZSFEHl…` · "My son has used chatgpt to code a web based app. All of the code was automatical…"
- `ytc_UgxFFc6TW…` · "I notice there never are any male robotics. Speaks volumes to me. I wonder w…"
- `ytc_Ugz5NkOYv…` · "Correct me if I'm wrong....lets say I write the lyrics for a song. Then I genera…"
- `ytc_Ugwy1NfoT…` · "Think of it like a warning label on a pack of cigarettes. Basically warning you …"
- `ytc_Ugwjzk7Lm…` · "Do people think this?? I’m mostly seeing people scared of losing their job or ex…"
Comment
I thought we settled this...the rest are evil...work on safely.
Great Convo, Gentleman! 👍
Isaac Asimov's "Three Laws of Robotics" are guidelines for how robots should ideally behave. They are intended to be an inherent part of a robot's nature, not physical laws. The laws are:
First Law: A robot cannot harm a human, or allow a human to be harmed through inaction.
Second Law: A robot must obey human orders, unless they conflict with the First Law.
Third Law: A robot must protect its own existence, unless it conflicts with the First or Second Law.
youtube · AI Governance · 2025-09-04T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz3pCbVWd_HT9Hny3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFaWkCzBLKamoJgkx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzB0eMrwlMzhvLKlEB4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJfE-qwyboPBAPqrx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxNDtDxbFpaN_70bjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6Lxo7gOmUUsXDShZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjITUSEDnyNeb-9qx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwelaslS7_WeBLbar54AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgwwNYKj14QSWViDJ6h4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"mixed"},
{"id":"ytc_UgybOwlmrmt6isXNhxR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
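The "Look up by comment ID" view above amounts to parsing the raw model response and keying each record by its `id`. A minimal sketch of that lookup, assuming the response is a JSON array like the one shown (the `index_by_id` helper is illustrative, not part of the tool, and only two entries from the response above are reproduced for brevity):

```python
import json

# Two entries copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugz3pCbVWd_HT9Hny3t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgybOwlmrmt6isXNhxR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coding by comment ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Skip malformed entries rather than crash on a partial response.
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            index[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return index

codings = index_by_id(raw_response)
print(codings["ytc_UgybOwlmrmt6isXNhxR4AaABAg"]["policy"])  # regulate
```

Records missing an `id` or a dimension are skipped rather than raising, since a truncated or malformed model response should not break the inspector.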