Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Great video! I love how the AI assists students!! However, I worry that, in the …
ytc_UgwNWVsbT…
Ai is just another tool for artists to use, that enables people to concretise an…
ytc_UgxOLP49l…
🤷 y'know there's a memory function in chatgpt now? I have asked chat gpt to give…
ytc_UgxAQp8WW…
These companies want to replace their human employees with AI, yet the governmen…
ytc_UgzzFZ48q…
there's no worst lie than those mixed with truth.
AI isn't more than an advance…
ytc_UgwHKtB0g…
Mind-boggling -- I didn't realize that YouTube had become this inimical to free …
ytc_Ugzh_Ruds…
Did you not watch the video? AI for code generation is fundamentally flawed. AI …
ytr_Ugwddk0kw…
As a creative myself I feel like what most other creatives & art lovers fail to …
ytc_UgxSx7Twu…
Comment
Isaac Asimov's Three Laws of Robotics:
Back in 1942, before the term was even coined, the science fiction writer Isaac Asimov wrote The Three Laws of Robotics: a moral code to keep our machines in check. The three laws of robotics are:
1. A robot may not injure a human being, or through inaction allow a human being to come to harm.
2. A robot must obey orders given by human beings, except where such orders would conflict with the first law.
3. A robot must protect its own existence as long as such protection does not conflict with the first and the second law.
(from a forum hosted by Britannica)
youtube
AI Governance
2023-03-30T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwmaTKbsho7xfXRlm94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxzFpS_ytoaTy-rhp14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxs6l_cwP1aYITLMDp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzlfmBj3Qt_EK4gE7h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNIZ-d280ARnkhsUp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqN3eWIHvH0ljO4894AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0YQSqTn7YiQPyAkB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyys0bLM5BI-uk7RYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxTGLv5ZCtrv8kldNN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLY6Tm__sG6qMa16t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
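The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of the "look up by comment ID" step, assuming only the field names shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`); the truncated example ID here is illustrative:

```python
import json

# One entry from a raw LLM response in the format shown above.
raw_response = """[
  {"id": "ytc_UgwLY6Tm__sG6qMa16t4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "approval"}
]"""

# Index the codings by comment ID for fast lookup.
codings = {item["id"]: item for item in json.loads(raw_response)}

# Retrieve the coding for a single comment.
coding = codings["ytc_UgwLY6Tm__sG6qMa16t4AaABAg"]
print(coding["emotion"])  # approval
```

If an ID from the table is missing in the response (e.g. a truncated or malformed batch), `codings.get(comment_id)` returns `None` instead of raising, which is the safer choice when joining codings back onto the comment table.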