Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I've always had a good laugh at the plot of sci-fi movies involving AI... sure the President, Congress, and the Illuminati all decided they hate having too much power and decided to put control of all our military hardware and nuclear arsenals in the hands of a computer program instead. If you actually believe that could happen I've got a bridge on Mars to sell you.
Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy3ufQCbcULJRbDYBx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEzn7RXLWkS8iRhsR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzX0QAjmBqefIbBxYZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwnVuJ9IjD6rsNEDgV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwCFQm4D8oaTxREt1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymTXzw31B9VpoZDNp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyyn5xWQtPcVKzfM8V4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyHd-QAycuhKF-5gll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxkL1GbTm6iWTtAVJd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYe6bU8KvgXgpD9Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
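The coding result table above corresponds to one entry in this raw JSON array (matched by comment ID). As a minimal sketch, assuming only that the model returns a JSON array of objects with the fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), a batch response could be parsed and matched back to a single comment like this; the `lookup` helper is hypothetical and not part of the tool:

```python
import json

# A trimmed copy of the raw model response shown above (two entries).
raw_response = '''
[
  {"id":"ytc_UgymTXzw31B9VpoZDNp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyyn5xWQtPcVKzfM8V4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
'''

def lookup(raw: str, comment_id: str):
    """Parse the model's JSON array and return the coding for one comment ID,
    or None if that ID is absent from the batch."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = lookup(raw_response, "ytc_UgymTXzw31B9VpoZDNp4AaABAg")
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
# user deontological mixed
```

Because the response is plain JSON, malformed model output would surface as a `json.JSONDecodeError` here, which is one reason to store the raw string alongside the parsed coding.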