Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below (a programmatic version of the lookup is sketched after the list):
- "what if the monster is human better yet a gaslit ai that was gaslit due to peopl…" (ytc_UgzOLvRR0…)
- "Yup. Now let's talk about the truly autonomous armed drones they are currently …" (rdc_mvd8oyq)
- "ai scrapes artists art to train itself without asking the artist for permission,…" (ytr_Ugwqg-8XC…)
- "If I upload my own lyrics into an AI-generator just to get an idea of how I woul…" (ytc_UgwUVW4eg…)
- "Explanation: I think it’s for a type of show about ai or something? Or movie.…" (ytc_UgybSBj8k…)
- "Its more like the dark side of any individual human being. AI is just a tool. …" (ytc_Ugx6hruD5…)
- "i thik robots should be taxed , like machinary that does things in factory that …" (ytc_Ugjl7MMEY…)
- "If your art sucks then it sucks. It will never be better than ai artists…" (ytr_UgxmFbVCn…)
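The same lookup the widget performs can also be scripted: scan the stored raw responses for a comment ID. A minimal sketch in Python, assuming each raw LLM response was saved as a JSON array of records under a `raw_responses/` directory; the directory layout and the function name here are assumptions, not part of the tool:

```python
import json
from pathlib import Path

def find_coding(comment_id: str, response_dir: str = "raw_responses") -> dict | None:
    """Return the coding record for comment_id, or None if it was never coded.

    Assumes each file in response_dir holds one raw LLM response: a JSON
    array of objects with "id", "responsibility", "reasoning", "policy",
    and "emotion" keys, as in the example shown further down.
    """
    for path in Path(response_dir).glob("*.json"):
        for record in json.loads(path.read_text()):
            if record.get("id") == comment_id:
                return record
    return None

# Example: look up the coding for the comment shown below.
print(find_coding("ytc_Ugx97YQBzVZrZFjymMJ4AaABAg"))
```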
Comment
> I've been a technologist for over 50 years and agree with him that NeuroNets were the way to go with this. In the 80s I worked on a simulator of a non-Von Neumann architecture, that was not a neuronet, which would have been ideal for hosting AI. I did not pursue it.
>
> Since the 90's I was always concerned with the moral implications of AI driven military machinery that could identify and kill specific individuals. Such a device would be easy to make now in 2025 in the form of a drone.
>
> I'm not afraid of AI as it exists today, but rather of how humans will use the technology (hint poorly.) AI on it's own does not have true motivations and more importantly common sense. I'm always surprised when folks treat current AI as though it is an entity. It is not.
youtube · AI Governance · 2025-07-10T18:1…
Coding Result
| Field | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
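For downstream analysis it can help to mirror the coding schema in a typed structure. A minimal sketch; the dataclass and its name are illustrative, while the field names and the example label values are taken from this page:

```python
from dataclasses import dataclass

@dataclass
class Coding:
    """One coded comment, mirroring the result table above."""
    id: str
    responsibility: str  # e.g. "developer", "distributed", "ai_itself", "none"
    reasoning: str       # e.g. "deontological", "consequentialist", "virtue", "unclear"
    policy: str          # e.g. "regulate", "none", "unclear"
    emotion: str         # e.g. "fear", "approval", "outrage", "mixed", ...

example = Coding(
    id="ytc_Ugx97YQBzVZrZFjymMJ4AaABAg",
    responsibility="developer",
    reasoning="deontological",
    policy="regulate",
    emotion="fear",
)
```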
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5EWFvhkSeSj526hB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwUqV2FBdjn3s5sAjV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyaerjl2ScACRdrcxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWfJmnh8a1ukRdN4J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwCi_itcHGrqSHmR9B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx97YQBzVZrZFjymMJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxM7A_NAtDUnUwUpqV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxILpeyjj0KQiLivyV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzWuECOSpZl2RCJJXh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydE9NMfW-pTwbnFW14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
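Because the model answers each batch with one JSON object per comment, a downstream parser can check every record against known label values before accepting it. A hedged sketch: the allowed sets below contain only the values observed in this section, so they are almost certainly incomplete, and a real check would use the full codebook:

```python
import json

# Label vocabularies observed on this page; likely not exhaustive.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "mixed",
                "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response into an id -> record map,
    rejecting any record with a missing or unrecognized label."""
    coded = {}
    for record in json.loads(raw):
        for dimension, allowed in ALLOWED.items():
            if record.get(dimension) not in allowed:
                raise ValueError(
                    f"{record.get('id')}: bad {dimension!r} value "
                    f"{record.get(dimension)!r}"
                )
        coded[record["id"]] = record
    return coded
```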