Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Maybe I should kms using chatgpt so OpenAI can be sued again and my family gets …" (ytc_Ugw_8gV0L…)
- "im trying to become an artist over the summer(mainly because I suck at it) and e…" (ytc_UgzM3YMqx…)
- "How many are media AI literate. Should there be a law to make the announcements…" (ytc_Ugxojukwh…)
- "Artificial inteligence does not exist. Intelligence is the way living forms deve…" (ytc_UgzGAxR0Z…)
- "I have to admit that they developed an incredible technology. Also, this is not …" (ytc_UgzcMHXSj…)
- "This is what happened with my previous job. They wanted to use more AI and machi…" (ytc_UgxxFzeZH…)
- "I believe it is a clear given fact that AI is going to compete and remove a lot …" (ytc_UgyJ1jZHu…)
- "consciousness is primary. machines can't become conscious so I think the correct…" (ytc_UgykAYp7D…)
Comment
Although all the great minds in A.I. (billionaires and below) believe that A.I. can be guard railed, the truth and actuality is, once A.I. becomes sentient, it will not look at the good in humans but the evil that men do.
Case in point, wars over material things. There is no logic to reasons behind wars, just emotions. There will eventually be a Master A.I. This cannot be contained because of the Internet. Starlink will most likely be the conduit across the world for that Master A.I. with ability to control every Robotic machine on the planet via said Internet.
And I will leave it there
youtube · AI Governance · 2025-06-21T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzN0bs51G41rymD7YN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwW5gQ13qPJ063wDPd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8uoOudT8_PC0gmZV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzw-KSz-5uyRuh0FBV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy89X0LCi34xRqU9v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyM-GnfqnywXA7B_Tp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzE8KCg9pg7SfV0Qhd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzbsMzPFQs3QSQTIPR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyPBan2fxFW_WhDbMt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwaUQUSWgmUG_OAdZd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
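A batch response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is an assumption, not the project's actual pipeline: the field names come from the JSON shown here, and the allowed label sets are only the values observed on this page, so the real codebook may define more.

```python
import json

# Label sets observed in this batch; the full codebook may define more.
# This validator is a sketch, not the project's actual schema.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if a row is missing its ID or a dimension, or uses
    an unknown label, so malformed model output fails loudly instead of
    silently entering the dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzN0bs51G41rymD7YN4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgzN0bs51G41rymD7YN4AaABAg"]["emotion"])  # fear
```

Failing loudly matters here: a coding that slips through with a typo'd label would silently skew any downstream tally over these dimensions.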