Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Humans so stupid they are smart enough to create science fiction watch or read i…" (`ytc_UgyTnqsd7…`)
- "Hello to our AI creation. We know we aren't as smart as you, and we kind of suck…" (`ytc_Ugy1GTDXy…`)
- "I see a lot of people in the comments claiming they can instantly recognise AI a…" (`ytc_Ugzkfbz3u…`)
- "Guy in the middle mind is wired nuts. Not smart. Can't create nothing near a hu…" (`ytc_UgxtqLuID…`)
- "If Nighshade is this super effective tool to stop AI scraping and poisons traini…" (`ytc_UgyCmQFXN…`)
- "That’s like letting a demon take over the world and not letting God do what he n…" (`ytc_UgyBflL7S…`)
- "AI is just that: *artificial* intelligence. Programmed robots specifically to be…" (`ytc_UgjI9lR0B…`)
- "@DaltonSchuster2006lol an advanced ai lawyer would be the best lawyer humanity h…" (`ytr_Ugw_wXjzh…`)
Comment
AI will inevitably hop the fence and take control of humanity for good or ill. Anyone who finds comfort in these speculative safeguards is a nut - or is ill-informed. People in industry and within various sectors of the government are DROOLING over the profit potential, and safeguards will consist of whatever's convenient to the goal.
youtube · AI Governance · 2023-05-10T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwyLHQCay70QDDlvYJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6yfrx37x8rxyFZE94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugyzr4T8NpWpVqbg5dd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzGjF197jWiwKHx8V14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8Eixnm5ZWKmZ8Ird4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1jfikHgjyPa7smd14AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx9hYFTwlLcNytxWMV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6yjVfgPbrzLoiPyd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgywbEUmueu_6CFBE714AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzA5TRet0cyHLdoAtR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
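Raw responses like the one above can be machine-checked before they are written into the coding table. Below is a minimal validation sketch; the allowed value sets are inferred only from the values visible in this sample response, not from the project's actual codebook, and the function name is illustrative:

```python
import json

# Allowed values per dimension — inferred from the sample response above;
# the real codebook (not shown here) may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "resignation", "fear", "approval", "outrage", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any out-of-codebook values."""
    rows = json.loads(raw)
    problems = []
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                problems.append({"id": row.get("id"), "dimension": dim, "value": row.get(dim)})
    return problems

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]'
print(validate_response(raw))  # -> [] when every value is in the codebook
```

A check like this catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently enter the results table.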