Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
"Training AI should not be breaking copyright" It wouldn't, if the people attem… (ytr_UgynQ0Fd4…)
simple thing, don't give machines conciousness: no feelings nor giving them the … (ytc_UgiuBnZPs…)
I wonder if we'll get to a point where we won't know whether an AI is concious a… (ytc_Ugwnyk3IP…)
artificial intelligence is NOTHING compared to robotization, computerizat… (ytc_UgyUchN6q…)
the idea that any language is less efficient is based on a blanket view of speed… (ytc_Ugy_AVXFo…)
I’m going to guess that the problem wasn’t AI so much as it was voice recognitio… (ytc_UgzAmDK5G…)
I understand your concern about using AI to create artwork and the potential imp… (ytc_UgwZc5ynY…)
Whenever I come across an AI I say transfer me to a human until they do it. It w… (ytc_Ugw3nJHlX…)
Comment
could be much scarier with way less intelligence. AI doesn't need to be self aware to be dangerous; it allows for people to have godlike powers. Machines that can track and kamikaze into people or targets, tracking surveillance (like in China today), or simply being a scientific oracle that can design powerful devices.
Source: youtube · AI Governance · 2023-04-18T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzA7a1lYbXq7Z4VpSB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzA018fJuq3baUoyzJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxISmjRnuwOkwtTRrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyjUs_OEihLbBB32P54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwbjHhc6LEUzpDA-yN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyrNtpt9Kmpzmes6t94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6FiCC_ZEFDcQcsgx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyffRqAY2vVFyRfSQN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0ifdMyXPrP7epTQh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyvldEwWb41-KmNnu54AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
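The raw response above is a JSON array, one object per coded comment, keyed by `id`. Looking up a single comment's coding is then a matter of parsing the array and matching on that ID. A minimal sketch, assuming the JSON shape shown above (the `lookup_by_id` helper is hypothetical, not part of the tool itself):

```python
import json

# A fragment of the raw LLM response shown above (one record kept for brevity).
raw_response = """
[
  {"id": "ytc_Ugy0ifdMyXPrP7epTQh4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def lookup_by_id(raw: str, comment_id: str):
    """Parse the raw LLM response and return the coding for one comment ID.

    Returns the matching record as a dict, or None if the ID is absent.
    """
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

coding = lookup_by_id(raw_response, "ytc_Ugy0ifdMyXPrP7epTQh4AaABAg")
print(coding["responsibility"])  # distributed
```

The record returned here matches the Coding Result table above for the inspected comment (responsibility: distributed, reasoning: consequentialist, policy: regulate, emotion: fear).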