Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwCsVYvK… — "The recent signing of the AI actress should be warning to everyone that AI and r…"
- ytc_UgwqoTISK… — "Terrible and terrifying. I hate AI. No one asked for it and no one wants it exc…"
- ytc_Ugyioy-WQ… — "So now we have an Ai advertising an ad on youtube . But still the cure of cancer…"
- ytc_UgwboEB1Q… — "So here is a question I haven't seen anybody ask, When all the work has been aut…"
- ytc_UgwPRt-tL… — "AI knows about fasting and natural food = 99% of all doctors USELESS the rest th…"
- rdc_mz11i09 — "If there is more of opinion A than opinion B it will repeat A more often. Or, i…"
- ytr_Ugzmg-8wN… — "wow, can I quote you on that please ? I want to show how incapable humans were b…"
- ytc_UgyGjxXpk… — "everyone you have on this show talking about AI has a pdoom score of 90. I would…"
Comment
This is childish bs. AI isn't human, doesn't think like humans, doesn't have emotions, intuition, physical sense of touch, cannot feel pleasure, disappointment, anger; AI will become super intelligent and will see humans the way humans see animals except AI will not experience fuzzy cute feelings about us, it will see is as unnecessary and even harmful given our history.
So take all the nice wonderful ideas about what AI will do for humans and flush them.
AI to humans will be as Israel is to Palestinians, only much worse.
youtube · AI Governance · 2024-02-16T23:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzXNVDFcUAsu0IC5Ah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxT3F1DRjpzMNCg9nJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxtGXPL5qOTbKzhesZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpKR_DITG-pg0EUU14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaFChjOx-zFkmWzNh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxan297aCmYJ--6jXp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-bY5_4lD_Sgua4p14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPErZbKQaTt5Gmn3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwmX9ujS4XSKX0zOiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw1RjIX7h-gE4jTMWN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
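The raw response is a JSON array with one object per comment ID, carrying the four coded dimensions. As a minimal sketch of how such a batch can be parsed, indexed by comment ID, and sanity-checked (the allowed value sets below are inferred only from the codes visible on this page, not from the full codebook, which may define more):

```python
import json

# A few entries from the raw LLM response shown above.
raw = '''
[
 {"id":"ytc_UgzXNVDFcUAsu0IC5Ah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxT3F1DRjpzMNCg9nJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugxan297aCmYJ--6jXp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
'''

# Allowed values per dimension, inferred from this page (assumption:
# the real codebook likely lists additional categories).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "fear", "approval", "resignation"},
}

def validate(codings):
    """Return (comment_id, dimension, bad_value) tuples for out-of-schema codes."""
    errors = []
    for row in codings:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                errors.append((row.get("id"), dim, row.get(dim)))
    return errors

codings = json.loads(raw)
by_id = {row["id"]: row for row in codings}  # supports "look up by comment ID"

print(validate(codings))                                        # [] if all codes valid
print(by_id["ytc_Ugxan297aCmYJ--6jXp4AaABAg"]["reasoning"])     # deontological
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one pass builds the dict, after which each inspection is a constant-time lookup.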