Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @BrendanDell How does it work? Do you think they will not milk the last cent pos… (`ytr_UgwAkPtFO…`)
- Why are all of you guys on that side of the fence? It's like, you know better, b… (`ytc_UgziSe7cP…`)
- This Video is 100% Correct, AI will become Humanities new religion, by 2035 ma… (`ytc_UgygYx54Z…`)
- AI will eventually crack the mystery of “consciousness,” the qualifier defining … (`ytr_UgyUzRC-s…`)
- The irony is that they themselves will be replaced by AI. CEOs and higher ups re… (`ytc_UgxaiISED…`)
- There should be a law that you have to clarify / Self made: % / AI generated: %… (`ytc_Ugw-BH6y6…`)
- LLM is not AI. Chat GPT is a toy. Know everything and understands nothing. A bio… (`ytc_UgwhGIweV…`)
- This isn't an "eventually there will be enough training data" problem like the a… (`ytr_UgwZe-JO4…`)
Comment
> I don't believe the robots the AI needs to do it's bidding are good enough yet. Like most things, I'm probably wrong about that.
> If there is life from other planets who have come to earth and the American government has their vehicles. Would they not have come up with AI before us? And if they did. Why did it not destroy that life form?

Source: youtube | Topic: AI Governance | 2023-07-15T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy-xjTbukayRAiW0ZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy8jGdealpwItCTV5B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynwPTn3uTCtbYJPb14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyoWbeddL8CkL-ec8B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzYm649pHSwzPDhbmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx5vdRDkB5F0aVSQ194AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwPFRNKsARk0AfXIRF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyvJYxPp7wxGjVxeXB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyLDs55DweHGkeT2Nx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgynFKt9h1js7EloNDx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
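The raw response is a JSON array of records keyed by comment ID, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup in Python, using two records copied from the response above (the function name and the strict field check are illustrative assumptions, not this tool's actual implementation):

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugy-xjTbukayRAiW0ZR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyLDs55DweHGkeT2Nx4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# The four coding dimensions plus the ID, as seen in the response.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID.

    Records missing a field (or carrying extras) are skipped, so a
    malformed model output cannot put partial rows into the index.
    """
    index = {}
    for record in json.loads(raw):
        if set(record) == EXPECTED_FIELDS:
            index[record["id"]] = record
    return index

index = index_by_comment_id(RAW_RESPONSE)
print(index["ytc_UgyLDs55DweHGkeT2Nx4AaABAg"]["policy"])  # ban
```

The same index would also support aggregate views, e.g. counting how often each `emotion` value appears across a batch.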