Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Anyone who tells you that AI will soon be driven by the emotions of human frailties, has had the non scientific part of their brain influenced by fiction. There is absolutely no findings or anything close to this, and there are no good theory's to suggest it. Intelligence doesn't imply greed, or selfishness, or a need to conquer. If anything these are driven by a lack of intelligence. The lousy qualities we are trying to get over in ourselves are not part of the tools we use. These conclusions are fictional drama for clicks.
Source: youtube · AI Governance · 2024-05-03T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwCWE0em3fJlS-mnsB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzS0sGVdHfd8YU9XKd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwxKkEUNO0Mr6wV6rd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwyYGVQ0DeVW-ESQyt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDDHigGj254IQxoal4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxsI3s5JCX1VkAYD694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwwrQR7apurRcVAtQZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxntNVBxjy8dtLDzaV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwoEZKBswLBnGQpaSt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxlurhnOpdwfNnxt4t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
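The raw response is a JSON array with one object per coded comment, keyed by comment ID and the four coding dimensions. A minimal sketch of parsing such a batch into a per-comment lookup table follows; the `parse_batch` helper and the `OBSERVED` value sets are illustrative assumptions, built only from the values visible in this one sample (the actual codebook may allow more values):

```python
import json

# Dimension values observed in the sample response above — an assumption,
# not the authoritative codebook.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "fear", "mixed", "resignation"},
}

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response into {comment_id: codes},
    warning on any dimension value outside the observed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if value not in OBSERVED.get(dim, set()):
                print(f"warning: {cid}: unexpected {dim}={value!r}")
        coded[cid] = row
    return coded
```

For example, `parse_batch(raw)["ytc_UgwCWE0em3fJlS-mnsB4AaABAg"]["emotion"]` would return `"approval"` for the response above; malformed JSON raises `json.JSONDecodeError`, so a failed model output surfaces immediately rather than coding comments silently.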