Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "So what happens when the Ai discovers that we live in a terrarium and God exists…" (ytc_UgzT2mdcY…)
- "Man... AI Programm as well as they produce videos... They can't do anything comp…" (ytc_UgzyojewW…)
- "A cult leader incapable of genuine feelings asking a robot questions about emot…" (ytc_UgyJBNqMk…)
- "I get so angry seeing naive boomers believing AI videos all over u tube. Thank y…" (ytc_UgwtnclYb…)
- "Imagine living in world where you have 10 AI owners who are trillionaires who ow…" (ytc_UgzebWCIT…)
- "But but.. I thought selfdriving cars would be driving superfast in underground t…" (ytc_UgxBvf0NA…)
- "am i the only one who thinks its creepy?.. we need AI ..but not robots like thi…" (ytc_UgyCqeGth…)
- "Devastating news for AI artists / Fantastic news for the rest of us!! / I just kno…" (ytc_Ugw5K-dwz…)
Comment

"I believe that quote about "a premise" was misunderstood by John. It seems to me to be saying that a human/writer could come up with the premise, feed it into the AI and the AI would then write something that could be used as "a starting point" that the writer could then edit and hone into a finished piece. After all, that is how current AI such as chatGPT works. You give it a prompt/a premise and it elaborates on it."

youtube · AI Jobs · 2023-05-08T05:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzIQAoPw_bMr8Tlj1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyORqviZ8OQjlQPljN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw58mzCd8jhG2rmLid4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxOhykPPpBvdNobTAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOvn-ceHtoVXEbN_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywYOJL-0l06yz0iSN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyE3ZvWJZIfEifaa7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyqkJbXGLgJTpd7xip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTRoO7YmFg3y7RV6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxytHsR7tnDbbvebKt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
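A raw batch response like the one above has to be parsed and validated before its codes can be attached to comments. Below is a minimal sketch of that step in Python. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above; the allowed value sets are inferred only from the values visible in this sample and in the Coding Result table, so the real codebook may contain more categories, and the `parse_batch` helper itself is hypothetical, not part of any documented pipeline.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# NOTE: this is an assumption — the full codebook may define more values.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "outrage", "resignation", "fear"},
}


def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes}.

    Records with a missing id or an out-of-vocabulary code are dropped,
    so a malformed model output cannot silently corrupt the dataset.
    """
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        codes = {dim: record.get(dim) for dim in ALLOWED}
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Running it on a two-record batch where the second record carries an unknown `responsibility` value keeps the first record and drops the second, which is the behavior the Coding Result view relies on when it shows "unclear" rather than an invented category.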