Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "If lil ai toasters or appliances exist I would love them so much!! Seems so cute…" (ytc_UgyBo2zjN…)
- "Even China would want BJP to win. It's in their best interest as with BJP they j…" (rdc_ky6qyrn)
- "As far as treatment goes, if the robot could feel pain then we should treat it w…" (ytc_UgihTDu9T…)
- "what's funny, is I try to just do a normal RP. (Bc no one I know likes to RP) Bu…" (ytc_Ugzdm1YxE…)
- "The human being is a parasite to the planet and to other species. So, what's the…" (ytc_Ugw7b0PBb…)
- "Regarding white-collar- work: There is no proof that you won't need humans to ru…" (ytc_Ugwtidvgw…)
- "17:00 though does have me wonder about work for hire.. as a work for hire is bas…" (ytc_UgzkS6NKa…)
- "This guy is smart. I love he brings up the sistemic problems of capitalism and …" (ytc_Ugz-3S7rE…)
Comment
So, what I'm getting from this (please correct me if I'm wrong) is that AI will get better, but it will be more along the lines of the farming analogy you used.
I do not believe AI will ever reach a point where one or more prompts are equivalent to something like a DAW or Google, etc., because if everyone could create their own just by describing what they want, these companies would cease to exist. This, in turn, would hurt not only the people behind those companies but also any investors and, as a result, the country's economy. (As I understand it. Again, correct me if I'm wrong.)
youtube · 2025-03-12T16:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwkowvxn5FVzp-BAM94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
{"id":"ytc_Ugz91JSy8sftEmz_U5d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-t4Z6fWRrdbkQVjN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgytM6z8C_8K9eKJkCV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyMZ5PzShQk6IlJ-at4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxnPkL23AvAQJYQZt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkdPciGNCy7ngAoNt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhbNSC73F1ftU23GV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxkublFe5_VUWRgNXF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy51mvxBVgKFemp9Dh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
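The raw response above is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the lookup-by-ID step, assuming only this structure (the function name `index_by_id` and the variable names are illustrative, not the app's actual code):

```python
import json

# Two records copied from the raw response above; a real response would be
# the full array returned by the model.
raw_response = """
[
  {"id": "ytc_UgwkdPciGNCy7ngAoNt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyMZ5PzShQk6IlJ-at4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(response_text):
    """Parse a raw LLM response and index the coding records by comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codings = index_by_id(raw_response)
record = codings["ytc_UgyMZ5PzShQk6IlJ-at4AaABAg"]
print(record["responsibility"], record["emotion"])  # company fear
```

In practice a batch response may be missing or duplicate an ID, so a robust version would validate that every requested comment ID appears exactly once before accepting the coding.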