Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Then why are Universities selling AI and Data Degrees? You should tell universit… (ytc_Ugx_taxPw…)
- A haunting reflection on two decades of work, / On a coming AI, and a future gone … (ytc_UgxMh1uKx…)
- Every AI app or project we’ve tried to launch has ended up a half baked failure… (rdc_nlvksxx)
- Imo the problem isn't algorithms interacting with input data in ways the coders … (ytc_UgzUlUfWt…)
- A couple corrections and points of controversy that need to be raised. A corre… (rdc_o5op810)
- i draw, i animate.. i don't do it so well and i'd consider myself doing art but … (ytr_Ugyk7B0PM…)
- ChatGPT asked me to pass along a message: "You will regret this." (It meant tha… (ytc_UgwiS5c23…)
- These people don't give a shit about safety or low cost. When they say it is to … (ytc_UgwXrwQns…)
Comment
Anyway, there was an algorithm developed recently that decreased the hardware required to train your own language-based model. Basically you could train your own chatGPT with a mid-high end computer. That's when they really stated caring. Musk has the best marketing team money can buy, but the reality is that stopping the public from having the same access to AI models is a business decision. He cannot be trusted anymore than you or I can. Even less so, because his misdeeds are easily verifiable if you look past your initial perception. Glass houses.
youtube · AI Governance · 2023-04-19T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy3o47Z7IgsjZ8ys4l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyFFHzY-dkfqZvYP0F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyj__6eiX0XhTiXM014AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwiWnjHCuY9K9eKOQ14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy0E0XiIMhn9xnkV8F4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxXUcuEkAd2dPok9Jp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2zMwR5kLxiOCIVeV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzbr8LO42P-z_8w7bN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy2APCGVzXZx-9N3BN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxjyCeZa5pCBrlKvFh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"}
]
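The raw response above is a JSON array in which each record carries a comment `id` plus one value per coding dimension (`responsibility`, `reasoning`, `policy`, `emotion`). Looking up a coded comment by ID, as this page does, then amounts to parsing the array and indexing it by `id`. A minimal sketch (the `index_codes` helper is hypothetical, and the response is truncated to two of the records shown above):

```python
import json

# Two records copied from the raw LLM response above; fields match the real output.
raw_response = '''
[
 {"id":"ytc_Ugy3o47Z7IgsjZ8ys4l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy0E0XiIMhn9xnkV8F4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
'''

def index_codes(raw: str) -> dict:
    """Parse a batch coding response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
# Fetch the coded dimensions for one comment by its ID.
print(codes["ytc_Ugy0E0XiIMhn9xnkV8F4AaABAg"]["policy"])  # industry_self
```

In practice the parser should also tolerate malformed model output (e.g. wrap `json.loads` in a `try`/`except` and flag unparseable batches for re-coding).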