Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples:

- "People are complaining about AI in the comments but this is hardly an AI issue. …" (rdc_oalbyoy)
- "You can give it all the time you want, AI will still not have critical thinking.…" (ytr_Ugx6Nf-jA…)
- "Machine learning and AI is not going to be the same the thought processes will c…" (ytc_UgyjU5Hb9…)
- "@wizardofaus1720 havn't seen Any antisemitism. Ps Zionism is to true Semites wh…" (ytr_UgwhzI948…)
- "I consider myself very creative, where do I apply for a job at OpenAI? xd…" (ytc_Ugxd-ug9l…)
- "Ideally... Step 1.5: institute reverse tax to keep people alive and in their …" (rdc_ncksmrp)
- "@Dan-dy8zp\"Improvements may continue\" doesn't confer intelligence. Having all o…" (ytr_UgwD6mxL7…)
- "I always thought they were talking about people using artificial intelligence to…" (ytc_Ugzo_fu_Q…)
Comment
The whole flaw here is we are treating AI as if it is conventional technology. An FDA like regulatory level makes sense for products that take years to develop and have a clear ownership by big companies. The steps towards human level AI will sometimes be sudden and unexpected. And unlike a new prescription medicine, you have to consider the AI itself should have certain rights. Essentially we are regulating the reproductive rights of what in the future could be considered another species. If one wants to generate conflict between AI and mankind, I can think of no better way to do so.
youtube · AI Governance · 2023-06-27T13:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgylkgUNwv6DUNAiyYB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz3GU2tEM9TIY679aB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzXFrKmG_ToxwhT-m14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx2xn19G9n3SYR_ub94AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugxck4zZhWBhAeCMUaR4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw-YV67KYkkryplsmR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw6SqTSLhqAOrEXs_t4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxya9-Tr7wBWfYxAPV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxSqmwtQM__W3C0eZF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwNJgSX0nU42Oqswqd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
```
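The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of turning such a response into a lookup keyed by comment ID, with value validation (the `ALLOWED` vocabulary is inferred from the samples on this page, not an authoritative code book, and `parse_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the sample records above
# (an assumption, not the tool's official code book).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "government", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "mixed", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}},
    rejecting any record with an out-of-vocabulary value."""
    lookup = {}
    for record in json.loads(raw):
        coding = {k: v for k, v in record.items() if k != "id"}
        for dim, value in coding.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{record['id']}: unexpected {dim}={value!r}")
        lookup[record["id"]] = coding
    return lookup

# Usage with a made-up comment ID:
raw = ('[{"id": "ytc_example", "responsibility": "distributed", '
       '"reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}]')
print(parse_codings(raw)["ytc_example"]["policy"])  # → regulate
```

Validating against a fixed vocabulary is worth the extra lines here: LLM coders occasionally emit labels outside the prompt's schema, and failing loudly at parse time is cheaper than discovering stray categories during analysis.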