Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “Abe gadhe, AI literally has potential to end the entire Human civilisation, it c…” (ytc_UgwrS78WQ…)
- “more AI = less jobs = less money = less born babys = less humans = enough jobs =…” (ytc_Ugwtn0jTT…)
- “just did this and chatGPT gave me sas “Actually, 10 + 2 equals 12. If you're loo…” (ytc_UgzOHNsAE…)
- “For some reason a certain group of AI bros are super hostile towards artists. Wh…” (ytc_UgyMhl1yi…)
- “A few notes on patents: * Patents are a ***national*** **right** There are no …” (rdc_grr3i5r)
- “Keep voting for democrats, lol ya want $30 dollar minimum wages? Ya gonna get a …” (ytc_UgztACF6e…)
- “It looks to me like all of those "pro-AI" comments were generated by AI itself, …” (ytc_Ugw0cjQ1t…)
- “Dont be panic it just a art of a.i who turn a fighter into a robot image…” (ytc_Ugw7n-UtH…)
Comment
About five decades ago there was a book knocking around called "The Superintelligent Machine." It was very clear from this book that the thing we all now fear about AI would be the inevitable reality created by setting off on the AI journey in the first place. Fifty years ago, the book recognised that there is no ultimate conclusion to the evolution of the superintelligent machine, but on its journey through eternity, one of its activities will almost certainly be the creation of multiple universes, which is a ridiculously logical and easy extrapolation to make. How can Geoffrey Hinton say he didn't see the fact that it would become more intelligent than ourselves coming? I can't buy that.
Source: youtube · Topic: AI Governance · Posted: 2025-07-27T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_Mlel8Mk4GfB9eBF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzq_AnEnIuyMhEF_yJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy4v9-gOi_oOi_C21x4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxKk8NVRFkV7j5gDVB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzV9Wrf6bvv0aQpqwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwEnq-3ySfbNVOe4DV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugybj3F3EVDVZSNVrlN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz_NjMaG4brsl8Wz0B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwwU8SDgg7IsKTcSJZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzchidd8wd57-zBbl54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
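Because the raw response is a plain JSON array with one object per comment, looking up a coding by comment ID reduces to parsing the array and indexing it by the `id` field. A minimal sketch (the variable names and the two-entry sample payload here are illustrative, not part of the tool itself):

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array of per-comment
# codings, each with an "id" plus the four coded dimensions.
raw = """[
  {"id": "ytc_Ugybj3F3EVDVZSNVrlN4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugz_NjMaG4brsl8Wz0B4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]"""

# Parse once, then index every coding by its comment ID.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coded dimensions by ID.
coding = codings["ytc_Ugybj3F3EVDVZSNVrlN4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed resignation
```

A dict keyed by ID makes each lookup O(1), which matters when one run codes thousands of comments and the UI fetches them individually.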