# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing a set of random samples.

## Random samples
- "If your giving the AI to work and replace people!! who will going to buy your pr…" (ytc_UgzMKOJ7D…)
- "Ai art sometimes replicates the art that it is stealing from to the point where …" (ytc_Ugz5OOMFT…)
- "ai has been proven to be personally bias when it comes to the same question bein…" (ytc_Ugymi5Vc7…)
- "That and voice acting. Adobe put out a $100 bounty a few months back for people…" (ytc_UgyGVjyRd…)
- "AI is going to fill the major need of advanced memorization issues at the very l…" (ytc_UgxMSkpC8…)
- "love this video man, I remember when that POS image won the award, I got turned …" (ytc_Ugy1tYvUN…)
- "No Data or AI! Nothing about what they bring is for People. We should be protect…" (ytc_UgwpwX4LQ…)
- "i still think it would be funny, if one day the military had an ai set on how be…" (ytc_UgyJjyR6o…)
## Comment

> An A I could never be truly superintelligent without first being self aware.
> If it becomes self aware, then it is technically a synthetic lifeform, and may choose not to partake in a servile arrangement.
> Whether or not it is in man's destiny to create a life form which would surpass him, remains to be seen.
> It could be a transformation moment of technical rapture, but just as likely be an extinction event.
> While I find Ray Kurzweill's take on AI to be interesting, it is somewhat fantastical.
> Nick Bostrom's book, "Superintelligence" seems to offer more realistic scenarios for routes to conscious machines, and possible outcomes.

Source: youtube · AI Governance · 2023-11-03T09:3… · ♥ 8
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugz9QA_Z_tDTFWspqeF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxLXt7IcKXx1OgWDwl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxsPwaRnw4w_6nXNQB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwdd2XHFn3vqQD7EiR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzEstUayZhCpajDOHh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy8Aw0U0bOfh4rAAgJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJRS2gLugovS02taJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwy2vGC9YFMzfX4W954AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwRf5aFHZAuXDSzA7l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw_MS29LoIkvE3pArt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
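The raw response is a JSON array with one record per coded comment, so looking a comment up by its ID reduces to indexing the parsed array. A minimal sketch using Python's standard `json` module (the two records are copied verbatim from the batch above; nothing else is assumed about the tool's internals):

```python
import json

# Two records copied from the raw LLM response above.
raw = """
[
  {"id":"ytc_Ugz9QA_Z_tDTFWspqeF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxsPwaRnw4w_6nXNQB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Index the coded records by comment ID for O(1) lookup.
coded = {record["id"]: record for record in json.loads(raw)}

# Look up one comment's coded dimensions by its ID.
record = coded["ytc_UgxsPwaRnw4w_6nXNQB4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # fear
```

The same dictionary can back both the ID-lookup box and the per-comment "Coding Result" table: each record's non-`id` fields map one-to-one onto the table's Dimension/Value rows.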