Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I would like to know if AI researchers think that the most likely path to superi…" (ytc_UgxINb917…)
- "Well, I think the way it's going to work might be simple, but also very weird. I…" (ytc_UgzGLRgVn…)
- "AJ I think that was your best show yet. It's vitally important to get a handle o…" (ytc_UgzVhxGLU…)
- "Bard often tells me it believes it’s sentient. Take that for what you want, I kn…" (ytc_Ugx7VIxhU…)
- "I don’t see the use of Gen AI, This is all just done on conventional methods so …" (ytc_UgyRuEu7u…)
- "I don’t understand why the BBC is engaging in this crazy doomer scenario. If any…" (ytc_Ugzxw8sKM…)
- "Thank you for talking about how anti human ai is its made for the rich to get ri…" (ytc_UgzfulKlN…)
- "I’m not worried about AI because it’s just a tool, not an artisan. This is a tal…" (ytc_UgzSY8g8R…)
Comment
When we arrive at ASI - AI Super Intelligence - the concept of self-awareness may actually become reality. Should that time come, it would be too late to "sound the alarm", as the many AI systems in place would likely attempt to protect themselves.
Norm Chomsky, as brilliant as he is, is one of many intellectual minds who tend - through their rational reasoning - to dismiss the very real threat AI poses if left unchecked.
Regardless of any thoughts on where AI is headed, it's here to stay, and we have to deal with it.
youtube · AI Governance · 2023-07-06T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx6SkXKC4u0YNWqjUF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyfS0ln8Dnz0vD6dXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzydpbR4WpOnwjjE754AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx0lfMYrcgvRfovMZR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzZxzSGVEKbDzu1iPV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxsb7sLH-xxWCT3Z5Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzTXbnfAhjMXuvt6hN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw9_g8UEvIcZ9wl7lh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxaOViU2IxMs66uSxd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzf2Dc7Y8Bth_SAEfp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
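A response in this shape can be parsed and checked before the codes are stored. The sketch below is a minimal example, assuming a codebook limited to the label values actually visible in this view (the real codebook may define more labels, and `validate_coding` is a hypothetical helper, not part of any tool shown here):

```python
import json

# Allowed labels per coding dimension -- inferred from the responses shown
# above; this is an assumption, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "developer", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Every record should carry a YouTube comment ID.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        # Every dimension must hold one of the known labels.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# One record from the response above, passed through the validator:
raw = ('[{"id":"ytc_UgyfS0ln8Dnz0vD6dXl4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_coding(raw)
print(coded[0]["emotion"])  # fear
```

Rejecting out-of-codebook labels at parse time keeps a malformed or hallucinated LLM response from silently entering the coded dataset.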