Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Its fine if they let the AI and robots take over the jobs, but they also gotta g…" (ytc_UgzVJE7qd…)
- "THE PERVERSE EGO OF BILL GATES ONCE AGAIN FORCING AMERICANS INTO SOMETHING NOT G…" (ytc_UgxwCdxLw…)
- "1:04:33 for one thing we need to be smarter about the prompts we give it, "you a…" (ytc_Ugzxn4xso…)
- "If taking art for the use of AI is not copyright infringement equivalent to trac…" (ytc_UgwNZ7RGL…)
- "I saw this coming from a literally year away. Before he was even elected. Cause …" (ytc_UgzZXlzoD…)
- "AI should be prohibited from copying the work and styles of artists while they'r…" (ytc_UgwfsEjkJ…)
- "Using A.I as a tool is one thing but you can never really use it to replace huma…" (ytc_Ugw2vvguj…)
- "Nothing in this particular universe acts wholly independently of itself. Ai will…" (ytc_UgztvylDM…)
Comment
The only way that AI could reach the extent that Ben talked about is if it somehow found a way to disconnect itself from its main server, and upload itself (through some sort of means we couldn't possibly comprehend) onto the internet.
Every "AI" that has been created always had some sort of physical center that could be interacted with. It's entirely possible for AI to reach the "World War III" phase, but we're talking about it being so far into the future, it would more likely be World War IIX.
It's an interesting rabbit hole, but the reality is that we're nowhere close to that point. So far, the only "AI" we've made hasn't been actual artificial intelligence. Everything that has been built has either been pre-programmed to do certain tasks based on if/then scenarios, or pre-programmed to pull different bits of information off the internet (up to a certain cutoff date) to fulfill a request.
youtube · AI Governance · 2024-11-11T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxPicgdQkGU7XxpkB94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzMoZ5YQlyq1nGYm-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0Jz9ArbU0XyZiqZZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwh6YfOSVOsDoXw5l94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgySrwFK1dRWxKgSQZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzob57XJ8e7z-IlUzl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw12tN_1FySu4NVdVV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwr04e4e6CzZppQ5xt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGUYoW9oAXlN69QVx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugydv2wzbCsUiummGgx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
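The raw response above is a JSON array with one record per comment, each carrying the four coded dimensions shown in the result table. A minimal sketch of how such an output could be parsed, validated, and indexed by comment ID (the `RAW_RESPONSE` literal is a two-record excerpt of the response above; the allowed category values are inferred from the codes visible here, so the real codebook may define more):

```python
import json

# Shortened excerpt of the raw LLM response shown above.
RAW_RESPONSE = """[
{"id":"ytc_UgxPicgdQkGU7XxpkB94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw12tN_1FySu4NVdVV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Allowed values per dimension, inferred from the codes visible on this page;
# the actual coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference"},
}

def index_codes(raw: str) -> dict:
    """Parse the model output, validate each record, and index it by comment ID."""
    indexed = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id', '?')}: bad {dim} = {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_Ugw12tN_1FySu4NVdVV4AaABAg"]["emotion"])  # -> indifference
```

Indexing by ID is what makes the "look up by comment ID" view possible: each displayed coding result is just the record whose `id` matches the selected comment.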