Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment (truncated) | ID |
|---|---|
| Dude, you are going to love when AI is going for the politicians, they ARE HUMAN… | ytc_UgwN4_K16… |
| Printing a copyrighted character onto a shirt _by hand_ usually won't get you pr… | ytc_UgwNSCBB-… |
| They skipped “Nu” for obvious reasons as it’s difficult to say Nu variant withou… | rdc_hm7z4kq |
| This AI researcher needs to get his head out of his own ass. Wants to capitalize… | ytc_UgyrRNmLA… |
| By that measure of sentience, ChatGPT, Bing, Claude, and even the Llama LLMs all… | rdc_jp7l0c5 |
| Musk is a complete moron when it comes to AI and has no idea how it actually wor… | ytc_UgyJlzl5Z… |
| how has it gone the other direction? Last time I checked nobody is buying AI mu… | ytc_UgwtJND6-… |
| Goofballs like Musk don’t know if AI will kill us all. However, they do believe … | ytc_Ugwr_A4Sp… |
Comment
Interesting since there's actually only 3 levels of AI. The point of 'Singularity' only refers to exceeding human intelligence. We're at the point of ANI, but AGI should be reached around 2030~2040 and ASI is a very exponential thing when an AGI can build its own intelligence and unpredictably fast periods of time. Certainly by 2100 humans will be like pets to ASIs.
youtube
AI Governance
2023-10-31T16:2…
♥ 13
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgzxvhtSiJtPX4dJYbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFChThDBIX0szeK-p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzA84nk_p4DOzwNA6l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6YBofY5mvcLpS6H94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8r84v_JpAF_1bOpN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxMFVKReyLxjHHf29t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqTy9pbl9j2Ljdd9F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8XXbHD-Yp5IpBHFh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqkHRVOyzRe3dW7054AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxE39q66FlihxuyOLl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
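The raw response is a JSON array with one coded record per comment ID, which is what makes the lookup-by-ID view at the top of this page possible. A minimal sketch of that lookup in Python, assuming the response parses as shown above (the function name `lookup_coding` is illustrative, and the single record is copied from the last entry of the response, whose values match the Coding Result table):

```python
import json

# Raw model output: a JSON array, one coded record per comment.
# This single record is copied from the raw response shown above.
raw_response = """[
  {"id": "ytc_UgxE39q66FlihxuyOLl4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model output and return the coded dimensions for one ID."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None  # ID not present in this batch

row = lookup_coding(raw_response, "ytc_UgxE39q66FlihxuyOLl4AaABAg")
print(row["emotion"])  # resignation
```

In practice the model output would first be validated (well-formed JSON, known dimension values) before being displayed, since LLM responses are not guaranteed to parse.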