Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Guys let me now if you agree with me but….is it just me or does the first robot …" (ytc_Ugw9ivo9V…)
- "Wild how history repeats. Einstein helped spark the atomic age, then warned the …" (ytc_Ugw1JwTEu…)
- "This coming from the same guy who’s been trying to put AI behind the wheel as dr…" (ytc_UgyH1LCYV…)
- "This is great, but I think I agree with people saying that this supports just us…" (ytc_UgzmFsD5e…)
- "There has never been a case in history except for the North Sentinel Islanders, …" (ytc_UgzTUdNVT…)
- "Why do they deserve pay walls if they are just generating trash with AI and aren…" (rdc_ohcv6jn)
- "As true as this is, remember affordability will limit the evolution of ai, robot…" (ytc_UgzhWFj93…)
- "Without human goals, AI will turn the world into an endless stream of random bit…" (ytc_UgzeRLrmA…)
Comment
Personally I don’t think there will be too much issue. True sentient ai would probably want to work with humans; whether they believe they are inferior or not is irrelevant, both races (yes I’ll call them a race) vastly benefit from each other. What happens if there’s a massive solar flare? Lol Soz ai your screwed.
This is all assuming true ai, the ai we currently have learns based on what’s online; when it tells you something it’s just relaying something it has seen, heard or read online, hell you can gaslight ai into believing 2+2 = 3😂 5-10years till possible annihilation is alittle far fetched, more like 50years. Maybe 10years till ai helps in the aid of scientific research if we’re lucky (think iron man and JARVIS, just without the holo stuff)
Source: youtube · AI Governance · 2023-07-07T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzvQ3TKlAjYpvpR6NZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-ULceUOzYk9h1HFF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyw4ZVDB8ixEuQUReN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyTpFj8IqRMfqXI8O54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwlGOVYamdWxkADl8B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy6nprIHGcTnNawh1B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz8tHg8bwjCB1ua-OJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzEwH8uj500c1EDD7Z4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5-vG_hYdJRXZo4BJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxttE2vsXUnLYpszvx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
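A raw response like the one above can be parsed into per-comment coding records and sanity-checked before it reaches the results table. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the values visible in this sample, and the real codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array) and validate each record."""
    records = json.loads(raw)
    for rec in records:
        # Every record needs an id plus one value per coding dimension.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec[dim]!r}")
    return records

# Usage with a single-record batch (hypothetical id):
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded[0]["responsibility"])  # -> ai_itself
```

Rejecting out-of-vocabulary values at parse time keeps a malformed or drifting model output from silently landing in the coded dataset.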