Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Nick Land's techno-nihilist, anti-humanist philosophy has had a big impact on th…" (ytc_UgygUKG5-…)
- "I’m a medical professional, sometimes when treating patients… you have to go out…" (ytc_UgwBz4Q_V…)
- "It's not even just the knowledge of it being generated. Often the AI art itself …" (ytc_UgwxQKEjg…)
- "Basically, we need a civilisation collapse (grilled grid) before super AI sees t…" (ytc_Ugy4h4BJR…)
- "Lol BINGs chat bot, also does not like Microsoft LUL. Bing chat bot joined the …" (ytc_UgykeuvV2…)
- "I mean, ok but this is overlooking less fanciful issues starting humanity down r…" (ytc_Ugx2Nctfc…)
- "Robots will understand and have emotions, meaning they can get mad, they can bec…" (ytc_Uggt5EwAZ…)
- "I also feel like AI tools pull from a small pool of like the same dozen artists …" (ytc_Ugzwc5Ube…)
Comment
"We survive even though nature is 'smarter' than us"
We only survive in nature in one single place (and time) in the universe, the place we evolved to survive in. We wouldn't be around to discuss how survivable nature is for us if we weren't in this rare pocket of it. You could also say we survive by the grace of nature not in spite of it, but nature is still 99.99999999...% deadly to us.
If AI is like nature, then the chances are it will not arrange the universe to be suitable for us, just like how all of the universe is deadly to us, simply by trying to achieve its goals.
youtube · AI Governance · 2024-11-20T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx6qIp5l_aI9AfElqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy8QpWKbxQ9n7H_aAx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx7QO9apJwSwFJJDFd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy_7LlslQK-jD28V2J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy0yNh2c8WMFRwdpqJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEHcmpJN4NHiXUwF14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugymnp8X3WB_5uO6CpF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwcGbsvaLER3kwY4m54AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx2VcfgXyVMdW6L8P94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw5nG423IJW_SdQh0V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]