Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgzC7uq8m…: Shad Brooks, of all people, having the audacity to call ghibli's art style plain…
- ytc_UgyU2HLbS…: I use AI to assist with code, but it messes a lot of stuff up, even with extreme…
- ytc_UgxelKkSp…: Load of rubbish. It's not going to "take out loads of white collar jobs"... It's…
- ytc_Ugw50tZd_…: LLMs run of a dataset compiled from the internet. The dataset is corrupt from th…
- ytc_Ugwxq3Iqg…: I want to see a self driving Garbage truck in my neighborhood with all the obsta…
- ytc_Ugydtq6Y9…: AI chat customer service ("virtual assistants") and automated phone menus are bo…
- ytc_UgzWGnkC3…: I'm still wondering if superintelligence is possible. That aside, this conversat…
- ytr_UgxpWOSpY…: Its not actually, there talking about the Google CEO Sundar Pichai who's been qu…
Comment
A lot of people are afraid of Super AI as "what if we lost control of it?" I'm way more afraid of SAI as "what if a small group of people get control of it?"
SAI will allow its owners to:
- create computer viruses impossible to detect and get rid of
- be unbeatable at predicting finance
- create virtual personalities indistinguishable from actual people
- create artificial records indistinguishable from real ones
Those capabilities will allow the owners of the AI to basically control everything: science, politics, laws, finance, industry...
No surprise there is a race for it, and I'm not looking forward to when it will happen (which will be right after AGI and not very visible...)
YouTube
AI Moral Status
2025-11-01T00:0…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz6idoqSOMT011KrQ54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6do0hhd3IvUePHQ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNEnzhSgveLp2ys-d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRGPaiFomrZXKI3Fl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxFSvkAbdRnDfCWTIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1sIOM6nQI3GAX3UR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzVouUmDwYZPSfiL7h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUgjrdxau62lzsYJ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_UfQv7wTVXnaSLJB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnWgz4MBt3ZkNVAj14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
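A batch response like the one above can be parsed and sanity-checked before it is written back to the coding store. The sketch below is a minimal example, not the tool's actual ingestion code; the allowed category values are inferred only from the labels visible on this page, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the labels visible on this
# page (assumption -- the full codebook may contain more categories).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded rows) and
    index the rows by comment ID, rejecting any out-of-codebook value."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID makes the "Look up by comment ID" view above a plain dictionary access, and the validation step catches the common failure mode where the model invents a label outside the codebook.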