Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "We must be in the dark timeline that instead of automating menial labour we auto…" (ytc_UgyWN2Cpd…)
- "I'm very pessimistic nowadays about the affects of AI art towards artists. I wil…" (ytc_UgzwpxWEf…)
- "Trigger warning Super unpopular opinion abt the questions at the end...1 telegr…" (ytc_UgzPpty4J…)
- "A big issue is that the ppl making these decisions are the old dinosaurs who don…" (ytc_Ugwqv8U6u…)
- "Okay, so what is your proof of these AI's truly being "Jailbroken?" Just baseles…" (ytc_UgykXpjuS…)
- "Purely a case of you get what you put in. You expect to get a image even remotel…" (ytc_UgyTuw0P9…)
- "AI has no heart therefore no real emotions or feelings therefore not sentient. N…" (ytr_UgxGDe1X0…)
- "@JustSenixFar from it infact China has banned Tesla from using the name "Full S…" (ytr_UgxK9hEVP…)
Comment
And even with that, the AI risk within the current paradigm and state of the art - is *considerably* higher than asteroid impact risk. If we make AGI without *concrete* control mechanisms (which we are nowhere near figuring out) - the doom is approaching 100%. Its a default outcome, unless we figure the control out. All the positions that this risk is less than 100% (people like Paul Christiano, Carl Shulman at ~50%, or Stuart Russel, Ilya Sutskever at ~10-20%) - hinge that we figure it somehow, down the road. But so far there is no solution.
And now that all of AI experts see the pace, they come to realization that it won't be someone else problem - it might impact them as well. LeCun is the only hold out, but I think only in public. He knows the risk and just want to take it anyway - for some personal reasons I guess.
youtube · AI Governance · 2023-06-27T19:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxGaW9p18AEp5IotE94AaABAg.9rcov6TyeMk9sG0KIycBlk","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz-I_5z2MH1F-xN_bt4AaABAg.9ra7EGbrpwC9s-nH-OgPNs","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugz-I_5z2MH1F-xN_bt4AaABAg.9ra7EGbrpwC9tOj5W4zn70","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzUXE2d9iCAiRPKfyN4AaABAg.9rYaYPPO-7MA72SGK_7Aqz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugy8fQDWMBP-0LRsOAB4AaABAg.9rTdNN4aeVh9rY9ZyGIH8X","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugy8fQDWMBP-0LRsOAB4AaABAg.9rTdNN4aeVh9ra3e-kPDb_","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"fear"},
{"id":"ytr_Ugz8-e_-RYQnkl5h3MN4AaABAg.9rT0qF094RK9rTtn13CD3K","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxcuDNaybYEsp5vnLZ4AaABAg.9rT-uSfjN9d9rTLwqFhvnA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwH-6hm87UtoueFPWt4AaABAg.9rSv5z5xe2O9rWNAl3PZiG","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwH-6hm87UtoueFPWt4AaABAg.9rSv5z5xe2O9rsZ4ItlwZY","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
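The raw LLM response above is a JSON array of per-comment codings, each carrying the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) keyed by comment ID. A minimal sketch of the "look up by comment ID" operation, assuming only this JSON shape; the IDs and variable names here are illustrative placeholders, not real comment IDs:

```python
import json

# Hypothetical raw LLM response in the same shape as the one shown above.
# The comment IDs here are made up for illustration.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the codings by comment ID so individual comments can be inspected.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment; raises KeyError if absent."""
    return codings[comment_id]

print(lookup("ytr_example1")["emotion"])  # -> fear
```

Indexing into a dict first makes repeated ID lookups O(1) instead of scanning the array each time, which matters if a batch response codes many comments.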