Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@studer4phish And she is most likely right about that. Because there are risks in AI, but those aren't "existential threats" and won't be in any reasonable future, no matter how many Russell's teapots Tegmark points at. Yes, Tegmark was the one who called out the emotion (fear) and made "extraordinary claims" that are not supported by evidence. Tegmark's use of petty rhetorical fallacies trying to support what it looks more like religious beliefs than scientifically founded arguments would exasperate any AI developer expecting to have a rational debate about the topic "Is AI research and development posing an existential threat?". Mitchell and LeCun were the voices of reason here. Tegmark disappointed me from the moment he started using fallacies like "most people think the same as me". Too bad for a thinker of his caliber. These kinds of short debates are doing people a disservice. There is no time (in a debate like this) to explain any point in sufficient detail and some debaters prefer to use cheap rhetorical tricks to "win" in the eyes of a lay audience. Hopefully, AI development won't be stopped and sooner or later these doomers of today will surely be another entry in that funny "pessimists archive" webpage that Yann mentioned. The alternative might be the return of the Dark Ages and the Inquisition trials (we can see a glimpse of this in the comment section of videos like this one). I would prefer a future that Yann could call "le Renaissance de l'IA".
youtube · AI Governance · 2023-06-26T19:1… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzA8QT364rRklCbe8h4AaABAg.9rQzm8IReHZ9rTDPnU4qLH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxdKk0e5QevyagELTd4AaABAg.9rQSVN8NXWi9rRVLoDbTEY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxdKk0e5QevyagELTd4AaABAg.9rQSVN8NXWi9r_mXTFqEfJ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxdKk0e5QevyagELTd4AaABAg.9rQSVN8NXWi9rwm2NwgDKR","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwFSM4Rc_lU9GQSZn54AaABAg.9rQIhtreUWB9rQzyhcTJ4C","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyp6eDFtD6nxv6YOD54AaABAg.9rQIaATURok9rRBACU09i_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugyp6eDFtD6nxv6YOD54AaABAg.9rQIaATURok9rRGmtg2qH0","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgykRfsieqhf-rMm-5N4AaABAg.9rQ6xZAdlJU9sH9cOdqCh2","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgykRfsieqhf-rMm-5N4AaABAg.9rQ6xZAdlJU9sHASf8MguG","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugz8xg_TAUp50sGdgEh4AaABAg.9rPvpEz94vU9rQx5E7S77M","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
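A raw response in this shape can be validated and summarized with a few lines of Python. This is a minimal sketch, not the tool's actual pipeline: the four dimension names and their presence on every record are taken from the response above, but the sample records (with placeholder ids) and the allowed-label sets are illustrative assumptions.

```python
import json
from collections import Counter

# The four coding dimensions, as seen in the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Illustrative sample in the same shape as the raw LLM response
# (ids shortened; real ids look like "ytr_Ugz...").
raw = """[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

codes = json.loads(raw)

# Validate that every record carries an id plus all four dimensions.
for row in codes:
    assert "id" in row, f"record missing id: {row}"
    for dim in DIMENSIONS:
        assert dim in row, f"record {row['id']} missing dimension {dim!r}"

# Tally labels per dimension across all coded comments.
tallies = {dim: Counter(row[dim] for row in codes) for dim in DIMENSIONS}
for dim, counts in tallies.items():
    print(dim, dict(counts))
```

On the two sample records this prints one line per dimension, e.g. `responsibility {'none': 1, 'company': 1}`; run against the full response above, the same loop would surface how often each label was assigned across the batch.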