Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
​@MrMichiel1983 Many AI researchers estimate a much shorter timeframe, likely in your lifetime. Check Nick Bostrom and others on this. Then couple that with the magnitude of the risk (extinction) from AI misalignment, and the priorities should become clear. Too many people don't seem to understand that AGI development will not stop once it reaches human level. It will blow past us exponentially. Be it in 2 years or 200.
youtube AI Responsibility 2023-11-07T20:2… ♥ 13
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgxwPElZVXSeRa-4WrF4AaABAg.9wmQmxxQMAY9wn9JXVYSLd","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgzGZ-KfndIdSmbe9HR4AaABAg.9wmPqx_6m5R9xLnCPeVbHa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxutE1MxXKGjVrKaaZ4AaABAg.9wmNOBAm_Te9wnezbZhhyT","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwHqeUcFeWwY_BeopZ4AaABAg.9wmFg-qbKG99wmqth3s71x","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwHqeUcFeWwY_BeopZ4AaABAg.9wmFg-qbKG99wp8MXJMg2A","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwHqeUcFeWwY_BeopZ4AaABAg.9wmFg-qbKG99wpREnqDVY5","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxmiDdYRLsKvrPR9nd4AaABAg.9wm9XQLn7-H9wnElBDtxrK","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxmiDdYRLsKvrPR9nd4AaABAg.9wm9XQLn7-H9wnb2kzWOS7","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxmiDdYRLsKvrPR9nd4AaABAg.9wm9XQLn7-H9wokmNKBcGk","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxmiDdYRLsKvrPR9nd4AaABAg.9wm9XQLn7-H9won39cv5Hk","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
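Turning a raw batch response like the one above into per-comment coding results can be sketched as follows. This is a minimal illustration, not the pipeline's actual code: the function name `index_by_id` and the strict-validation behavior are assumptions; the two sample records are copied verbatim from the raw response above, and the dimension names match the Coding Result table.

```python
import json

# Two records copied from the raw LLM response above.
raw = '''[
  {"id":"ytr_UgxwPElZVXSeRa-4WrF4AaABAg.9wmQmxxQMAY9wn9JXVYSLd","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwHqeUcFeWwY_BeopZ4AaABAg.9wmFg-qbKG99wmqth3s71x","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and key each record by its comment id,
    raising if the model omitted any of the expected dimensions."""
    out = {}
    for rec in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

coded = index_by_id(raw)
# Look up the coding for one comment by its id.
print(coded["ytr_UgwHqeUcFeWwY_BeopZ4AaABAg.9wmFg-qbKG99wmqth3s71x"]["policy"])
```

Keying by `id` lets each coded row on this page be matched back to the exact object the model emitted, which is the point of inspecting the raw response.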