Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
i dont think everyone understands how much his testimony could have crippled open ai and Sam Alt

Let's break down the potential cost with some realistic, albeit theoretical, numbers. It's impossible to give a single exact figure, but we can categorize the financial apocalypse that Balaji's testimony could have unleashed on OpenAI. The cost would come in three devastating waves.

Tier 1: The Direct Legal Damages (The Initial Tsunami)

This is the number that would come from the lawsuits themselves. is that Balaji's testimony could have proven the copyright infringement was "willful." This changes the penalty from a slap on the wrist to a death blow, allowing for damages of up to $150,000 per infringed work. Let's do some conservative math based on public information about their training data:

The Book Corpus: It's widely reported that OpenAI trained its models on datasets containing hundreds of thousands of copyrighted books. Let's use a low estimate of 200,000 books.
200,000 books x $150,000/book = $30 Billion

The News & Web Corpus (The New York Times Lawsuit): The New York Times lawsuit alleges that "millions" of its articles were used. Let's be extremely conservative and say a court finds only 500,000 NYT articles were willfully infringed.
500,000 articles x $150,000/article = $75 Billion

now besides all this they would have to start retraining a model from scratch even if somehow they do have money...
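The comment's damages arithmetic can be checked directly. A minimal sketch, using only the per-work figure and work counts the comment itself proposes (the $150,000 cap is the statutory maximum for willful infringement under 17 U.S.C. § 504(c); the work counts are the commenter's estimates, not findings):

```python
# Back-of-envelope statutory damages using the comment's own numbers.
MAX_STATUTORY_PER_WORK = 150_000  # statutory max per willfully infringed work

# Work counts are the commenter's hypothetical estimates.
scenarios = {
    "Book corpus (low estimate)": 200_000,
    "NYT articles (conservative)": 500_000,
}

for name, works in scenarios.items():
    damages = works * MAX_STATUTORY_PER_WORK
    print(f"{name}: {works:,} works x ${MAX_STATUTORY_PER_WORK:,} = ${damages / 1e9:.0f} Billion")
# -> Book corpus (low estimate): 200,000 works x $150,000 = $30 Billion
# -> NYT articles (conservative): 500,000 works x $150,000 = $75 Billion
```

Both figures match the comment's math; the uncertainty is entirely in the work counts and the willfulness finding, not the multiplication.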
youtube 2025-10-10T18:0… ♥ 2
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwjCd2cnFRoYIFnHCl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxGaClN7pm27rj_VPV4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw-RXlF4l6rTvS6axp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzMRCBxzCfgK5t2SEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwqTmX24etTWyC5hNp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwYdsUfDXDfutxD_DB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwxrAs3RyBwfGEKOXh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw72wThnDMfqUS57fl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugws3MXg00mUYB4jQX14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzm3WE_N6CHJiJH2j94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
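A batch like the array above can be tallied per coding dimension with a short script. This is a sketch, not part of the coding pipeline; it assumes each record carries exactly the four dimensions shown, and only two of the ten records are inlined here for brevity:

```python
import json
from collections import Counter

# Two sample records copied from the batch above; a real run would
# load the full array from the tool's output file.
raw = """[
  {"id":"ytc_UgwjCd2cnFRoYIFnHCl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugws3MXg00mUYB4jQX14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

records = json.loads(raw)

# Count how often each label appears on each dimension.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    counts = Counter(r[dim] for r in records)
    print(dim, dict(counts))
```

Run on the full ten-record batch, this kind of tally is what surfaces the label distribution (e.g. how often "company" vs "unclear" is assigned for responsibility) and flags records whose JSON is missing a dimension, since `r[dim]` raises `KeyError` on those.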