Negotiation Analysis
Author: Howard Raiffa. Belknap Press, 2007-03
This masterly book substantially extends Howard Raiffa's earlier classic, "The Art and Science of Negotiation." It does so by incorporating three additional supporting strands of inquiry: individual decision analysis, judgmental decision making, and game theory. Each strand is introduced and used in analyzing negotiations. The book starts by considering how analytically minded parties can generate joint gains and distribute them equitably by negotiating with full, open, truthful exchanges. The book then examines models that disengage step by step from that ideal. It also shows how a neutral outsider (intervenor) can help all negotiators by providing joint, neutral analysis of their problem. Although analytical in its approach--building from simple hypothetical examples--the book can be understood by those with only a high school background in mathematics. It therefore will have a broad relevance for both the theory and practice of negotiation analysis as it is applied to disputes that range from those between family members, business partners, and business competitors to those involving labor and management, environmentalists and developers, and nations.
Introduction to Statistical Decision Theory
Author: John Pratt / Howard Raiffa. The MIT Press, 2008-01
The Bayesian revolution in statistics - where statistics is integrated with decision making in areas such as management, public policy, engineering, and clinical medicine - is here to stay. Introduction to Statistical Decision Theory states the case and, in a self-contained, comprehensive way, shows how the approach is operational and relevant for real-world decision making under uncertainty. Starting with an extensive account of the foundations of decision theory, the authors develop the intertwining concepts of subjective probability and utility. They then systematically and comprehensively examine the Bernoulli, Poisson, and Normal (univariate and multivariate) data-generating processes. For each process they consider how prior judgments about the uncertain parameters of the process are modified given the results of statistical sampling, and they investigate typical decision problems in which the main sources of uncertainty are the population parameters. They also discuss the value of sampling information and optimal sample sizes given sampling costs and the economics of the terminal decision problems. Unlike most introductory texts in statistics, Introduction to Statistical Decision Theory integrates statistical inference with decision making and discusses real-world actions involving economic payoffs and risks. After developing the rationale and demonstrating the power and relevance of the subjective decision approach, the text also examines and critiques the limitations of the objective, classical approach.
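The prior-to-posterior updating the authors describe can be sketched for the Bernoulli process, where the Beta family is conjugate to the likelihood; the prior and the sample below are hypothetical numbers chosen for illustration:

```python
# Beta-Bernoulli updating: a Beta(a, b) prior on the unknown success
# probability p, combined with s successes in n Bernoulli trials,
# yields a Beta(a + s, b + n - s) posterior. Numbers are hypothetical.

def update_beta(a, b, successes, n):
    """Return the posterior Beta parameters after n Bernoulli trials."""
    return a + successes, b + (n - successes)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Vague prior: Beta(1, 1) is uniform on [0, 1].
a, b = 1.0, 1.0
# Observe 7 successes in 10 trials.
a, b = update_beta(a, b, successes=7, n=10)

print(beta_mean(a, b))  # posterior mean = 8/12 ≈ 0.667
```

The same conjugate pattern (prior parameters plus sufficient statistics) is what the book works out in detail for the Poisson and Normal processes as well.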
The Usefulness of Useless Knowledge
Author: Abraham Flexner. Princeton University Press, 2017-02
A short, provocative book about why "useless" science often leads to humanity's greatest technological breakthroughs

A forty-year tightening of funding for scientific research has meant that resources are increasingly directed toward applied or practical outcomes, with the intent of creating products of immediate value. In such a scenario, it makes sense to focus on the most identifiable and urgent problems, right? Actually, it doesn't. In his classic essay "The Usefulness of Useless Knowledge," Abraham Flexner, the founding director of the Institute for Advanced Study in Princeton and the man who helped bring Albert Einstein to the United States, describes a great paradox of scientific research. The search for answers to deep questions, motivated solely by curiosity and without concern for applications, often leads not only to the greatest scientific discoveries but also to the most revolutionary technological breakthroughs. In short, no quantum mechanics, no computer chips.

This brief book includes Flexner's timeless 1939 essay alongside a new companion essay by Robbert Dijkgraaf, the Institute's current director, in which he shows that Flexner's defense of the value of "the unobstructed pursuit of useless knowledge" may be even more relevant today than it was in the early twentieth century. Dijkgraaf describes how basic research has led to major transformations in the past century and explains why it is an essential precondition of innovation and the first step in social and cultural change. He makes the case that society can achieve deeper understanding and practical progress today and tomorrow only by truly valuing and substantially funding the curiosity-driven "pursuit of useless knowledge" in both the sciences and the humanities.
Film History
Author: Kristin Thompson / David Bordwell. McGraw-Hill Higher Education, 2002-08
Written by two leading film scholars, "Film History: An Introduction" is a comprehensive survey of film - from the backlots of Hollywood, across the United States, and around the world. As in the authors' bestselling "Film Art", concepts and events are illustrated with actual frame enlargements, giving students more realistic points of reference than competing books that use publicity stills.
Hedge Fund Market Wizards
Author: Jack D. Schwager. John Wiley & Sons, 2012-05. Alternate title: Hedge Fund Market Wizards: How Winning Traders Win
"Five Market Wizard Lessons" by Jack Schwager, author of Hedge Fund Market Wizards
Hedge Fund Market Wizards is ultimately a search for insights to be drawn from the most successful market practitioners. The last chapter distills the wisdom of the 15 skilled traders interviewed into 40 key market lessons. A sampling is provided below:
1. There Is No Holy Grail in Trading
Many traders mistakenly believe that there is some single solution to defining market behavior. Not only is there no single solution to the markets, but those solutions that do exist are continually changing. The range of methods used by the traders interviewed in Hedge Fund Market Wizards, some of which are even polar opposites, is a testament to the diversity of possible approaches. There are a multitude of ways to be successful in the markets, though they are all hard to find and achieve.
2. Don't Confuse the Concepts of Winning and Losing Trades with Good and Bad Trades
A good trade can lose money, and a bad trade can make money. Even the best trading processes will lose a certain percentage of the time. There is no way of knowing a priori which individual trade will make money. As long as a trade adheres to a process with a positive edge, it is a good trade regardless of whether it wins or loses, because if similar trades are repeated multiple times, they will come out ahead. Conversely, a trade that is taken as a gamble is a bad trade regardless of whether it wins or loses, because over time such trades will lose money.
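The positive-edge reasoning above can be sketched numerically; the win rate and payoffs below are hypothetical:

```python
import random

# A "good trade" can lose: this hypothetical process loses 60% of the
# time, but wins pay +2R while losses cost -1R, so the expected value
# per trade is 0.4 * 2 - 0.6 * 1 = +0.2R. Repeating it comes out ahead
# even though most individual trades lose.

def expected_value(win_rate, win_r, loss_r):
    """Theoretical edge per trade, in R units."""
    return win_rate * win_r - (1 - win_rate) * loss_r

def simulate(win_rate, win_r, loss_r, n_trades, seed=0):
    """Average realized R per trade over many repetitions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trades):
        total += win_r if rng.random() < win_rate else -loss_r
    return total / n_trades

edge = expected_value(0.4, 2.0, 1.0)
print(f"theoretical edge: {edge:+.2f}R per trade")
print(f"average over 100,000 trades: {simulate(0.4, 2.0, 1.0, 100_000):+.2f}R")
```

Any single trade from this process that loses is still a good trade by Schwager's definition: the process, not the outcome, is what had the edge.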
3. The Road to Success Is Paved with Mistakes
Ray Dalio, the founder of Bridgewater, the world's largest hedge fund, strongly believes that learning from mistakes is essential to improvement and ultimate success. Each mistake, if recognized and acted upon, provides an opportunity for improving a trading approach. Most traders would benefit by writing down each mistake, the implied lesson, and the intended change in the trading process. Such a trading log can be periodically reviewed for reinforcement. Trading mistakes cannot be avoided, but repeating the same mistakes can be, and doing so is often the difference between success and failure.
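The trading log described above could be as simple as the following sketch; the class names, fields, and sample entry are hypothetical illustrations, not anything prescribed by the book:

```python
from dataclasses import dataclass, field

# A minimal trade-mistake journal: each entry records the mistake, the
# implied lesson, and the intended change to the trading process, so
# the log can be periodically reviewed for reinforcement.

@dataclass
class MistakeEntry:
    date: str
    mistake: str
    lesson: str
    process_change: str

@dataclass
class TradingLog:
    entries: list = field(default_factory=list)

    def record(self, entry: MistakeEntry):
        self.entries.append(entry)

    def review(self):
        """Periodic review for reinforcement: print every lesson."""
        for e in self.entries:
            print(f"{e.date}: {e.lesson} -> {e.process_change}")

log = TradingLog()
log.record(MistakeEntry("2012-05-01",
                        "Entered out of impatience, no setup",
                        "Impatience produces dubious trades",
                        "Require a written setup before entry"))
log.review()
```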
4. The Importance of Doing Nothing
For some traders, the discipline and patience to do nothing when the environment is unfavorable or opportunities are lacking is a crucial element in their success. For example, despite making minimal use of short positions, Kevin Daly, the manager of the Five Corners fund, achieved cumulative gross returns in excess of 800% during a 12-year period when the broad equity markets were essentially flat. In part, he accomplished this feat by having the discipline to remain largely in cash during negative environments, which allowed him to sidestep large drawdowns during two major bear markets. The lesson is that if conditions are not right, or the return/risk is not sufficiently favorable, don't do anything. Beware of taking dubious trades out of impatience.
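As a back-of-the-envelope check on the figure above (treating "in excess of 800%" as exactly 800% for illustration):

```python
# A cumulative gross return of 800% over 12 years means the account
# grew to 9x its starting value, so the implied compound annual growth
# rate is 9**(1/12) - 1.

def annualized(cumulative_return, years):
    """Compound annual growth rate implied by a cumulative return."""
    multiple = 1.0 + cumulative_return
    return multiple ** (1.0 / years) - 1.0

cagr = annualized(8.0, 12)   # 800% cumulative over 12 years
print(f"implied CAGR: {cagr:.1%}")  # roughly 20% per year
```

Compounding at roughly 20% a year while the broad market went nowhere is what makes the "do nothing when conditions are wrong" discipline so striking.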
5. Volatility and Risk Are Not Synonymous
Low volatility does not imply low risk, and high volatility does not imply high risk. Investments subject to sporadic large risks may exhibit low volatility if a risk event is not present in the existing track record. For example, the strategy of selling out-of-the-money options can exhibit low volatility if there are no large, abrupt price moves, but is at risk of open-ended losses in the event of a sudden, steep selloff. On the other hand, traders such as Jamie Mai, the portfolio manager for Cornwall Capital, will exhibit high volatility because of occasional very large gains (not a factor that most investors would associate with risk or even consider undesirable), but will have strictly curtailed risk because of the asymmetric structure of their trades. So some strategies, such as option selling, can have both low volatility and large, open-ended risk, and some strategies, such as Mai's, can have both high volatility and constrained risk.
As a related point, investors often make the mistake of equating manager performance in a given year with manager skill. Sometimes, more skilled managers will underperform because they refuse to participate in market bubbles. The best performers during such periods are often the most imprudent rather than the most skilled managers. Martin Taylor, the portfolio manager of the Nevsky Fund, underperformed in 1999 because he thought it was ridiculous to buy tech stocks at their inflated price levels. This same investment decision, however, was instrumental to his large outperformance in subsequent years when these stocks witnessed a prolonged, massive decline. In this sense, past performance can sometimes even be an inverse indicator.
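The volatility/risk distinction in lesson 5 can be made concrete with a hypothetical option-selling-style return stream: volatility measured on the pre-event track record says nothing about the open-ended downside. All numbers below are invented for illustration.

```python
import statistics

# Hypothetical monthly returns for an option-selling-style strategy:
# small steady gains until a single steep selloff. The track record
# *before* the risk event shows low volatility; the event reveals the
# downside that volatility never measured.
track_record = [0.012, 0.008, 0.011, 0.009, 0.010, 0.012,
                0.009, 0.011, 0.010, 0.008, 0.012, 0.009]
blow_up = -0.40

vol_before = statistics.stdev(track_record)
vol_after = statistics.stdev(track_record + [blow_up])
print(f"volatility before the event: {vol_before:.4f}")
print(f"volatility including it:     {vol_after:.4f}")
print(f"worst month:                 {blow_up:+.2f}")
```

An allocator looking only at the pre-event standard deviation would rate this strategy as very safe, which is exactly the mistake the lesson warns against.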
Principles of Data Integration
Author: AnHai Doan / Alon Halevy / Zachary Ives. 2012-07
How do you approach answering queries when your data is stored in multiple databases that were designed independently by different people? This is the first comprehensive book on data integration, written by three of the most respected experts in the field. It provides an extensive introduction to the theory and concepts underlying today's data integration techniques, with detailed instruction in their application and concrete examples throughout to explain the concepts. Data integration is the problem of answering queries that span multiple data sources (e.g., databases, web pages). Data integration problems surface in multiple contexts, including enterprise information integration, query processing on the Web, coordination between government agencies, and collaboration between scientists. In some cases, data integration is the key bottleneck to making progress in a field. The authors provide a working knowledge of data integration concepts and techniques, giving you the tools you need to develop a complete and concise package of algorithms and applications. The book offers a range of data integration solutions, enabling you to focus on what is most relevant to the problem at hand, and shows you how to build your own algorithms and implement your own data integration applications. A companion website provides project-based exercises, solutions, and slides, along with links to commercially available software and a Facebook page for reader input during and after publication.
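The core idea of answering one query over independently designed sources can be sketched with a mediated schema; the sources, field names, and mapping functions below are all hypothetical:

```python
# Two independently designed sources answer one query through a shared
# mediated schema (name, department). Everything here is illustrative.

# Source A: a relational-style table with its own column names.
source_a = [
    {"emp_name": "Alice", "dept": "Sales"},
    {"emp_name": "Bob", "dept": "Engineering"},
]

# Source B: a differently structured feed, e.g. extracted from the web.
source_b = [
    {"person": "Carol", "unit": "Engineering"},
]

# Schema mappings translate each source into the mediated schema;
# queries are then posed once, over that schema.
def from_a(row):
    return {"name": row["emp_name"], "department": row["dept"]}

def from_b(row):
    return {"name": row["person"], "department": row["unit"]}

def query_department(dept):
    """Answer one query spanning both sources via the mediated schema."""
    unified = [from_a(r) for r in source_a] + [from_b(r) for r in source_b]
    return sorted(r["name"] for r in unified if r["department"] == dept)

print(query_department("Engineering"))  # ['Bob', 'Carol']
```

Real systems push the query down to the sources instead of materializing everything, but the mapping-through-a-mediated-schema shape is the same.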
Statistical Learning from a Regression Perspective (Springer Series in Statistics)
Author: Richard A. Berk. Springer, 2008-07
Statistical Learning from a Regression Perspective considers statistical learning applications when interest centers on the conditional distribution of the response variable, given a set of predictors, and when it is important to characterize how the predictors are related to the response. As a first approximation, this can be seen as an extension of nonparametric regression. Among the statistical learning procedures examined are bagging, random forests, boosting, and support vector machines. Response variables may be quantitative or categorical. Real applications are emphasized, especially those with practical implications. One important theme is the need to explicitly take into account asymmetric costs in the fitting process. For example, in some situations false positives may be far less costly than false negatives. Another important theme is to not automatically cede modeling decisions to a fitting algorithm. In many settings, subject-matter knowledge should trump formal fitting criteria. Yet another important theme is to appreciate the limitations of one's data and not apply statistical learning procedures that require more than the data can provide. The material is written for graduate students in the social and life sciences and for researchers who want to apply statistical learning procedures to scientific and policy problems. Intuitive explanations and visual representations are prominent. All of the analyses included are done in R.
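The asymmetric-cost theme can be illustrated with the standard expected-cost argument for choosing a classification threshold (a general result, not a procedure specific to this book; the cost numbers are hypothetical):

```python
# With false-positive cost c_fp and false-negative cost c_fn, predicting
# "positive" for a case with predicted probability p is cheaper whenever
#   p * c_fn > (1 - p) * c_fp,   i.e.   p > c_fp / (c_fp + c_fn).
# So asymmetric costs shift the decision threshold away from 0.5.

def cost_threshold(c_fp, c_fn):
    """Probability above which predicting positive minimizes expected cost."""
    return c_fp / (c_fp + c_fn)

# Symmetric costs recover the familiar 0.5 cutoff.
print(cost_threshold(1.0, 1.0))   # 0.5
# False negatives 9x as costly: flag anything with p > 0.1.
print(cost_threshold(1.0, 9.0))   # 0.1
```

Berk's point is that this kind of cost reasoning should enter the fitting process itself (e.g. via case weights), not just be bolted on after a symmetric-loss model is fit.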
Causal Models
Author: Steven Sloman. Oxford Univ Pr, 2005-07
Human beings are active agents who can think. To understand how thought serves action requires understanding how people conceive of the relation between cause and effect, that is, between action and outcome. In cognitive terms, the question becomes one of how people construct and reason with the causal models we use to represent our world. A revolution is occurring in how statisticians, philosophers, and computer scientists answer this question. These fields have ushered in new insights about causal models by thinking about how to represent causal structure mathematically, in a framework that uses graphs and probability theory to develop what are called 'causal Bayesian networks'. The framework starts with the idea that the purpose of causal structure is to understand and predict the effects of intervention: how does intervening on one thing affect other things? This question is not merely about probability (or logic), but about action. The framework offers a new understanding of mind: thought is about the effects of intervention, so cognition is thereby intimately tied to actions that take place either in the actual physical world or in imagination, in counterfactual worlds. In this book, Steven Sloman offers a conceptual introduction to the key mathematical ideas in the framework, presenting them in a non-technical way, by focusing on the intuitions rather than the theorems. He tries to show why the ideas are important to understanding how people explain things, and why it is so central to human action to think not only about the world as it is, but also about the world as it could be. Sloman also reviews the role of causality, causal models, and intervention in the basic human cognitive functions: decision making, reasoning, judgement, categorization, inductive inference, language, and learning. In short, this book offers a discussion about how people think, talk, learn, and explain things in causal terms - in terms of action and manipulation.
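The seeing/doing distinction behind causal Bayesian networks can be shown with a three-variable toy network; all probabilities below are hypothetical:

```python
# Z confounds X and Y; X has no causal effect on Y at all. Conditioning
# on X=1 ("seeing") still raises P(Y=1) through the confounder, while
# intervening with do(X=1) ("doing") does not, because the intervention
# cuts the Z -> X edge.

p_z = {1: 0.5, 0: 0.5}
p_x_given_z = {1: 0.9, 0: 0.1}   # P(X=1 | Z=z)
p_y_given_z = {1: 0.8, 0: 0.2}   # P(Y=1 | Z=z); note: Y ignores X

def joint(z, x, y):
    """Joint probability under the network Z -> X, Z -> Y."""
    pz = p_z[z]
    px = p_x_given_z[z] if x == 1 else 1 - p_x_given_z[z]
    py = p_y_given_z[z] if y == 1 else 1 - p_y_given_z[z]
    return pz * px * py

def p_y_given_x(x):
    """Observational P(Y=1 | X=x): condition on seeing X=x."""
    num = sum(joint(z, x, 1) for z in (0, 1))
    den = sum(joint(z, x, y) for z in (0, 1) for y in (0, 1))
    return num / den

def p_y_do_x(x):
    """Interventional P(Y=1 | do(X=x)): sever Z -> X and average
    P(Y=1 | Z=z) over the prior on Z. x is irrelevant here because
    X has no causal arrow into Y."""
    return sum(p_z[z] * p_y_given_z[z] for z in (0, 1))

print(f"P(Y=1 | X=1)     = {p_y_given_x(1):.2f}")   # 0.74: seeing
print(f"P(Y=1 | do(X=1)) = {p_y_do_x(1):.2f}")      # 0.50: doing
```

The gap between the two numbers is exactly the point of the intervention framework: observation and action answer different questions.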
The Master Algorithm
Author: Pedro Domingos. Basic Books, 2015-09
A thought-provoking and wide-ranging exploration of machine learning and the race to build computer intelligences as flexible as our own
In the world's top research labs and universities, the race is on to invent the ultimate learning algorithm: one capable of discovering any knowledge from data, and doing anything we want, before we even ask. In The Master Algorithm, Pedro Domingos lifts the veil to give us a peek inside the learning machines that power Google, Amazon, and your smartphone. He assembles a blueprint for the future universal learner--the Master Algorithm--and discusses what it will mean for business, science, and society. If data-ism is today's philosophy, this book is its bible.
The Nature of Statistical Learning Theory
Author: Vladimir Vapnik. Springer, 1999-11
The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. This second edition contains three new chapters devoted to further development of the learning theory and SVM techniques. Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists.
Too Big to Know
Author: David Weinberger. Basic Books, 2014-01
With the advent of the Internet and the limitless information it contains, we're less sure about what we know, who knows what, or even what it means to know at all. And yet, human knowledge has recently grown in previously unimaginable ways and in inconceivable directions. In Too Big to Know, David Weinberger explains that, rather than a systemic collapse, the Internet era represents a fundamental change in the methods we have for understanding the world around us. With examples from history, politics, business, philosophy, and science, Too Big to Know describes how the very foundations of knowledge have been overturned, and what this revolution means for our future.
Introduction to Linear Optimization
Author: Dimitris Bertsimas / John N. Tsitsiklis. Athena Scientific, 1997-02
This book provides a unified, insightful, and modern treatment of linear optimization, that is, linear programming, network flow problems, and discrete optimization. It includes classical topics as well as the state of the art, in both theory and practice.
Theory and Reality
Author: Peter Godfrey-Smith. University of Chicago Press, 2003-08
What makes science different from other ways of investigating the world? In Theory and Reality, Peter Godfrey-Smith uses debates--such as the problem of confirmation, the new riddle of induction, and the problem of scientific realism--as a way to introduce, in a completely accessible way, the main themes in the philosophy of science. Intended for undergraduates and general readers with no prior background in philosophy, Theory and Reality starts by surveying the last hundred years of work in the field. It covers logical positivism; induction and confirmation; Karl Popper's theory of science; Thomas Kuhn and "scientific revolutions"; the radical views of Imre Lakatos, Larry Laudan, and Paul Feyerabend; and challenges to the field from sociology of science, feminism, and science studies. The book then looks in detail at some of the broader philosophical issues at stake, such as philosophical naturalism, scientific realism, theories of explanation in science, Bayesianism, and other modern theories of evidence. Finally, Godfrey-Smith presents his own proposal for approaching the philosophy of science. Throughout the text he points out connections between philosophical debates and wider discussions about science in recent decades, such as the infamous "science wars." Examples and asides engage the beginning student, a glossary of terms explains key concepts, and suggestions for further reading are included at the end of each chapter. Like no other text in this field, Theory and Reality combines a survey of recent history of the philosophy of science with current key debates in language that any beginning scholar or critical reader can follow.