2017
The Innovators 豆瓣 Goodreads
Author: [US] Walter Isaacson Simon & Schuster 2014 - 10
Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson’s revealing story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and an indispensable guide to how innovation really happens.
What were the talents that allowed certain inventors and entrepreneurs to turn their visionary ideas into disruptive realities? What led to their creative leaps? Why did some succeed and others fail?
In his masterly saga, Isaacson begins with Ada Lovelace, Lord Byron’s daughter, who pioneered computer programming in the 1840s. He explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page.
This is the story of how their minds worked and what made them so inventive. It’s also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative.
For an era that seeks to foster innovation, creativity, and teamwork, The Innovators shows how they happen.
Global Electrification 豆瓣
Author: William J. Hausman / Peter Hertner / Mira Wilkins Cambridge University Press 2008 - 4
This 2008 book examines how multinational enterprises and international finance influenced the course of electrification around the world. Multinational enterprises played a crucial role in the spread of electric light and power from the 1870s through the first three decades of the twentieth century. However, their role did not persist, and by 1978 multinational enterprises in this sector had all but disappeared, replaced by electrical utility providers with national business structures. Yet, in recent years, there has been a vigorous revival. This book, a co-operative effort by the three authors and a group of experts from many countries, offers an analysis of the history of multinational enterprise. The authors take an integrated approach, not simply comparing national electrification experiences, but supplying a truly global account.
Getting Started With Conjoint Analysis 豆瓣
Author: Bryan K. Orme Research Pub Llc
This 164-page book assembles and updates introductory white papers that have been available on our website. There are substantial new sections, including a 50-page glossary of terms, and two new chapters.
Paul Green (Professor Emeritus of Marketing, University of Pennsylvania), the "father of conjoint analysis," wrote the foreword for Getting Started with Conjoint Analysis. He writes: "Getting Started with Conjoint Analysis is a practical no-nonsense guide to what happens when one designs, executes, and analyzes data from real marketplace problems. It should appeal to academics and consultant-practitioners alike. The book is easy to follow, while at the same time being almost encyclopedic in its coverage of topics ranging from study design to the presentation of results to clients."
Making Things Happen 豆瓣
Author: James Woodward Oxford University Press 2005 - 10
In Making Things Happen, James Woodward develops a new and ambitious comprehensive theory of causation and explanation that draws on literature from a variety of disciplines and which applies to a wide variety of claims in science and everyday life. His theory is a manipulationist account, proposing that causal and explanatory relationships are relationships that are potentially exploitable for purposes of manipulation and control. This account has its roots in the commonsense idea that causes are means for bringing about effects; but it also draws on a long tradition of work in experimental design, econometrics, and statistics.
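One common way to make the manipulationist idea precise (a gloss in interventionist notation, not a formula taken from Woodward's book, which develops its own intervention variables) is that X is a cause of Y just in case some possible intervention that changes the value of X would change Y, or its probability distribution:

$$ \exists\, x \neq x' : \; P\big(Y \mid do(X = x)\big) \;\neq\; P\big(Y \mid do(X = x')\big) $$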
Woodward shows how these ideas may be generalized to other areas of science from the social scientific and biomedical contexts for which they were originally designed. He also provides philosophical foundations for the manipulationist approach, drawing out its implications, comparing it with alternative approaches, and defending it from common criticisms. In doing so, he shows how the manipulationist account both illuminates important features of successful causal explanation in the natural and social sciences, and avoids the counterexamples and difficulties that infect alternative approaches, from the deductive-nomological model onwards.
Making Things Happen will interest philosophers working in the philosophy of science, the philosophy of social science, and metaphysics, as well as anyone interested in causation, explanation, and scientific methodology.
Fundamentals of Computer Graphics, Fourth Edition 豆瓣
Author: Steve Marschner / Peter Shirley A K Peters/CRC Press 2015
Drawing on an impressive roster of experts in the field, Fundamentals of Computer Graphics, Fourth Edition offers an ideal resource for computer graphics course curricula as well as a user-friendly personal or professional reference.
Focusing on geometric intuition, the book gives the necessary information for understanding how images get onto the screen by using the complementary approaches of ray tracing and rasterization. It covers topics common to an introductory course, such as sampling theory, texture mapping, spatial data structures, and splines. It also includes a number of contributed chapters from authors known for their expertise and clear way of explaining concepts.
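To make the ray tracing side of that pairing concrete, here is a tiny sketch (an illustration in the spirit of the book's coverage, not code from it): a ray-sphere intersection test, which solves the quadratic |o + t d − c|² = r² for the ray parameter t.

```python
# Minimal ray-sphere intersection sketch (illustrative only).
import numpy as np

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None if the ray misses."""
    oc = origin - center
    a = np.dot(direction, direction)
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                      # no real roots: the ray misses the sphere
    t = (-b - np.sqrt(disc)) / (2 * a)   # nearer of the two intersection points
    return t if t > 0 else None

print(ray_sphere(np.array([0.0, 0.0, 0.0]),
                 np.array([0.0, 0.0, -1.0]),
                 np.array([0.0, 0.0, -5.0]), 1.0))   # -> 4.0
```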
Highlights of the Fourth Edition Include:
Updated coverage of existing topics
Major updates and improvements to several chapters, including texture mapping, graphics hardware, signal processing, and data structures
The text is now printed entirely in four color to enhance the illustrative figures
The fourth edition of Fundamentals of Computer Graphics continues to provide an outstanding and comprehensive introduction to basic computer graphics technology and theory. It retains an informal and intuitive style while improving precision, consistency, and completeness of material, allowing aspiring and experienced graphics programmers to better understand and apply foundational principles to the development of efficient code in creating film, game, or web designs.
Cointegration, Causality, and Forecasting 豆瓣
Author: Robert F. Engle / Halbert White Oxford University Press 1999
The book is a collection of essays in honour of Clive Granger. The chapters are by some of the world's leading econometricians, all of whom have collaborated with or studied with (or both) Clive Granger. Central themes of Granger's work are reflected in the book, with attention to tests for unit roots and cointegration, tests of misspecification, forecasting models and forecast evaluation, non-linear and non-parametric econometric techniques, and, overall, a careful blend of practical empirical work and strong theory. The book shows the scope of Granger's research and the range of the profession that has been influenced by his work.
Statistical Models and Causal Inference 豆瓣
Author: David A. Freedman Cambridge University Press 2009 - 11
David A. Freedman presents here a definitive synthesis of his approach to causal inference in the social sciences. He explores the foundations and limitations of statistical modeling, illustrating basic arguments with examples from political science, public policy, law, and epidemiology. Freedman maintains that many new technical approaches to statistical modeling constitute not progress, but regress. Instead, he advocates a 'shoe leather' methodology, which exploits natural variation to mitigate confounding and relies on intimate knowledge of the subject matter to develop meticulous research designs and eliminate rival explanations. When Freedman first enunciated this position, he was met with scepticism, in part because it was hard to believe that a mathematical statistician of his stature would favor 'low-tech' approaches. But the tide is turning. Many social scientists now agree that statistical technique cannot substitute for good research design and subject matter knowledge. This book offers an integrated presentation of Freedman's views.
Matched Sampling for Causal Effects 豆瓣
Author: Donald B. Rubin Cambridge University Press 2006 - 9
Matched sampling is often used to help assess the causal effect of some exposure or intervention, typically when randomized experiments are not available or cannot be conducted. This book presents a selection of Donald B. Rubin's research articles on matched sampling, from the early 1970s, when the author was one of the major researchers involved in establishing the field, to recent contributions to this now extremely active area. The articles include fundamental theoretical studies that have become classics, important extensions, and real applications that range from breast cancer treatments to tobacco litigation to studies of criminal tendencies. They are organized into seven parts, each with an introduction by the author that provides historical and personal context and discusses the relevance of the work today. A concluding essay offers advice to investigators designing observational studies. The book provides an accessible introduction to the study of matched sampling and will be an indispensable reference for students and researchers.
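As a concrete illustration of the kind of design the book studies (a minimal sketch, not code from the book), the snippet below pairs each treated unit with the control whose estimated propensity score is closest, so the matched comparison is less confounded by the observed covariates than the naive difference in means:

```python
# Minimal 1:1 nearest-neighbor propensity-score matching sketch (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                          # observed covariates
p_treat = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))
t = rng.binomial(1, p_treat)                         # non-random exposure
y = 2.0 * t + X[:, 0] + rng.normal(size=n)           # true effect = 2.0

# Estimate propensity scores, then match each treated unit to its nearest control.
ps = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]
treated, controls = np.flatnonzero(t == 1), np.flatnonzero(t == 0)
matches = controls[np.abs(ps[treated][:, None] - ps[controls][None, :]).argmin(axis=1)]

naive = y[t == 1].mean() - y[t == 0].mean()
matched = (y[treated] - y[matches]).mean()
print(f"naive difference: {naive:.2f}, matched estimate: {matched:.2f}")
```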
The Foundations of Causal Decision Theory 豆瓣
Author: James M. Joyce Cambridge University Press 1999
This book defends the view that any adequate account of rational decision making must take a decision maker's beliefs about causal relations into account. The early chapters of the book introduce the nonspecialist to the rudiments of expected utility theory. The major technical advance offered by the book is a "representation theorem" showing that causal decision theory and its main rival, Richard Jeffrey's logic of decision, are both instances of a more general conditional decision theory. In providing the most complete and robust defense of causal decision theory, the book will be of interest to a broad range of readers in philosophy, economics, psychology, mathematics, and artificial intelligence.
Targeted Learning 豆瓣
Author: Mark J. van der Laan / Sherri Rose Springer 2011 - 6
The statistics profession is at a unique point in history. The need for valid statistical tools is greater than ever; data sets are massive, often containing hundreds of thousands of measurements for a single subject. The field is ready to move towards clear objective benchmarks under which tools can be evaluated. Targeted learning allows (1) the full generalization and utilization of cross-validation as an estimator selection tool so that the subjective choices made by humans are now made by the machine, and (2) targeting the fitting of the probability distribution of the data toward the target parameter representing the scientific question of interest. This book is aimed at both statisticians and applied researchers interested in causal inference and general effect estimation for observational and experimental data. Part I is an accessible introduction to super learning and the targeted maximum likelihood estimator, including related concepts necessary to understand and apply these methods. Parts II-IX handle complex data structures and topics applied researchers will immediately recognize from their own research, including time-to-event outcomes, direct and indirect effects, positivity violations, case-control studies, censored data, longitudinal data, and genomic studies.
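A minimal sketch of point (1), using off-the-shelf scikit-learn rather than the authors' own software: cross-validated risk serves as the objective benchmark, and the candidate estimator with the lowest cross-validated risk is selected by the machine rather than by a subjective human choice.

```python
# Minimal cross-validated estimator selection sketch (a simplified "discrete
# super learner" in spirit; illustrative only, not the authors' implementation).
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

candidates = {
    "linear": LinearRegression(),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "knn": KNeighborsRegressor(n_neighbors=10),
}

# Cross-validated mean squared error is the benchmark; the learner with the
# lowest CV risk is chosen automatically.
cv_risk = {
    name: -cross_val_score(est, X, y, cv=5,
                           scoring="neg_mean_squared_error").mean()
    for name, est in candidates.items()
}
best = min(cv_risk, key=cv_risk.get)
print(cv_risk, "selected:", best)
```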
Programming Language Pragmatics, Third Edition 豆瓣
Author: Michael L. Scott Morgan Kaufmann 2009 - 4
Programming Language Pragmatics is the most comprehensive programming language textbook available today. Taking the perspective that language design and language implementation are tightly interconnected, and that neither can be fully understood in isolation, this critically acclaimed and bestselling book has been thoroughly updated to cover the most recent developments in programming language design. With a new chapter on run-time program management and expanded coverage of concurrency, this new edition provides both students and professionals alike with a solid understanding of the most important issues driving software development today.

Classic programming foundations text now updated to familiarize students with the languages they are most likely to encounter in the workforce, including Java 7, C++, C# 3.0, F#, Fortran 2008, Ada 2005, Scheme R6RS, and Perl 6.
New and expanded coverage of concurrency and run-time systems ensures students and professionals understand the most important advances driving software today.
Includes over 800 numbered examples to help the reader quickly cross-reference and access content.
The Algebraic Mind: Integrating Connectionism and Cognitive Science Goodreads 豆瓣
Author: Gary F. Marcus The MIT Press 2003 - 3 Other title: The Algebraic Mind
In The Algebraic Mind, Gary Marcus attempts to integrate two theories about how the mind works, one that says that the mind is a computer-like manipulator of symbols, and another that says that the mind is a large network of neurons working together in parallel. Resisting the conventional wisdom that says that if the mind is a large neural network it cannot simultaneously be a manipulator of symbols, Marcus outlines a variety of ways in which neural systems could be organized so as to manipulate symbols, and he shows why such systems are more likely to provide an adequate substrate for language and cognition than neural systems that are inconsistent with the manipulation of symbols. Concluding with a discussion of how a neurally realized system of symbol-manipulation could have evolved and how such a system could unfold developmentally within the womb, Marcus helps to set the future agenda of cognitive neuroscience.
Semi-Supervised Learning 豆瓣
Author: Olivier Chapelle MIT Press 2006 - 9
In the field of machine learning, semi-supervised learning (SSL) occupies the middle ground between supervised learning (in which all training examples are labeled) and unsupervised learning (in which no labeled data are given). Interest in SSL has increased in recent years, particularly because of application domains in which unlabeled data are plentiful, such as images, text, and bioinformatics. This first comprehensive overview of SSL presents state-of-the-art algorithms, a taxonomy of the field, selected applications, benchmark experiments, and perspectives on ongoing and future research.
Semi-Supervised Learning first presents the key assumptions and ideas underlying the field: smoothness, cluster or low-density separation, manifold structure, and transduction. The core of the book is the presentation of SSL methods, organized according to algorithmic strategies. After an examination of generative models, the book describes algorithms that implement the low-density separation assumption, graph-based methods, and algorithms that perform two-step learning. The book then discusses SSL applications and offers guidelines for SSL practitioners by analyzing the results of extensive benchmark experiments. Finally, the book looks at interesting directions for SSL research. The book closes with a discussion of the relationship between semi-supervised learning and transduction.
Olivier Chapelle and Alexander Zien are Research Scientists and Bernhard Schölkopf is Professor and Director at the Max Planck Institute for Biological Cybernetics in Tübingen. Schölkopf is coauthor of Learning with Kernels (MIT Press, 2002) and is a coeditor of Advances in Kernel Methods: Support Vector Learning (1998), Advances in Large-Margin Classifiers (2000), and Kernel Methods in Computational Biology (2004), all published by The MIT Press.
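As an illustration of one classic SSL strategy (self-training, offered here as a minimal sketch under the smoothness/cluster assumptions described above, and not drawn from the book's code), the snippet below fits a classifier on a small labeled set and iteratively absorbs high-confidence pseudo-labels from the unlabeled pool:

```python
# Minimal self-training sketch (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1000, n_informative=5, random_state=0)
labeled = np.zeros(len(y), dtype=bool)
labeled[:50] = True                      # only 50 examples start out labeled
y_work = y.copy()                        # working labels; unlabeled entries unused

clf = LogisticRegression(max_iter=1000)
for _ in range(10):
    clf.fit(X[labeled], y_work[labeled])
    proba = clf.predict_proba(X[~labeled])
    confident = proba.max(axis=1) > 0.95          # high-confidence pseudo-labels
    if not confident.any():
        break
    idx = np.flatnonzero(~labeled)[confident]
    y_work[idx] = proba[confident].argmax(axis=1)  # absorb pseudo-labels
    labeled[idx] = True

print("accuracy on all data (true labels, for illustration):", clf.score(X, y))
```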