CS
Words, Thoughts, and Theories 豆瓣
Authors: Alison Gopnik / Andrew N. Meltzoff The MIT Press 1998 - 7
Words, Thoughts, and Theories articulates and defends the "theory theory" of cognitive and semantic development: the idea that infants and young children, like scientists, learn about the world by forming and revising theories, a view of the origins of knowledge and meaning that has broad implications for cognitive science. Gopnik and Meltzoff interweave philosophical arguments and empirical data from their own and others' research. Both the philosophy and the psychology, the arguments and the data, address the same fundamental epistemological question: How do we come to understand the world around us? Recently, the theory theory has generated much interesting research, but this is the first book to examine the theory in extensive detail and to systematically contrast it with other theories. It is also the first to apply the theory to infancy and early childhood, to use it as a framework for understanding semantic development, and to demonstrate that language acquisition influences theory change in children. The authors show that children just beginning to talk are engaged in profound restructurings of several domains of knowledge. These restructurings are similar to theory changes in science, and they influence children's early semantic development, since children's cognitive concerns shape and motivate their use of very early words. In addition, children pay attention to the language they hear around them, and this too reshapes their cognition, causing them to reorganize their theories.
Parallel Distributed Processing, Vol. 1 豆瓣
Authors: David E. Rumelhart / James L. McClelland A Bradford Book 1987 - 7
What makes people smarter than computers? These volumes by a pioneering neurocomputing group suggest that the answer lies in the massively parallel architecture of the human mind. They describe a new theory of cognition called connectionism that is challenging the idea of symbolic computation that has traditionally been at the center of debate in theoretical discussions about the mind. The authors' theory assumes the mind is composed of a great number of elementary units connected in a neural network. Mental processes are interactions between these units which excite and inhibit each other in parallel rather than sequential operations. In this context, knowledge can no longer be thought of as stored in localized structures; instead, it consists of the connections between pairs of units that are distributed throughout the network. Volume 1 lays the foundations of this exciting theory of parallel distributed processing, while Volume 2 applies it to a number of specific issues in cognitive science and neuroscience, with chapters describing models of aspects of perception, memory, language, and thought.
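The core mechanism the blurb describes can be sketched in a few lines (a toy illustration, not code from the book): units excite or inhibit each other through weighted connections, all updating in parallel, with "knowledge" residing only in the weights.

```python
import math

def step(activations, weights):
    """One parallel update: every unit sums its weighted inputs at once."""
    n = len(activations)
    net = [sum(weights[i][j] * activations[j] for j in range(n)) for i in range(n)]
    # Squash into (0, 1); positive weights excite, negative weights inhibit.
    return [1.0 / (1.0 + math.exp(-x)) for x in net]

# Three units: unit 0 excites unit 1, unit 2 inhibits unit 1.
weights = [
    [0.0, 0.0, 0.0],
    [2.0, 0.0, -2.0],
    [0.0, 0.0, 0.0],
]
acts = step([1.0, 0.0, 1.0], weights)
```

When the inhibitory unit is silent, unit 1's activation rises; when it is active, the excitation is cancelled — the behaviour comes from the connection pattern, not from any stored symbol.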
Parallel Distributed Processing, Vol. 2 豆瓣
Authors: James L. McClelland / David E. Rumelhart The MIT Press 1987 - 7
What makes people smarter than computers? These volumes by a pioneering neurocomputing group suggest that the answer lies in the massively parallel architecture of the human mind. They describe a new theory of cognition called connectionism that is challenging the idea of symbolic computation that has traditionally been at the center of debate in theoretical discussions about the mind. The authors' theory assumes the mind is composed of a great number of elementary units connected in a neural network. Mental processes are interactions between these units which excite and inhibit each other in parallel rather than sequential operations. In this context, knowledge can no longer be thought of as stored in localized structures; instead, it consists of the connections between pairs of units that are distributed throughout the network. Volume 1 lays the foundations of this exciting theory of parallel distributed processing, while Volume 2 applies it to a number of specific issues in cognitive science and neuroscience, with chapters describing models of aspects of perception, memory, language, and thought.
Categorization and Naming in Children 豆瓣
Author: Ellen M. Markman A Bradford Book 1991 - 5
In this landmark work on early conceptual and lexical development, Ellen Markman explores the fascinating problem of how young children succeed at the task of inducing concepts. Backed by extensive experimental results, she challenges the fundamental assumptions of traditional theories of language acquisition and proposes that a set of constraints or principles of induction allows children to efficiently integrate knowledge and to induce information about new examples of familiar categories. Ellen M. Markman is Professor of Psychology at Stanford University.
Big Data Baseball 豆瓣
Author: Travis Sawchik Flatiron Books 2015 - 5
After twenty consecutive losing seasons for the Pittsburgh Pirates, team morale was low, the club's payroll ranked near the bottom of the sport, game attendance was down, and the city was becoming increasingly disenchanted with its team. Pittsburghers joked their town was the city of champions…and the Pirates. Big Data Baseball is the story of how the 2013 Pirates, mired in the longest losing streak in North American pro sports history, adopted drastic big-data strategies to end the drought, make the playoffs, and turn around the franchise's fortunes.
Award-winning journalist Travis Sawchik takes you behind the scenes to expertly weave together the stories of the key figures who changed the way the small-market Pirates played the game. To save their jobs, manager Clint Hurdle and the front office could not rely on a free-agent spending spree; instead, they had to improve the sum of their parts and find hidden value. They had to change. From Hurdle shedding his old-school ways to work closely with Neal Huntington, the forward-thinking, data-driven GM, and his team of talented analysts; to pitchers like A. J. Burnett and Gerrit Cole changing what and where they threw; to Russell Martin, the undervalued catcher whose expert use of the nearly invisible skill of pitch framing helped the team's pitchers turn more balls into strikes; to Clint Barmes, a solid shortstop and one of the early adopters of the unconventional on-field shift, which forced the entire infield to realign into positions they had never stood in before. Under Hurdle's leadership, a culture of collaboration and creativity flourished as he successfully blended whiz-kid analysts with graybeard coaches, a kind of symbiotic teamwork that was unique to the sport.
Big Data Baseball is Moneyball on steroids: an entertaining and enlightening underdog story that uses the 2013 Pirates season as the perfect lens to examine the sport's burgeoning big-data movement. With the help of data-tracking systems like PITCHf/x and TrackMan, the Pirates collected millions of data points on every pitch and ball in play to create a tome of color-coded reports that revealed groundbreaking insights into how to win more games without spending a dime. In the process, they discovered that most batters struggled to hit two-seam fastballs, that an aggressive defensive shift on the field could turn more batted balls into outs, and that a catcher's most valuable skill was hidden. All these data points, which aren't immediately visible to players and spectators, are the bit of magic that let the Pirates spin straw into gold, finish the 2013 season in second place, and end a twenty-year losing streak.
The Innocent Eye 豆瓣
Author: Nico Orlandi Oxford University Press 2014 - 8
Why does the world look to us as it does? Generally speaking, this question has received two types of answers in the cognitive sciences in the past fifty or so years. According to the first, the world looks to us the way it does because we construct it to look as it does. According to the second, the world looks as it does primarily because of how the world is. In The Innocent Eye, Nico Orlandi defends a position that aligns with this second, world-centered tradition, but that also respects some of the insights of constructivism. Orlandi develops an embedded understanding of visual processing according to which, while visual percepts are representational states, the states and structures that precede the production of percepts are not representations.
If we study the environmental contingencies in which vision occurs, and we properly distinguish functional states and features of the visual apparatus from representational states and features, we obtain an empirically more plausible, world-centered account. Orlandi shows that this account accords well with models of vision in perceptual psychology -- such as Natural Scene Statistics and Bayesian approaches to perception -- and outlines some of the ways in which it differs from recent 'enactive' approaches to vision. The main difference is that, although the embedded account recognizes the importance of movement for perception, it does not appeal to action to uncover the richness of visual stimulation.
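The Bayesian approach to perception mentioned above can be illustrated with a toy calculation (invented numbers, not an example from the book): the percept corresponds to the hypothesis that best explains the stimulus once prior expectations, such as the classic light-from-above assumption, are factored in.

```python
def posterior(prior, likelihood):
    """Combine prior and likelihood over hypotheses, then normalize."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Ambiguous shading is equally consistent with a bump or a dent ...
likelihood = {"bump": 0.5, "dent": 0.5}
# ... but a light-from-above prior tips the percept toward "bump".
prior = {"bump": 0.7, "dent": 0.3}
post = posterior(prior, likelihood)
```

Because the likelihood is flat, the posterior simply inherits the prior: the stimulus alone does not decide what is seen.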
The upshot is that constructive models of vision ascribe mental representations too liberally, ultimately misunderstanding the notion. Orlandi offers a proposal for what mental representations are that, following insights from Brentano, James and a number of contemporary cognitive scientists, appeals to the notions of de-coupleability and absence to distinguish representations from mere tracking states.
Theoretical Neuroscience 豆瓣
Authors: Peter Dayan / Laurence F. Abbott The MIT Press 2005 - 9
Theoretical neuroscience provides a quantitative basis for describing what nervous systems do, determining how they function, and uncovering the general principles by which they operate. This text introduces the basic mathematical and computational methods of theoretical neuroscience and presents applications in a variety of areas including vision, sensory-motor integration, development, learning, and memory. The book is divided into three parts. Part I discusses the relationship between sensory stimuli and neural responses, focusing on the representation of information by the spiking activity of neurons. Part II discusses the modeling of neurons and neural circuits on the basis of cellular and synaptic biophysics. Part III analyzes the role of plasticity in development and learning. An appendix covers the mathematical methods used, and exercises are available on the book's Web site.
Biophysics of Computation 豆瓣
Author: Christof Koch Oxford University Press, USA 2004 - 10
Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes.
Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation.
Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.
Spiking Neuron Models 豆瓣
Author: Wulfram Gerstner Cambridge University Press 2002 - 8
Neurons in the brain communicate by short electrical pulses, the so-called action potentials or spikes. How can we understand the process of spike generation? How can we understand information transmission by neurons? What happens if thousands of neurons are coupled together in a seemingly random network? How does the network connectivity determine the activity patterns? And, vice versa, how does the spike activity influence the connectivity pattern? These questions are addressed in this 2002 introduction to spiking neurons aimed at those taking courses in computational neuroscience, theoretical biology, biophysics, or neural networks. The approach will suit students of physics, mathematics, or computer science; it will also be useful for biologists who are interested in mathematical modelling. The text is enhanced by many worked examples and illustrations. There are no mathematical prerequisites beyond what the audience would meet as undergraduates: more advanced techniques are introduced in an elementary, concrete fashion when needed.
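The simplest model of the spike generation asked about above is the leaky integrate-and-fire unit: the membrane voltage leaks toward rest, and a spike is emitted (and the voltage reset) whenever it crosses threshold. A hedged sketch with invented parameters, not code from the book:

```python
def lif_spike_times(current, tau_ms=10.0, threshold=1.0, dt_ms=0.1):
    """Leaky integrate-and-fire: return spike times (ms) for an input train."""
    v = 0.0
    spikes = []
    for step, i_in in enumerate(current):
        v += dt_ms / tau_ms * (-v + i_in)  # leak toward 0, driven by input
        if v >= threshold:
            spikes.append(step * dt_ms)    # record the spike time
            v = 0.0                        # reset after the spike
    return spikes

# Constant suprathreshold drive produces a regular spike train.
spikes = lif_spike_times([1.5] * 1000)
```

Constant input yields perfectly regular firing; in the network setting the blurb describes, each unit's input is instead the spike trains of thousands of others, which is where the interesting dynamics come from.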
Eye, Brain, and Vision 豆瓣 Goodreads
Author: David H. Hubel W. H. Freeman 1995 - 5
For over thirty years, Nobel Prize winner David H. Hubel has been at the forefront of research on questions of vision. In Eye, Brain, and Vision, he brings you to the edge of current knowledge about vision, and explores the tasks scientists face in deciphering the many remaining mysteries of vision and the workings of the human brain.
Data Mining, Fourth Edition: Practical Machine Learning Tools and Techniques (Morgan Kaufmann Series in Data Management Systems) 豆瓣
Authors: Ian H. Witten / Eibe Frank Morgan Kaufmann 2016
Data Mining: Practical Machine Learning Tools and Techniques, Fourth Edition, offers a thorough grounding in machine learning concepts, along with practical advice on applying these tools and techniques in real-world data mining situations. This highly anticipated fourth edition of the most acclaimed work on data mining and machine learning teaches readers everything they need to know to get going, from preparing inputs, interpreting outputs, evaluating results, to the algorithmic methods at the heart of successful data mining approaches.
Extensive updates reflect the technical changes and modernizations that have taken place in the field since the last edition, including substantial new chapters on probabilistic methods and on deep learning. Accompanying the book is a new version of the popular WEKA machine learning software from the University of Waikato. Authors Witten, Frank, Hall, and Pal couple today's standard techniques with methods at the leading edge of contemporary research.
- Provides a thorough grounding in machine learning concepts, as well as practical advice on applying the tools and techniques to data mining projects
- Presents concrete tips and techniques for performance improvement that work by transforming the input or output in machine learning methods
- Includes a downloadable WEKA software toolkit, a comprehensive collection of machine learning algorithms for data mining tasks in an easy-to-use interactive interface
- Includes open-access online courses that introduce practical applications of the material in the book
Principles of Data Integration 豆瓣
Authors: AnHai Doan / Alon Halevy / Zachary Ives 2012 - 7
How do you answer queries when your data is stored in multiple databases that were designed independently by different people? This is the first comprehensive book on data integration, written by three of the most respected experts in the field. It provides an extensive introduction to the theory and concepts underlying today's data integration techniques, with detailed instructions for their application and concrete examples throughout. Data integration is the problem of answering queries that span multiple data sources (e.g., databases, web pages). Data integration problems surface in many contexts, including enterprise information integration, query processing on the Web, coordination between government agencies, and collaboration between scientists; in some cases, data integration is the key bottleneck to progress in a field. The authors provide a working knowledge of data integration concepts and techniques, giving you the tools you need to build your own algorithms and implement complete data integration applications, and a range of data integration solutions so you can focus on what is most relevant to the problem at hand. A companion website offers project-based exercises, solutions, slides, and links to commercially available software.
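The core problem the blurb states — answering one query over independently designed sources — can be illustrated with a toy mediator (the schemas and names here are invented for illustration, not drawn from the book):

```python
# Two independently designed sources expose different schemas for the same
# kind of fact; a mediator translates one query to each source's vocabulary
# and unions the results.
source_a = [{"emp": "Ada", "dept": "CS"}]      # schema: emp / dept
source_b = [{"name": "Grace", "unit": "EE"}]   # schema: name / unit

def query_people(department):
    """Mediated query: 'who works in <department>?' across both sources."""
    from_a = [r["emp"] for r in source_a if r["dept"] == department]
    from_b = [r["name"] for r in source_b if r["unit"] == department]
    return from_a + from_b
```

The hard parts the book treats — schema matching, mappings, entity resolution, query reformulation — are exactly what replaces the two hand-written list comprehensions when there are hundreds of sources.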
Principles of Statistical Inference 豆瓣
Author: D. R. Cox Cambridge University Press 2006 - 8
In this definitive book, D. R. Cox gives a comprehensive and balanced appraisal of statistical inference. He develops the key concepts, describing and comparing the main ideas and controversies over foundational issues that have been keenly argued for more than two hundred years. Continuing a sixty-year career of major contributions to statistical thought, Cox is uniquely placed to give this much-needed account of the field. An appendix gives a more personal assessment of the merits of different ideas. The content ranges from the traditional to the contemporary. While specific applications are not treated, the book is strongly motivated by applications across the sciences and associated technologies. The mathematics is kept as elementary as feasible, though previous knowledge of statistics is assumed. The book will be valued by every user or student of statistics who is serious about understanding the uncertainty inherent in conclusions from statistical analyses.
The Master Algorithm 豆瓣
Author: Pedro Domingos Basic Books 2015 - 9
A thought-provoking and wide-ranging exploration of machine learning and the race to build computer intelligences as flexible as our own
In the world's top research labs and universities, the race is on to invent the ultimate learning algorithm: one capable of discovering any knowledge from data, and doing anything we want, before we even ask. In The Master Algorithm, Pedro Domingos lifts the veil to give us a peek inside the learning machines that power Google, Amazon, and your smartphone. He assembles a blueprint for the future universal learner--the Master Algorithm--and discusses what it will mean for business, science, and society. If data-ism is today's philosophy, this book is its bible.
The Nature of Statistical Learning Theory 豆瓣
Author: Vladimir Vapnik Springer 1999 - 11
The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. This second edition contains three new chapters devoted to further development of the learning theory and SVM techniques. Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists.