Artificial Intelligence and Machine Learning
AI is changing the world. Friend or foe?
AI - Simply put, artificial intelligence is a sub-field of computer science. Its goal is to enable the development of computers that are able to do things normally done by people -- in particular, things associated with people acting intelligently. Computer scientist John McCarthy coined the term in the mid-1950s while organizing what is now called the Dartmouth Conference, the 1956 workshop where the core mission of the AI field was defined. If we start with this definition, any program can be considered AI if it does something that we would normally think of as intelligent in humans. How the program does it is not the issue, just that it is able to do it at all. That is, it is AI if it is smart, but it doesn't have to be smart like us.
The most important general-purpose technology of our era is artificial intelligence, particularly machine learning (ML) -- that is, a machine's ability to keep improving its performance without humans having to explain exactly how to accomplish all the tasks it's given. Within just the past few years, machine learning has become far more effective and widely available: we can now build systems that learn how to perform tasks on their own. Why is this such a big deal? Because we humans know more than we can tell: we can't explain exactly how we're able to do a lot of things, from recognizing a face to making a smart move in the ancient strategy game of Go. Prior to ML, this inability to articulate our own knowledge meant that we couldn't automate many tasks. Now we can.
The most important thing to understand about ML is that it represents a fundamentally different approach to creating software: The machine learns from examples, rather than being explicitly programmed for a particular outcome. This is an important break from previous practice. For most of the past 50 years, advances in information technology and its applications have focused on codifying existing knowledge and procedures and embedding them in machines. Indeed, the term “coding” denotes the painstaking process of transferring knowledge from developers’ heads into a form that machines can understand and execute. This approach has a fundamental weakness: Much of the knowledge we all have is tacit, meaning that we can’t fully explain it. It’s nearly impossible for us to write down instructions that would enable another person to learn how to ride a bike or to recognize a friend’s face.
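To make this break concrete, here is a minimal sketch in Python with scikit-learn (our choice for illustration; the toy dataset and its feature meanings are invented). Rather than writing down a pass/fail rule, we hand the machine labeled examples and let it infer the rule on its own:

```python
# Learning from examples instead of explicit programming: a minimal sketch.
# Assumes Python with scikit-learn installed; all data below is invented.
from sklearn.linear_model import LogisticRegression

# Labeled examples: [hours_studied, hours_slept] -> passed the exam (1) or not (0).
X = [[1, 4], [2, 8], [6, 5], [8, 7], [3, 3], [9, 8]]
y = [0, 0, 1, 1, 0, 1]

model = LogisticRegression()
model.fit(X, y)  # the learning step: the model infers a decision rule from examples

# Apply the inferred rule to a case the model has never seen.
print(model.predict([[7, 6]]))  # e.g. [1] -- predicted to pass
```

No developer ever codified the pass/fail rule here; the model derived it from the examples, which is exactly the shift from transferring knowledge out of developers' heads to letting the machine extract it from data.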
Areas of development:
Supervised learning (our favorite!), dimensionality reduction, data analysis, visualization, and information design. (See the dimensionality-reduction sketch below.)
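As one concrete example from the list above, here is a brief sketch of dimensionality reduction using principal component analysis (PCA) with scikit-learn; the random data is a stand-in for any real high-dimensional dataset:

```python
# Dimensionality reduction sketch: project 10-dimensional data down to 2 dimensions
# with PCA. Assumes Python with NumPy and scikit-learn; the data is synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
high_dim = rng.normal(size=(100, 10))  # 100 samples, 10 features each

pca = PCA(n_components=2)              # keep the 2 directions of greatest variance
low_dim = pca.fit_transform(high_dim)

print(low_dim.shape)                   # (100, 2) -- now easy to plot
print(pca.explained_variance_ratio_)   # fraction of variance each component retains
```

Reducing data to two dimensions like this is a common first step toward the visualization and information-design work listed above.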