Anant Agarwal, a professor of engineering and computer science at MIT in Cambridge, Mass., was recently named president of edX, an online digital learning platform that is redefining the educational system.
In this video interview, filmed at the MIT Sloan CIO Symposium in Cambridge, Mass., Agarwal talks with SearchCIO.com Editorial Director Scot Petersen about the digital learning program, a joint venture between MIT and Harvard University that will offer free, university-level courses online to a global audience. Agarwal also discusses how cloud applications and application development are evolving to take advantage of "manycore" processors in parallel environments.
Read a partial transcript of the interview with Agarwal below, and watch the video interview to learn more about how advances in digital learning are changing education.
Scot Petersen: Can you explain what edX is?
Anant Agarwal: EdX is a new online learning initiative of MIT and Harvard. Think of edX as a not-for-profit startup that is looking to reinvent education. The mission of edX is to create an online learning platform that will host courses from universities like MIT and Harvard -- branded under the names MITx and HarvardX, respectively -- and from other universities around the world -- offering this to millions of students worldwide. We also hope to make an impact on our own campus education. So our vision is, we would like to educate billions of students worldwide, and we also want to dramatically change how we do education on campus. Our hope is that by making it very affordable worldwide, we can really make a change in the world.
Read the full transcript from this video below:
Redefining education with digital learning in the cloud
Scot Petersen: We're at the MIT Sloan CIO Symposium, and I'm here today with Anant Agarwal who is Professor of Electrical Engineering and Computer Science at MIT and, for today at least, the director of CSAIL, the Computer Science and Artificial Intelligence Laboratory here at MIT and the future president - correct?
Anant Agarwal: Right.
Scot Petersen: Of edX. Could you explain what edX is?
Anant Agarwal: So, edX is a new online learning initiative of MIT and Harvard. Think of edX as a not-for-profit startup that is looking to reinvent education. So, the mission of edX is to create an online learning platform that will host courses from universities like MIT and Harvard under the brand names MITx and HarvardX, respectively, and also courses from other universities around the world, and offer this to millions of students worldwide, and also make an impact on our own campus education.
So, our vision is that we would like to educate a billion students worldwide. We also want to dramatically change how we do education on campus, and our hope is that, by making it very affordable and available worldwide, we can really cause a disruptive change in the world.
Scot Petersen: Is it going to be its own degree programs through this, or is it going to augment the in-person, on-campus experience?
Anant Agarwal: All of the above possibilities. So, for example, we launched a prototype course as part of MITx and edX in the spring of this year, in February, and we had over 120,000 students registered for the course worldwide, and students are taking this course from all walks of life. We have students who are currently high schoolers and want to do more advanced courses. We have students already in universities, in higher ed, who are looking to take on some challenging courses to augment what they are doing on their campus.
Many students - many of "our learners," I should say, are continuing learners. They have jobs and so on, and they are learning to augment their knowledge and learn about new things.
We also have an octogenarian who is taking the course. He's 81 years old. So, people from all walks of life. Although we're not giving degrees, we are giving a certificate and will also give credentials, if people want to string together a few courses and create some customized learning programs in that manner.
Scot Petersen: Okay. So, when you talk about redefining education, is there an underlying technological basis for this? How are you changing the game as far as some of the work you did at CSAIL playing into here?
Anant Agarwal: Absolutely. So, CSAIL, the Computer Science and Artificial Intelligence Laboratory, is one of the world's leading laboratories in computer science. It does a lot of research in technologies like cloud computing, the Internet, wireless communications, handheld devices, mobility, security, crowdsourcing, machine learning, and the list goes on and on.
Many of these have resulted in dramatic changes and improvements in computing technologies, both at CSAIL and at other laboratories and research labs around the world. Our goal is to harness some of these technologies into being able to offer online learning.
So, as an example, with online learning technology, students taking a course -- like our prototype course -- watch some videos, and they can be anywhere in the world. They can take the course at any time, at any place and at their own pace.
There are assessments and exercises and problem sets and quizzes, which are assessed completely automatically. With 120,000 students, there's no way you're going to manually grade all the exercises. So, all of this is done by the computer. Students can answer with equations and symbolic expressions, yet all of that is automatically checked.
Interestingly enough, the checking technology for expressions and so on originated at CSAIL many decades ago in a program called "Maxima," which was one of the first programs that did symbolic manipulation of mathematical equations and so on. So, a lot of these technologies have been brought to bear to online learning.
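A toy sketch of this kind of answer checking (not the actual edX grader, which works in the symbolic-manipulation tradition Agarwal describes): instead of fully simplifying expressions, compare the student's answer to the reference numerically at random sample points, using only the Python standard library. The function name `answers_match` and the restricted-`eval` approach are illustrative assumptions.

```python
# Numeric spot-check for symbolic equivalence: two expressions in x are
# accepted as equal if they agree (to tolerance) at many random points.
import math, random

def answers_match(student_expr, reference_expr, trials=20):
    # Evaluate both expressions with a restricted environment: no builtins,
    # just a few whitelisted math functions and the variable x.
    env = {"__builtins__": {}, "sin": math.sin, "cos": math.cos, "exp": math.exp}
    for _ in range(trials):
        x = random.uniform(-5, 5)
        a = eval(student_expr, env, {"x": x})
        b = eval(reference_expr, env, {"x": x})
        if not math.isclose(a, b, rel_tol=1e-9, abs_tol=1e-9):
            return False
    return True

print(answers_match("(x + 1)**2", "x**2 + 2*x + 1"))  # -> True
print(answers_match("x + 1", "x + 2"))                # -> False
```

A real grader would do genuine symbolic simplification (so that, e.g., expressions equal only on a measure-zero set are rejected deterministically), but random sampling conveys the idea of accepting any algebraically equivalent form of the answer.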
Another example is how you do things like scalable video. The Internet is now able -- through YouTube and other technologies -- to offer video around the world with very low latency and the required bandwidth and quality for students to learn. We also have an online interactive laboratory -- this was built by my colleagues Chris Terman and Jacob White, who used a lot of their simulation technologies to create an online laboratory experience where students can do laboratories with circuit components in much the same manner as you would build with Lego blocks.
Scot Petersen: So, a lot of the work you did at CSAIL and are doing at MIT involve multicore, many-core processing. Why is that important to cloud computing and other big projects such as big data?
Anant Agarwal: So, in the '90s, we took a single core, a single processor, and put that on a chip. We tried to make that more and more powerful, but the problem is, when you take one thing and make it bigger and bigger and more powerful, you ultimately hit the limits of how well you can do. So, multicore was the idea of putting many cores, many processors, on a single chip. Each processor may not be as powerful as the one processor of the past, but together, they are able to be much more effective.
By doing so, you can make these chips not just much simpler and more modular, but also more power efficient. It's simply more power efficient to have many simpler chips working together on problems than one large processor, and this is critical to cloud computing, because, in cloud computing, the whole thing is about aggregating a large number of servers and computers in a single data center and making that accessible to people worldwide.
So, it's about being able to put lots of compute in a highly dense area. Density matters. Power is extraordinarily important. It turns out that over half the cost -- half the TCO, the total cost of ownership, of a cloud data center -- goes to power and energy. So, multicore is critical to cloud computing, because multicore goes to the heart of the problem: not only can I put a lot more computing on a single chip, making things much more dense, but I can also dramatically reduce the power by going to lots and lots of cores on a chip. So, it is very important for data centers and cloud computing.
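The power argument can be illustrated with a first-order model (a rough sketch with made-up numbers, not a model of any real chip): dynamic CMOS power scales roughly as C·V²·f, and supply voltage can typically be lowered along with clock frequency. Under that assumption, several slow cores deliver the same aggregate throughput as one fast core at a fraction of the power.

```python
# First-order dynamic power model: P ~ C * V^2 * f. We assume (roughly)
# that supply voltage scales linearly with clock frequency.
def dynamic_power(cap, volts, freq):
    return cap * volts**2 * freq

C = 1.0  # switched capacitance, arbitrary units

# One big core at frequency 2.0 (so voltage 2.0 under our scaling assumption).
big = dynamic_power(C, 2.0, 2.0)

# Four small cores at frequency 0.5 each (voltage 0.5): the same total
# throughput (4 * 0.5 = 2.0), if the work parallelizes perfectly.
small = 4 * dynamic_power(C, 0.5, 0.5)

print(big, small)   # -> 8.0 0.5: same aggregate work, 16x less power
```

The 16x figure falls straight out of the cubic dependence on frequency once voltage tracks it; real chips see smaller but still large wins, which is why density and power push data centers toward many simpler cores.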
Scot Petersen: So, what kind of work are you doing to reduce that power consumption?
Anant Agarwal: So, CSAIL has launched a major new project called "organic computing." Of course, we went from single-core to multicore, and that reduced the power, but we have to continue working on it. So, one new effort that my colleagues and I are working on at CSAIL is something called "organic computing," where the idea is you try to make computers look more like humans. You try to take inspiration from biology. You try to make computers adapt themselves to the computation at hand. You try to learn from the computation. By doing so, you try to continually reduce the power that you are consuming while you compute. So, that is a really cool new initiative at CSAIL that is being launched at this time.
Scot Petersen: So, what kind of intelligence is built into the chip, based on the power it needs to do its calculations?
Anant Agarwal: Absolutely. So, one example is running a video encoding application. In a video application, you care about getting 30 frames per second. So, what you can do is build machine learning technology, control systems and so on into both your hardware and your software system, and those systems can get a goal from the application's driver.
What's the goal? Well, my goal in this video application is to get 30 frames per second. At any given point in time, if you find you are missing the goal of 30 frames per second, and you are not blowing through your power budget -- you're not running too hot -- then the intelligent system can say, "Hey, increase the frequency or use more cores." If, on the other hand, you find that you are hitting your goal of 30 frames per second but exceeding your power budget, then the question becomes, "What am I going to do?"
So, the idea there is that you can adaptively choose different algorithms. You can say, "Okay, let me pick a different algorithm that is less of a power hog, but it may reduce image quality slightly." So, the machine learning system, the control system, has to sit there making all these trade-offs: "How do I trade off power against performance, against the quality of the image, and so on?" That's where the intelligence comes in: you make these trade-offs on the fly, as the application is running, while trying to accomplish the user's goals of getting good-quality video and, at the same time, keeping the power as low as possible.
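The feedback loop described here can be sketched roughly as follows. This is a toy illustration: the frame-rate goal, power budget, step sizes, and the two algorithm names are invented for the example, not taken from any real CSAIL system.

```python
# Adaptive controller sketch: adjust clock frequency and algorithm choice
# to hit a frame-rate goal while staying within a power budget.
FPS_GOAL = 30.0       # frames per second the application asks for
POWER_BUDGET = 10.0   # illustrative power ceiling, arbitrary units

def step(freq, algo, fps, power):
    """One control step: return an adjusted (freq, algo) pair given the
    measured frame rate and power draw."""
    if fps < FPS_GOAL and power < POWER_BUDGET:
        # Missing the goal but power headroom remains: run faster.
        return freq + 0.1, algo
    if power > POWER_BUDGET:
        if algo != "low_power":
            # Trade a little image quality for a big power saving.
            return freq, "low_power"
        # Already on the cheap algorithm: slow down as a last resort.
        return freq - 0.1, algo
    # Goal met within budget: hold steady.
    return freq, algo

print(step(1.0, "high_quality", 25.0, 8.0))   # -> (1.1, 'high_quality')
print(step(1.0, "high_quality", 31.0, 12.0))  # -> (1.0, 'low_power')
```

The key design point mirrors the interview: the controller never optimizes a single metric, it arbitrates among frame rate, power, and quality, and swapping algorithms is one of its actuators alongside frequency and core count.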
Scot Petersen: Now, are applications up to speed, literally and figuratively, with this kind of architecture? Are they being developed to use all the cores as efficiently as possible?
Anant Agarwal: So, multicore is ubiquitous -- it is everywhere -- but multicore does require that users think differently. Where in the past you were writing a single application, or what we call a "sequential application," now we have to think about how we take our application and either run multiple applications on multiple cores or break up a single application into multiple pieces.
So, cloud computing has been a natural target for multicore, because, in the cloud, you very naturally have to perform many tasks. So, you can run each task on one of the cores. So, for example, if you have a million people accessing pages or accessing some video on the web, each access can be serviced on a single core. In fact, you can service multiple requests on a single core. So, cloud becomes a natural place for multicore, where you don't have to dramatically change the way you program, because you're running many applications simultaneously.
But in other areas, like the embedded world -- let's take networking, for instance -- the multicore companies that sell chips into the networking space or the video space, such as Tilera and others, have to think about how you take a single application -- a video encoder, for example, or a networking router -- and break it up into pieces so that you can have multiple cores working on the same problem.
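The "one independent request per core" pattern from the cloud example can be sketched with Python's standard library. This is an illustrative stand-in, not anyone's production server; `handle_request` is a hypothetical placeholder for real per-request work.

```python
# Independent requests need no coordination, so a pool of workers can
# service them concurrently -- the easy case for multicore that Agarwal
# contrasts with breaking up a single embedded application.
from concurrent.futures import ThreadPoolExecutor

def handle_request(request_id):
    # Placeholder for real work: serving a page, encoding a video chunk, etc.
    return f"request {request_id}: ok"

with ThreadPoolExecutor(max_workers=4) as pool:   # e.g., one worker per core
    results = list(pool.map(handle_request, range(8)))

print(results[0])   # -> 'request 0: ok'
```

For CPU-bound work in CPython, `ProcessPoolExecutor` maps more faithfully onto separate cores (threads share one interpreter lock); the structure of the code is identical either way, which is exactly why the cloud is a "natural place" for multicore.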
So, it does require changing the way people program, and our software industry is really behind in that area.
Scot Petersen: So, we're here at the CIO Symposium, and what kind of advice would you give to today's CIOs when it comes to cloud computing or mobile or big data, these new initiatives that are coming out? Should they be diving in whole hog, or should they be taking a little bit more time to see if the technology is going to mature?
Anant Agarwal: We say this every decade: the changes happening right now are very disruptive, and we've never seen anything like them before. But every decade, we go back and surprise ourselves yet again, and then yet again. So, this decade is no different. We see cloud computing. We see big data. We see multicore. We see intelligent AI and machine learning technologies. These are all disruptive technologies, and it truly behooves CIOs to keep pace with them. If they don't keep pace with these technologies and grab them as an opportunity, they will soon become a problem, and those establishments -- the companies that are able to be nimble and harness some of the new technologies -- will be able to move very fast.
So, I think it's very critical for CIOs and others to be able to jump onto the new technologies. Take parallel computing, for instance. Our programming is all moving from sequential programming to parallel computing, and software is at the heart of virtually all enterprises. So, to that extent, software development companies -- and CIOs developing their own software -- really need to start rethinking and retraining their teams to think parallel, and that requires a whole new way of thinking. How do you think parallel?
The challenge, of course, is that even the leading universities are behind in terms of courses in some of these areas -- for executive education and even for our own campus education. So, it would behoove all of us to really adapt: universities, to offer such courses quickly, both on campus and for executive education; and companies, to take advantage of those courses. In fact, I would encourage CIOs to go and beat down the doors of universities, saying, "Hey, look. You need to offer these courses so we can all benefit from these new technologies."
Scot Petersen: How about some courses for CIOs?
Anant Agarwal: Absolutely. One could imagine courses and partnerships with CIOs and companies to give them an early sense of new technologies. Whether it's courses, partnerships or engagements, that's the key. And really, it's not the university educating CIOs; it's sitting down together and discussing things. The CIOs can tell us, "Hey, look. We have this big data problem. Here is all the big data. Let's work together and see if we can make sense of it." So, I think it becomes a dialogue, not a one-way street, in consortiums such as the big data partnership at CSAIL.
Scot Petersen: Anant, thank you for joining us today.
Anant Agarwal: Thank you. My pleasure.
Scot Petersen: And good luck with edX.
Anant Agarwal: Thank you very much.
Scot Petersen: We're at the MIT Sloan CIO Symposium. I'm Scot Petersen, editorial director of the CIO IT Strategy Group at TechTarget. Thanks for watching.