Artificial intelligence. Machine learning. Deep learning.
As a Chief Technology Officer, I find this an exciting time – especially living and working in Montreal. In recent years, Montreal has become a major epicenter of these research fields: we host pioneers like Yoshua Bengio in deep learning and a buzzing hub of more than 250 researchers.
Like a modern-day gold rush for knowledge, the AI world is accelerating at an unprecedented pace through organized, crowdsourced, open-source projects like TensorFlow and PyTorch. The term "AI bubble" has even been coined: more than 190 AI companies were funded in the last three months, and 2017 saw more than $15B in funding (up 141% from 2016). Players like Google, Microsoft, IBM, and Amazon are all racing to become the cloud destination for your data. All in all, a great moment to be in information technology.
[Figure: search-term interest in "machine learning" over time]
To better understand this exponential growth, we must slow down and jump back from today's fast-paced 40-zettabyte world to the exabyte world of 2000 – a time when data collection at the size and speed we know now was in its infancy. World data production was still under control. Then internet usage exploded, and with it the global volume of data. Persisting data became more efficient as more information about consumers, transactions, and usage behaviors – to name a few – was collected and stored. The Big Data era had arrived.
The Big Data philosophy is to keep as much information as possible – relevant or not – in structured, semi-structured, or unstructured form. This accumulation of data over time also powered Business Intelligence (BI) – the art of analyzing the past. Yet steady data growth brought the challenge of over-information, largely because the information was usually kept in an unstructured way. Data warehouses and data marts were used to make sense of it all.
“Knowledge is power” – Kofi Annan
Jumping forward to recent years, the abundance of data has opened possibilities in sales forecasting, consumer cross-selling, purchasing patterns, and overall user-experience improvements. Predictive Analytics (PA) plays a crucial role in putting this data to work: the more knowledge we can give decision makers, the more power they have.
What is intelligence?
Intelligence is the ability to learn, and this is what Machine Learning (ML) research brings to the table: systems or models that classify or predict while in a constant state of learning, processing enormous amounts of data with flexibility and speed. Models are mostly contextual and are trained not only on data but also on input from human experts.
The abundance of data and ever-growing computational power – specifically in GPUs (Graphics Processing Units) – are the key reasons AI is feasible and mainstream today. Model training went from a weeks-long, elite-only ordeal to a matter of hours on a single GPU box in a dorm room.
What does it all mean for higher education?
With all this data and BI (or AI), only our creativity limits the possibilities. Advances in neural networks and deep learning are barely scratching the surface of these great research fields.
For example, in education, multi-model systems can be used for student retention, student–teacher pairing, student success prediction, grade prediction, and many other predictive scenarios. I believe Machine Learning can drastically improve our processes by finding patterns that would be impossible to spot through a human approach alone. All this, however, must be done ethically and always with the specific goal of improving learning.
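To make the idea of student success prediction concrete, here is a minimal sketch of a logistic-regression classifier written from scratch in plain Python. The features (attendance rate, normalized average grade) and the toy data are hypothetical assumptions chosen purely for illustration; a real system would draw on far richer institutional data and validated models.

```python
import math

def train_logistic(rows, labels, lr=0.5, epochs=5000):
    """Fit a tiny logistic-regression model with batch gradient descent."""
    n_features = len(rows[0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        grad_w = [0.0] * n_features
        grad_b = 0.0
        for x, y in zip(rows, labels):
            z = sum(w * xi for w, xi in zip(weights, x)) + bias
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid: probability of success
            err = p - y
            for i, xi in enumerate(x):
                grad_w[i] += err * xi
            grad_b += err
        weights = [w - lr * g / len(rows) for w, g in zip(weights, grad_w)]
        bias -= lr * grad_b / len(rows)
    return weights, bias

def predict(weights, bias, x):
    """Return the model's estimated probability of student success."""
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features per student: [attendance rate, normalized average grade]
students = [[0.95, 0.9], [0.9, 0.85], [0.8, 0.7],
            [0.4, 0.5], [0.3, 0.2], [0.5, 0.35]]
completed = [1, 1, 1, 0, 0, 0]  # 1 = completed the program

weights, bias = train_logistic(students, completed)
# Estimated drop-out risk for a new (hypothetical) student profile:
risk = 1.0 - predict(weights, bias, [0.45, 0.4])
```

The design point here is not the algorithm itself but the workflow it illustrates: historical outcomes train a model, and the model then flags at-risk students early enough for advisors to intervene.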
At Explorance, I am leading the effort to bring a better experience to all through machine learning. We aim to offer a wide variety of optimized features using this cutting-edge science, and our AI Center of Excellence is always pushing the envelope to provide advanced categorization of feedback and sentiment analysis.
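As a flavor of what sentiment analysis means in practice, here is a deliberately simple lexicon-based scorer. This is an illustrative toy only – the word lists are invented for this example, and production feedback-categorization systems rely on trained ML models rather than hand-picked keywords.

```python
# Hypothetical mini-lexicons; real systems learn these signals from data.
POSITIVE = {"great", "engaging", "clear", "helpful", "excellent"}
NEGATIVE = {"confusing", "boring", "unhelpful", "rushed", "poor"}

def sentiment(comment: str) -> str:
    """Classify a feedback comment by counting lexicon hits."""
    words = [w.strip(".,!?") for w in comment.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("The lectures were engaging and the examples were clear."))
# -> positive
```

Even this crude version hints at the value: aggregating such labels over thousands of course-feedback comments surfaces trends no one could read through by hand.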