When science drives business outcomes: applying AI to solve customer challenges
Chelsea Flower Show, Milan Fashion Week, Frankfurt Book Fair – all professions have their favorite get-togethers. For us, UK technologists, London Tech Week is a highlight and source of inspiration every year. After months of attending virtual-only events, it’s exciting to see organizers experimenting with the hybrid format.
From the Future of Work to Quantum Computing, from EdTech to AI, there were many topics close to my heart.
It’s interesting to observe how artificial intelligence and machine learning have progressed rapidly over the past few years, driven by a vibrant research community, the availability of ML-ready datasets, an increase in compute power, and mathematical advances in the field.
High-profile consumer applications such as digital assistants, driverless cars and ‘human-like’ digital robots have captured the public imagination. Yet most businesses are still in the process of learning what it takes to adopt AI and ML.
For one, AI helps humans make better decisions and work faster. Data-centric hedge funds already rely on AI to support new trading models. And at a time of acute talent shortage, HR departments are looking to AI to enhance talent acquisition and retention, whilst human resource consultants use AI for candidate sourcing and matching.
At Cisco, we are looking to AI and ML to solve real problems via a pragmatic approach – and to drive business outcomes.
For example, by analysing huge amounts of network data, from telemetry to traffic patterns, we are able to understand anomalies as well as optimal network configurations. Ultimately, we enable a self-driving, self-healing network. The network will redirect traffic on its own and heal itself from internal shocks, such as device malfunctions, and external shocks, such as cyberattacks. Plus, we are using machine learning to find malware inside encrypted traffic without decrypting it.
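To make the idea of spotting anomalies in network telemetry concrete, here is a minimal, hypothetical sketch – not Cisco’s actual implementation – that flags unusual traffic volumes with a simple statistical test:

```python
# Illustrative sketch (not Cisco's implementation): flag anomalous
# traffic volumes in network telemetry using a z-score test.
from statistics import mean, stdev

def find_anomalies(samples, threshold=2.5):
    """Return indices of samples that deviate more than `threshold`
    standard deviations from the mean of the series."""
    mu = mean(samples)
    sigma = stdev(samples)
    if sigma == 0:
        return []  # perfectly flat traffic: nothing to flag
    return [i for i, x in enumerate(samples)
            if abs(x - mu) / sigma > threshold]

# Hypothetical bytes-per-minute readings from one interface;
# the spike at index 6 stands out from the baseline.
traffic = [120, 125, 118, 130, 122, 119, 950, 121, 124]
print(find_anomalies(traffic))  # [6]
```

Production systems learn far richer baselines (per device, per time of day, per traffic class), but the principle is the same: model what normal looks like, then surface deviations for automated remediation.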
Driven by the shift to remote work and now the need for hybrid work solutions, we have seen an explosion in the use of AI in collaboration technologies. For our collaboration platform Webex alone, we introduced more than 800 new features in the last 12 months, many of them relying heavily on ML and AI. These include video meetings that translate as you speak, closed captioning, noise cancellation that mutes barking dogs or vacuum cleaners without muting you, and a feature that recognizes a physical gesture, such as a thumbs up or applause, and turns it into an emoji.
Talking about Collaboration, let me briefly add a few thoughts on contact centers. Call centers have been around since the 1960s, but have undergone lots of innovation in recent years. These days, we tend to refer to them as contact centers, and see them as a crucial part of the digital experience businesses aim to offer their customers.
At Cisco, we have been delighted to welcome the team of UK-based IMImobile through an acquisition that closed in February this year. Interestingly, IMImobile uses a great deal of AI to create digital customer journeys. This includes developing AI-driven chatbots enhanced with Human-in-the-Loop (HITL) learning, a technique that brings human feedback into the loop to expand the scope of a chatbot’s intelligence. On the one hand, that means updating its knowledge base, adding relevant tags or articles, and feeding its performance back into the algorithm. On the other hand, it helps companies monitor virtual agents’ conversations to evaluate performance and lets a human agent seamlessly take over if the conversation becomes too complex.
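The escalation half of that loop can be sketched in a few lines. This is a hypothetical toy example, not IMImobile’s API: the bot answers when it is confident, and otherwise hands over to a human agent while queuing the exchange for human review and retraining.

```python
# Minimal human-in-the-loop sketch (hypothetical, not IMImobile's API):
# answer confidently-matched questions, escalate everything else.
CONFIDENCE_THRESHOLD = 0.7

# Toy intent model: known questions mapped to (answer, confidence).
KNOWLEDGE_BASE = {
    "opening hours": ("We are open 9am-5pm, Monday to Friday.", 0.95),
    "reset password": ("Use the 'Forgot password' link on the login page.", 0.9),
}

training_queue = []  # human-reviewed exchanges fed back into the model

def handle_message(text):
    answer, confidence = KNOWLEDGE_BASE.get(text.lower(), ("", 0.0))
    if confidence >= CONFIDENCE_THRESHOLD:
        return ("bot", answer)
    # Low confidence: escalate, and queue the exchange for human labelling
    # so the knowledge base grows over time.
    training_queue.append(text)
    return ("human", "Transferring you to an agent...")

print(handle_message("Opening hours"))            # handled by the bot
print(handle_message("Complex billing dispute"))  # escalated to a human
```

Real systems replace the lookup table with an intent classifier, but the feedback loop – escalate, label, retrain – is the essence of HITL learning.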
One of the new areas I am most excited about is the use of deep learning, a type of machine learning at the forefront of artificial intelligence research, to power computer vision in our newest Cisco Meraki smart cameras. These cameras can create histograms of detected objects by type – person or vehicle. You can then analyze that data to see how many people or vehicles entered, or were present in, a space at a specific time. The dashboard can show this data at a minute, hourly, or daily scale, allowing you to identify time-based trends and anomalies in the usage of your space.
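As a rough illustration of the aggregation step – hypothetical, not the actual Meraki dashboard code – detection events of the form (timestamp, object type) can be bucketed into per-minute, per-hour, or per-day counts:

```python
# Hypothetical sketch: build per-object-type histograms from camera
# detection events, bucketed by minute, hour, or day.
from collections import Counter
from datetime import datetime

# Illustrative detection events: (ISO timestamp, detected object type).
detections = [
    ("2021-06-15T09:01:12", "person"),
    ("2021-06-15T09:14:55", "vehicle"),
    ("2021-06-15T09:47:03", "person"),
    ("2021-06-15T10:05:40", "person"),
]

def histogram(events, bucket="hour"):
    """Count detections per (time bucket, object type) pair."""
    fmt = {"minute": "%Y-%m-%d %H:%M",
           "hour": "%Y-%m-%d %H:00",
           "day": "%Y-%m-%d"}[bucket]
    counts = Counter()
    for ts, obj in events:
        key = datetime.fromisoformat(ts).strftime(fmt)
        counts[(key, obj)] += 1
    return counts

# Two people and one vehicle in the 09:00 hour, one person at 10:00.
print(histogram(detections, bucket="hour"))
```

The same counts, rendered as bar charts per object type, are what give you the minute-, hour-, or day-scale view of how a space is being used.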
The science behind the magic: the smart camera development teams continually show a computer thousands of examples of what objects look like and it “learns” how to identify them more and more accurately over time. The model improves as we provide it with additional training data.
But as with all innovation – it only works if we have the right talent constantly learning, exploring and experimenting. That is why I am so proud of our collaboration with UCL and the many initiatives we are driving together – such as the recent Machine Learning event that we held for young women, aged 16 to 18, who are studying A level Computer Science, Mathematics and/or Statistics.
My hope is that through such programs we can encourage more people to enter the field of AI and ML, and at the same time build a better understanding of the impact these technologies have on diversity, data and ethics.