Using Machine Learning to Augment Automation
Push aside the AI hype and identify practical applications that deliver tangible returns. That's the mission
22 August 2019
4-minute read
In the second episode of our new podcast series Unleash Your Data, we had a chat with Dan Reid, CTO of Xceptor and one of its original founders. Here are a few key highlights:
The shiny newness of AI technologies (machine learning, NLP and so on) is distracting firms from their end goals, and it is not uncommon for AI initiatives to be led by the technology itself. In the past, business needs dictated IT needs. As Dan says, we can learn from that and should get the business and technology teams together at the very start. After all, the business data is what underpins the eventual solution.
Data science, a much-needed discipline in the data-heavy banking, finance, securities and insurance industries, is getting more attention but remains a scarce resource. The focus, though, is still too caught up in the technology, especially when it comes to how data is prepared. Data scientists tend to concentrate on the algorithms and spend less time on preparing data for ingestion into AI projects.
Xceptor has developed good insight into the dynamic between its clients' central AI teams and their business teams. It has become clear that there is a role for mobilising the information AI teams need from all parts of the business, and, in the other direction, for making sure the work carried out by the central AI teams gets rolled out where it is needed in the business.
Data is never good to go from the start in any project (see our previous post highlighting recent data quality statistics in the world of finance), let alone in complex AI initiatives. Getting the right information ready from the outset is a key part of the process, and one that is often missed. The starting point for every AI initiative should be normalised, cleansed, trusted data drawn from all relevant sources.
Machine learning requires copious amounts of training data to produce decent outcomes. But it is equally true that poor data quality is a killer of AI techniques; as the old saying goes, 'garbage in, garbage out'. There is no getting around it: you need both quality and quantity. Failure is inevitable if you cannot scale up the techniques that ensure the quality of the data being fed through.
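To make the 'garbage in, garbage out' point concrete, here is a minimal sketch of a data-quality gate that normalises and validates records before they ever reach a training pipeline. This is purely illustrative, not Xceptor's product or any specific client setup; the field names (trade_id, notional, currency) are hypothetical examples of the kind of financial data discussed above.

```python
def cleanse(records):
    """Drop incomplete rows and normalise the rest.

    Hypothetical fields for illustration: trade_id, notional, currency.
    Rows failing the quality gate are rejected rather than being allowed
    to silently degrade a downstream model.
    """
    cleaned = []
    for row in records:
        # Reject rows missing required fields: garbage in, garbage out.
        if not row.get("trade_id") or row.get("notional") is None:
            continue
        cleaned.append({
            # Normalise formatting so the same trade always looks the same.
            "trade_id": str(row["trade_id"]).strip().upper(),
            "notional": float(row["notional"]),
            "currency": str(row.get("currency", "USD")).strip().upper(),
        })
    return cleaned


raw = [
    {"trade_id": " abc123 ", "notional": "1000.50", "currency": "gbp"},
    {"trade_id": "", "notional": "250"},     # rejected: no trade_id
    {"trade_id": "xyz9", "notional": None},  # rejected: no notional
]
print(cleanse(raw))
# [{'trade_id': 'ABC123', 'notional': 1000.5, 'currency': 'GBP'}]
```

The design point is that the gate runs before model training, not after: quality checks like these are cheap to apply at scale, which is exactly the scaling ability the paragraph above argues is make-or-break.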
Want to hear directly from Dan? Click here for more first-hand knowledge of how firms are using AI and the data decisions they need to get right in order to succeed.