The Intelligent Workplace

Episode 6

Looking at data differently

Sam Conway, Zegami

Sam Conway is the CEO of Zegami, a visual search and analytics solution. Zegami transforms visual data into information by combining media, data and AI, simplifying an organization’s interaction with large data sets.
 
By exploring visual representations of their data, users can uncover valuable insights. They can explore things like employee information, sporting statistics, and even non-traditional data sources like Instagram feeds. Zegami identifies outliers, biases, patterns and correlations within data sets.
 
It’s an interesting concept with potential applications across many industries. This discussion will interest everyone from the marketing team to the data scientists.
 
Enjoy the conversation.

Chris:                   

Hi. I’m Chris Lukianenko, and this is Intelligent Workplace brought to you by LiveTiles. My chance to speak with the industry experts and explore the new ideas and technologies that are shaping and transforming the modern workplace. 

Welcome back to another episode of the Intelligent Workplace, and boy do I have a treat for you on this one. My guest for this episode is Sam Conway, CEO of Zegami, a company that creates software designed to make it easier for organizations to interact with large datasets.

Chris:                    

They allow the user to visually explore those large datasets. Anything from human resources information to sporting statistics, and even nontraditional data sources such as images from an Instagram feed. I think it’s a really interesting concept, and something I think would really appeal to my fellow marketers out there. But how does it fit into the intelligent workplace? Well, to talk us through that, welcome to the podcast Sam Conway.

Sam:                     

Hi. Look, it’s exciting to be here and talk to you about some of the stuff that we’ve been doing in conjunction with Oxford University.

Chris:                    

Mate, you know I’m a fan, so I’m really looking forward to this whole chat we’re having tonight. Now look, the concept you’re exploring with Zegami is a really interesting one. I just want to know, how did you come across this idea?

Sam:                     

Well, it’s all about visual data exploration, and it’s based on some technology that came out of Microsoft in 2009 and 2010, and unfortunately Microsoft didn’t pursue it [inaudible 00:01:51] but we did. And I guess it really stems from when I started a company back in Adelaide in 2011. I ran SharePoint consulting. I guess like LiveTiles, my humble beginnings were in SharePoint, and I built a SharePoint consultancy business back in Adelaide. One of the things we were looking for was a new solution, or a product actually, a way to enable people to access data in a different way, and that sort of led to the project with Oxford University, and from there we built a company.

Chris:                    

I was going to say, having Oxford as a partner is not a bad choice to have, is it really?

Sam:                     

No, not at all. It’s been really interesting going through the whole process. Starting the business in Adelaide, I owned the business on my own, and then moving to Oxford was certainly a bigger step, with investment and working with the university itself. But I guess that’s a bit of an interesting story, because my CTO, Roger Noble, was actually playing with the technology and doing some after-hours work with the Weatherall Institute of Molecular Medicine around high-throughput microscopy. They were generating tens of thousands of images and the associated metadata.

Sam:                     

And what they really needed to do was analyze the dataset while still having access to the images. And because we’d been playing with this, we actually pitched a project to the university, and the university said yes. So we then spent the next year actually building out the application and the product, and we presented that back to the university, in about 2015 I think it was, or the end of 2014. And they said, “Guys, you’ve done such an amazing job. Would you like to set up a company with us?”

Chris:                    

Yes, please.

Sam:                     

Yes, absolutely. You know, when the number one university in the world comes knocking you don’t really say, “No, I’m not interested.” We set up a company, and we got some investment into the company and we’ve really built it all from there.

Chris:                    

Nice. So you were talking then about working with visual data, some images. But with this idea, we can also talk about large traditional datasets, and being able to present them back to the user as a visual representation of that data, where they can recognize patterns and uncover insights that would otherwise be very difficult to find.

Sam:                     

Yes, absolutely. I guess the concept is around what we define as visual data exploration. Now, there are some really interesting things about how the human brain works, and the fact that half of the brain is actually dedicated, I guess directly or indirectly, to processing visual information. And the brain’s ability to process visual information is absolutely incredible. We can do that 60,000 times faster than [inaudible 00:04:07]

Chris:                    

Oh, wow.

Sam:                     

I guess what I find fascinating is that the human brain can actually process an image within 13 milliseconds, and that’s quite phenomenal. So why aren’t we using some of that instant visual recognition as an analysis platform? Or drawing on those skills to be able to extract information out of large amounts of visual content? And look, as you said, we focus on the ability to, I guess, take visual content and make that discoverable or explorable in a way that no other tool set really does.

Chris:                    

Yes, absolutely. The nice thing I think about your concept is how you marry together that human element of being able to recognize different elements in the visual imagery, and then you take the power and scale that machine learning can provide us, and you allow that to come together and maybe augment those datasets with human observations, and I feel like it’s just the best of both worlds.

Sam:                     

Yes, most definitely. You know, one of the interesting things is we have this ability to generate so much information. I think the latest statistics tell us that we generate 2.5 quintillion bytes of data every day. Just think about how much information that is.

Chris:                    

Do you know what? I can’t even think about that. I have got no concept. That is just humongous.

Sam:                     

How do we actually manage that? So, we’re getting really good at storing that information. But how do we turn that data into information, into insight and understanding? Now, one of the biggest challenges I guess that we’re starting to see is that we can collect so much information, but how do we actually make sense of it? How do we put it into a format that enables us to gain insight out of it?

Sam:                     

Now, the default is to turn to machine learning. But if you think about visual content, if you turn to machine learning, what is that actually doing? It’s maybe extracting further information out of the image, or the dataset, to provide more information that you then need to search, sort, filter and group to gain insight. And I guess what we’ve done is we’ve combined the power of an analytics platform with the power of a visual search platform, bringing it all together to provide a solution where you can do that ad hoc analysis, that search, sort, filter and group, to gain insight.
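
The search, sort, filter and group workflow Sam describes can be sketched over a small table of tagged images. This is a hypothetical pandas illustration of the idea, not Zegami’s API; the image IDs, tags and counts are made up.

```python
import pandas as pd

# Hypothetical image metadata: one row per image, with an
# extracted tag and engagement counts.
images = pd.DataFrame({
    "image_id": ["a1", "a2", "a3", "a4"],
    "tag":      ["car", "skydiving", "car", "ocean"],
    "likes":    [120, 4500, 300, 80],
    "comments": [10, 230, 25, 4],
})

# Filter: keep only images tagged "car".
cars = images[images["tag"] == "car"]

# Sort: most-liked images first.
ranked = images.sort_values("likes", ascending=False)

# Group: average engagement per tag.
by_tag = images.groupby("tag")[["likes", "comments"]].mean()
```

Each operation maps onto one of the verbs Sam lists; a visual exploration tool layers the images themselves on top of these same table manipulations.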

Chris:                    

When I asked you to jump on this podcast I told you that I wanted us to be mindful that we didn’t want it to dissolve into an infomercial style of chat. But I think it’s really important to set the scene for the listeners on the power of your tech. I just want to pull out one more example, and then we’ll wrap this part up and move on. Being a marketer I was really blown away by the Instagram example on your website, and I just want to give the listeners a quick overview of that for any of those other marketers out there that may be listening.

Sam:                     

Yeah, it’s a really interesting collection that we built. So what we did is we went to Instagram and grabbed all of the images that Red Bull has posted over the last, I think, eight years. Over every single image we’ve done some analysis to pick out tags, to identify items within the image. It could be a car. It could be a racetrack. It could be skydiving. It could be the ocean. And then we’ve tagged all of those images.

Sam:                     

In addition to that, we’ve run some machine learning of the type which allows us to do some similarity comparisons, and then we’ve also looked at the data that’s attached to each of the images, the known data, so the number of likes and the number of comments. And we thought we’d then complement that with something called sentiment analysis. So for all of the comments that were made against all of the images, we’ve basically looked at: was it positive, was it negative, was it strongly positive, or was it strongly negative.
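
The sentiment bucketing Sam describes can be sketched with a toy lexicon-based scorer. The word lists and example comments here are invented, and a production pipeline would use a trained sentiment model rather than keyword counts; this only shows the idea of sorting comments into bands.

```python
# Invented word lists for a toy sentiment scorer.
POSITIVE = {"love", "amazing", "awesome", "great", "nice"}
NEGATIVE = {"boring", "hate", "awful", "bad"}

def sentiment(comment: str) -> str:
    """Bucket a comment into one of five bands by counting keywords."""
    words = [w.strip(".,!?") for w in comment.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score >= 2:
        return "strongly positive"
    if score == 1:
        return "positive"
    if score <= -2:
        return "strongly negative"
    if score == -1:
        return "negative"
    return "neutral"

labels = [sentiment(c) for c in
          ["Love this, amazing shot!", "so boring", "ok I guess"]]
```

Once every comment carries a label like this, the labels become just another metadata column to filter and group on alongside likes and tags.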

Sam:                     

We’ve then combined all of that data into Zegami, and what it allows us to do is start to really explore that dataset and look for insight in a way that is very difficult with other solutions. For example, we might want to put it into graph view by the number of likes, so you can see the most liked and the least liked. Then on the other axis we might do that by date, so you can see when all the posts were put up, and you can start to see some trends between video content and image content.

Sam:                     

Why is it that some images are getting lots of likes and other images are getting lots of comments? And does it correspond that the more comments an image gets, the more likes it’s going to get as well? And what we found is these interesting patterns that were really difficult to discover. You might look at an image and go, “That’s a pretty boring image,” but it got lots of comments and lots of likes. So what was happening at the time that it ties back to? Or you looked at the likes versus the sentiment and you saw the items with strong sentiment weren’t necessarily the ones which had lots and lots of likes. And it’s these kinds of insights that you can get in a really quick, ad hoc way that’s otherwise very difficult, while still having reference back to the image or video.
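
The comments-versus-likes question Sam raises is, in analytics terms, a correlation check plus outlier inspection. A minimal sketch with hypothetical engagement numbers (not the actual Red Bull data):

```python
import pandas as pd

# Hypothetical engagement figures for four posts; the real collection
# is far larger, but the question is the same.
posts = pd.DataFrame({
    "date":     pd.to_datetime(["2018-01-05", "2018-02-10",
                                "2018-03-02", "2018-04-20"]),
    "likes":    [1200, 340, 5600, 980],
    "comments": [80, 12, 410, 55],
})

# Pearson correlation: values near 1.0 mean likes and comments
# rise and fall together across posts.
r = posts["likes"].corr(posts["comments"])

# Ratio view: a post with unusually many comments per like is the
# "boring image, lots of comments" case worth inspecting alongside
# the image itself.
posts["comments_per_like"] = posts["comments"] / posts["likes"]
outlier = posts.sort_values("comments_per_like", ascending=False).iloc[0]
```

The numbers only frame the question; the point of a visual tool is that you can then jump straight from the outlier row back to the image or video behind it.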

Chris:                    

It’s just awesome. Any social media marketers out there would be rubbing their hands with glee at getting their hands on that type of data. It’s amazing. But let’s switch gears a bit and talk about how we can plug this tech into the intelligent workplace. I feel like there are a lot of opportunities here, but how do you see it complementing some of the other elements of the IWP?

Sam:                     

Well, I think what we should probably first talk about is what the intelligent workplace means to me. I guess we’re all about data, and not just numerical, but all types of data. And for me, the intelligent workplace is about tackling one of the largest challenges, which is data ingestion. Now, we’re generating so much data and information within an organization. And then data consumption: what can we do, and how can we present that information in a way that makes it really easy to consume?

Sam:                     

So, if you think about the intelligent workplace, you’ve got solutions like Hyperfish, solutions that are designed to bring data in. If you look at SharePoint, for example, it’s designed to store data, and it’s got some ways of visually representing that information in aggregate. But are there ways that we can put Zegami over the top of that, and maybe throw in some AI to extract additional information, and then use Zegami or other tool sets that are really starting to challenge the concept of visual data exploration, or business intelligence, to present information in a way that makes it really easy to consume, and I guess democratizes the process?

Sam:                     

Because what we really want to do is enhance business intelligence and allow people to make smarter and faster decisions over larger datasets. Just think about this for a second. If we do a basic Google image search it might bring up 20,000 images, or we just do a Google search and it might bring up 24 million search results. How do you know that the ones at the top are the ones that best relate? We’re relying on their algorithms. The other thing is that it’s only presenting the images; it might present 20 on the screen at a time, and then you’ve got to continually scroll, or with a text search it might present 12 items.

Sam:                     

What about if we were able to display more information on the screen in a way that made it really easy to look at it in different views and navigate through it to find exactly what you’re looking for? And then not just find what you’re looking for, but find similar items too. And I think for me, the intelligent workplace is all around eventually building what I guess we can define as an augmented intelligence platform. That’s a solution that allows you to access the right information at the right time in the right way to make the best decisions.

Sam:                     

And all of a sudden these five-minute questions that maybe you’d have to go and ask an analyst in the business, you can work out for yourself. And if we can start to wrap all of that together, combining some AI to extract additional information, that’s a pretty powerful thing.

Chris:                    

Yeah, it sounds awesome. Look, I can see the datasets that you’re talking about just sitting beautifully on your intranet page next to your Power BI dashboard, next to your document list, or whatever. It just seems to fit in seamlessly.

Sam:                     

Yes, most definitely. And I guess that’s our aim: we really want to change the way people think about consuming information and consuming data, because ultimately we are visual creatures, and we want to be able to present information in a way that makes it really, really easy to understand, easy to consume, but that also facilitates collaboration. Sometimes it’s not necessarily about the end result of what you find, it’s the journey that you go on to find it. And no two people necessarily search for information in the same way.

Sam:                     

They might be looking for the same result, but if we can educate as we go along, then that’s going to really facilitate communication within meetings and within an organization, to give better perspectives on understanding the shape of the data. Now, if you’re a business analyst within the organization and you talk about understanding the shape of the data, you really get what that means.

Sam:                     

You understand how the data’s being generated, what some of the anomalies and patterns are. But if you’re not a data analyst, how can you give people that ability to really understand the makeup of the information within your organization? That’s, I guess, one of the things that we’re trying to tackle.

Chris:                    

Now mate, when we were talking off air earlier, you were talking about how you could use your technology for all sorts of different things, and one thing that really struck a chord with me was the way you were talking about slicing and dicing contact information based on recent interactions you’ve had with the people in your contacts and LinkedIn and all that sort of thing. Could you just take the listeners through that concept for me? Because I thought it was really interesting.

Sam:                     

Yeah. I guess it comes back to … I was listening to the other podcast, and one of the questions you sort of asked was, “How far can we go with the intelligent workplace?” Now, we’ve just talked about presenting the right information to the right people in the right way. Well, what about if we take it a little bit further than that? What about if we were to use something like Microsoft’s HoloLens, or technology like that, that allows you to present information in a live situation?

Sam:                     

Now, imagine if we were at an event or a conference, we caught up and we’re chatting, and you said, “Oh look, I’ve been doing this kind of work.” And I said, “Oh, look, I really need to introduce you to this person I met last week.” And then all of the people that I met last week pop up into my field of vision. And I go, “Now, what company did they work for?” So then all of a sudden the data is sorted by company. “Oh, that’s right, they work for IBM,” or, “They work for Microsoft,” or whoever the company was. And then you can drill in further and you find that person that you were talking about, and then all of a sudden maybe that’s tied to their LinkedIn profile.

Sam:                     

Maybe it’s tied to some information you have, maybe their social media profile. And then you go, “Oh, wait a minute,” and all of a sudden it shows you that there’s a link between you and the person already. And you sort of say, well yes, you already know that person. “Actually, why don’t I just send an email?” And then it automatically generates an email as an introduction. We’re starting to intuitively navigate through information as if it’s a web of information, rather than point and shoot, point and shoot, point and shoot.

Sam:                     

Now, for us it’s all about that augmented intelligence. It’s about enhancing human intelligence, not replacing it. And I had to laugh, we were talking off air about what the future is going to look like with AI. Is it going to be robots wandering around the world? Should we be worried about that? I was chatting to my CTO the other day and he said, “We can always turn off the power.” That’s probably a fair call. So for me, it’s about where we go with all of this. It’s about shaping human intelligence and presenting information in a way that really facilitates decision making, but also enhances that intuitive process for the decision making as well.

Sam:                     

Because the one thing that we have that machines don’t is that gut instinct. Have you ever driven up to a set of lights and stopped for some reason, and not realized why you’ve done it? You’ve done it subconsciously, and then a car goes across the intersection?

Chris:                    

Yes.

Sam:                     

You know, it’s that kind of inbuilt intelligence that we have, which you can’t necessarily replicate. And you know, this is some of the stuff they’re really challenged with, with respect to autonomous vehicles. How can you breed in this sense of intuition? And that’s why I’m quite passionate about what we’re trying to do because I’m trying to allow people to access more information in a much smarter way.

Chris:                    

Yes. I love it. I love it. I love this chat, talking about the future. Which takes me on to my next question, where do you think you can take this technology? And what industry or sector do you think can benefit the most from it?

Sam:                     

It’s a really interesting question, because I guess if you look at what we’ve created, it’s a data-agnostic platform. So it really depends on the dataset that you feed into it. We obviously started at the Weatherall Institute of Molecular Medicine, where they’re doing cancer research. They’re taking large amounts of images and doing analysis on them, and they’re looking for patterns and anomalies for different drug trials, for different tests and everything. That’s some of the application there.

Sam:                     

Then we’ve had plant phenomics. We’re talking about curing the world’s food shortage. We’ve got organizations using the software for when they grow genetically modified plants, and they’ll take tens of thousands of images as those plants are growing, doing analysis in a very similar way. And then the technology further extends to humanizing HR. If we’re creating a little tile for each and every individual within the organization, then that’s tied back to all of the data that we know about that particular person.

Sam:                     

Maybe we’re starting to do some sentiment analysis over the way they’re communicating internally and externally within the organization, to look at how they’re actually feeling about the work environment. Are they a person that’s at risk of potentially leaving the organization? With these kinds of things, you’re starting to get to a solution that is more of a holistic solution than, I guess, just having single applications. And we’re getting some really good feedback with respect to HR. I guess where we want to go with it eventually is being able to take large amounts of unstructured data and query it in a way that’s incredibly intelligent, using AI to help not just structure the data, but, when you make that natural language query, or use the technology to really drill into a dataset, to find the patterns and the ways we can present information that make it more usable and more intuitive.

Sam:                     

I guess before I wrap up: I was speaking to some guys from a very well known bank in Australia last week. Basically what they were talking about is the projects that are going on at the moment with respect to how they structure their data. They’re spending large amounts of money at the moment structuring data in a way that they think will enable them to gain insight and understanding about it. Now, they’re not doing anything with respect to the consumption of that data, just structuring it. And after listening to what they were talking about, I posed a question. I said, “How do you know that the way you’re structuring that dataset today is going to be the way we consume data in the future?”

Chris:                    

Brilliant.

Sam:                     

And that really messed with him. He said, “What do you mean?” I said, “Well, what about if we’re able to develop machine learning or AI that allows you to take large amounts of unstructured data and structure it with respect to the query that you’ve got about that data?” And he said, “We’ve never thought about data in that way.”

Chris:                    

I love it.

Sam:                     

Yeah, it’s a really, really interesting concept. But I think at the end of the day we need to look at different ways of interacting with and consuming data, because what we’re doing today has to evolve and has to change, and we can’t just rely on machine learning algorithms or AI to tell us what the answer is. We need ways of interrogating that and interpreting it in a very visual way that allows us to form those natural links and see patterns in data that maybe the AI and machine learning cannot see.

Chris:                    

I love it. I love it. And it leads on really nicely to my final question for you, which you sort of touched on a bit earlier. On this podcast I usually ask my million dollar question at the end, but I want to pivot slightly here for you. I don’t want to talk about robots taking over the world or anything like that. We talked about Zegami being a really nice mix of artificial intelligence and machine learning, augmented with the human elements: experience, intuition, and you mentioned gut feel as well. Which element would you say is more important to the success of Zegami and how it changes the game in terms of the intelligent workplace?

Sam:                     

Yeah. Good question. Good question. I think really it comes back to the way we as humans consume information. As I said, we’ve got ways of storing and ingesting information now, and we do need better ways of presenting that. And I think for us, it’s about continuing to evolve how we represent data in a visual way. One of the next steps for us is to start to really work with different organizations and different users of the software to really shape that human element and the way that humans consume data and evolve data as well.

Sam:                     

And I think one of the really exciting things about being involved with an intelligent workplace is that we’re going to get lots of different datasets from lots of different industries. And I think if we can complement that with some of the work that’s being done here in Oxford, and some of the research work that we’re looking at getting done, I think we can start to create a [inaudible 00:20:12] user experience that is completely immersive with respect to data. And from that, could that potentially lead to new insights and understanding?

Sam:                     

And maybe we can find ways of curing cancer in ways that we didn’t expect because of that ability to access and understand data in a completely different way.

Chris:                    

So I feel like you can’t have one without the other. The human element is still very important.

Sam:                     

Most definitely. I think we’re going to be in trouble if we let machines always make binary decisions, because as we know, life is not binary.

Chris:                    

As a human I’m really glad to hear that. That’s fantastic. Well mate, look, I hate to cut you off here tonight because I’ve enjoyed our chat, but we do have a time limit here on the Intelligent Workplace podcast, and I feel like you and I could chat for hours. So I just want to say thank you very much for joining me here tonight, and I’ll be watching on with great interest as Zegami develops over the next few years, because I was really pumped to see your demo and I wish you the best for the future. So, thanks for joining me.

Sam:                     

All right. We’ll speak soon.

Chris:                    

Cheers mate. Thanks for joining me on the Intelligent Workplace podcast brought to you by LiveTiles. If you have any feedback or want to suggest a guest for a future show email podcast@livetiles.nyc. Thanks for listening. I’ll catch you next time.
