BrainChip Holdings (ASX:BRN) December 2018 Quarter Update Presentation

Company Presentations

BrainChip Holdings Limited (ASX:BRN) CEO, Louis DiNardo, provides an update on the company.

Good morning everyone. This is Louis DiNardo, chief executive officer of BrainChip (ASX:BRN). Thank you for joining us on this morning's call. This time I can actually say good morning. I'm here in downtown Sydney, beautiful Sydney, where it's a little bit hot, a bit sticky. I actually expected to call in from Melbourne, but all the flights from Sydney to Melbourne last night happened to be cancelled, so I'm back here in Sydney. I expect this to be a relatively quick call; we did a year-end update in January. What you'll see here is text that is almost cut and paste from the quarterly update, which we put out last night, but my intent here is to give you a bit of color and more background on the content of that release.

So moving forward, I'm on slide one, which is just BrainChip Holdings' (ASX:BRN) December 2018 quarter update. Let me make sure that we can move forward here. Just a quarter summary. Total net cash outflows for the quarter were US$2.7 million, compared to the US$2.6 million we expect in the first quarter of 2019. When you look at the 4C, it gets a little complicated. In order to come up with total net cash outflows, you really need to look at three different sections in the 4C (1.9, 2.6 and 3.10) to come up with the actual number. If you just look at section 1.9 you see 2,141,000; then section 2.6 gets you 293 more, and section 3.10 gets you 19 more. Total cash receipts for the quarter, and I know I spoke to this in the January update, were US$227,000, approximately A$310,000. Again, you have to add up a couple of numbers in the 4C.
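For anyone reconciling the 4C themselves, the method described above is just a sum across the three sections. A minimal sketch, using the figures as quoted on the call (the section glosses are my reading of the standard Appendix 4C form, not labels taken from the filing, and the amounts should be treated as illustrative, not audited data):

```python
# Sketch: summing the three Appendix 4C sections named on the call to
# arrive at total net cash outflows. Figures are as quoted (thousands
# of US$) and illustrative; section glosses are my own assumption.
sections = {
    "1.9 net operating cash flows": 2141,
    "2.6 net investing cash flows": 293,
    "3.10 net financing cash flows": 19,
}

total_outflows_k = sum(sections.values())
print(f"Total net cash outflows: US${total_outflows_k:,}k")
```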

As much as anybody, I'm disappointed in total cash receipts. That does not at all imply that BrainChip Studio and Accelerator are not gaining traction; we'll talk about that in another slide or two. There is great activity going on with respect to trials moving to design wins. We've been through a learning cycle on the latency of moving from a trial to a design win and into revenue, and we spoke about that when I was on the call here in January. And of course, this doesn't include invoices that were issued in the quarter that might be paid at a future date. As sales and revenue grow, I think we'll start reporting what accounts receivable and what DSOs, or days sales outstanding, are. Some marketplaces pay their bills on time, 30 days or 45 days. Some marketplaces are frankly a bit more lax in their payment schedules.

An important bullet here, number three, which we don't talk about and which there's really no place for us to put out in the public domain other than in an update such as this: we retired approximately 38.5 million stock options in 2018. We publish 3Bs when we hire people and issue new stock options, but there's really no form under the ASX listing requirements that would indicate what we've retired. These were forfeitures or expirations: people that have left the company don't get to harvest those options, or the options have expired based on time. So 38.5 million stock options retired in 2018 is a number that I don't think we've discussed publicly before, not because we didn't want to, but because there's really no ASX listing requirement or form that applies to it. Again, as a quarter summary, when I say that the company is implementing measures to control expenses, it may seem like a paradox; we're always controlling expenses.

At this juncture, cash is king. We've got some latency in cash receipts as all these trials move through design wins and get to revenue. So it's important for us to really keep a close eye on those expenses that are variable and controllable. I just want to assure shareholders that this is a top priority for us and we've taken significant action, so that as we go into Q1 of 2019 you can see that total cash outflows are expected to actually be down a bit quarter on quarter, and we'll grind on that on a day-in, day-out basis. Moving forward to slide three. I put this at the top of the list of things to talk about because I think it is very important for shareholders to understand that, as management certainly understands, intellectual property is really the value that this company is bringing, and that will manifest itself in the introduction of Akida. In some respects it has already manifested itself in the successful and ongoing trials with BrainChip Studio.

We did file, and this is again a summary that in some respects we talked about in January on the year-end call, the omnibus provisional patent for those inventions or innovations that are comprised in Akida. It is critically important that we protect our intellectual property; it is the foundational technology of the company. I think we have the best advisors in North America with respect to how to protect that property, as well as the best patent attorneys in North America to write those patents and make sure that they're done properly. We call it a provisional patent because we have a year from the date that we file it to add to it or create what we call child patent applications. So the provisional patent will likely turn into multiple patents with additional claims as we move forward, but it is very robust and really covers a lot of the foundational inventions that are embodied in Akida, including, as we say here in bullet three, the overall neuromorphic system-on-a-chip architecture.

That includes the state-of-the-art configurability, the low-latency, low-power feature set and many, many other things. So we've spent a lot of time on this. Peter spends a great deal of his time focused on intellectual property; as you know, he heads up research as CTO. He is driving the innovation and invention, and we want to protect that as best as possible. I think we've done a good job of that throughout 2018, but in particular with the filing of that provisional patent in the fourth quarter, the December quarter. Moving forward to slide five, a little bit more about sales engagements. I know there's a lot of frustration. Sales engagements include 26 or so trials, plus or minus one or two on any given day, and we've got new trials going on. Those are active or pending with end users and original equipment manufacturers, and I think there's an important distinction there that we'll talk about in a moment. A trial really is the critical first step towards the ultimate goal of achieving a design win, and a design win begetting revenue.

You go through that trial process, and it is frustrating: sometimes a trial can move very quickly, in weeks or months, and some trials can take months or quarters, depending on the resources that are allocated either by the end user or the original equipment manufacturer. And then those trials with an original equipment manufacturer are moving through their sales channel and into the public domain, the user base for those OEMs, to generate revenue. While, as I've said, cash receipts were disappointing, 26 active or pending trials, with new trials being initiated on a regular basis, give us significant confidence in BrainChip Studio and the Accelerator card; I'll touch on one particular opportunity in a few minutes. It's a very good product, and the latency, the time to revenue, is a bit longer than we expected, but there's nothing that implies that we're losing in these trials. In fact, most of these trials are turning out to be quite successful.

The next bullet touches on an engagement that is under NDA, so it's difficult for us to talk about the name. But in the storage market there are five major players: Dell EMC, HP, NetApp, IBM, and the fifth would be Hitachi. I think I mentioned several calls ago that when we announced the collaboration that we have with Quantum, which doesn't fall into that top five, we immediately got an inbound from one of the top five. They have trialed BrainChip Studio very successfully, and in fact went ahead and said, "Okay, we're very satisfied with what we see. We'd like to see what the Accelerator could do for us with regard to throughput." We did ship that accelerator card; I touch on this a couple of slides down. They plugged it in and activated the licenses at a very, very fast pace for a major storage manufacturer in the OEM space. It was something that we're very pleased with, and it could have meaningful upside potential for the company as we move forward.

I'm going to move to slide five, which is our reseller program. It's very important for us to have a well-organised and well-executed reseller program. We don't want to have to build a sales force and try to boil the ocean with a thousand salespeople. We've talked about OEM customers being a point of leverage. Our reseller program is coming along quite nicely, and we've strengthened our reseller presence in both Europe and North America. I think everybody may be aware that we contracted ION Sales in Texas and the surrounding states. ION will be responsible for a couple of major OEMs in that marketplace. That does include Dell EMC, which is in Texas, as well as HP, which is in Texas. Those are, as I just mentioned, two of the largest storage manufacturers in the world, and ION has great relationships there. We've got great traction in that top five, so I think ION is an important part of our future.

In Europe we've added Telesikring, and I don't know if I pronounced that exactly correctly, in Denmark, and Novo in Greece and Cyprus. They are leaders in their respective markets. Those markets don't seem to rise to the top of everybody's mind when we think about where the major markets in the world are, but we're covering all of Europe, including Greece and Cyprus, and as we'll talk about in a moment, we've got trials going on in Spain and trials going on in the UK. Having representatives in those marketplaces rather than a direct sales force gives us a great deal of leverage, keeps expenses down and gives us more intimate relationships in local markets, and that's an important part of us building a reseller organisation. Moving forward to GPI. We touched on GPI in January and, frankly, not much has changed since the beginning of January. The development of the ATS system continues; I don't know if we've talked about this specifically before, but Blackjack and Baccarat are really the table games which are the focus of their automated table system.

We expect development to continue through the first quarter of 2019. There are some enhancements and some corner cases, when you're looking at a Blackjack table or a Baccarat table, where we need to continue to enhance the vision system to work closely with the RFID system, and that's going well. They're demonstrating ATS to major gaming operators in North America, Asia and Australia. As I think we've talked about before, GPI has intimate, very long and deep relationships with major gaming operators around the world. The board of directors of GPI did agree to be acquired by Angel of Kyoto, Japan. While GPI has a significant market share in the chips, the currency on the table, Angel has a very significant market share of all the cards on the table games. So with the combination of Angel and GPI, I think the ultimate result is going to be a positive.

Of course, there's been a bit of a delay in getting through a commercial agreement, in part because of the binding agreement that they have during their due diligence period, as well as the enhancements that we continue to develop under the direction of GPI. We'll continue to update the market on significant developments regarding GPI, but overall there is continued development, and continued demonstration trials will begin shortly. So I think, generally speaking, the GPI engagement is moving along quite nicely. SN Tech, this is something we've talked about for the past couple of quarters, as I think most shareholders would recall. We did invoice them for a bit over $600,000. We did an audit in the December quarter and found that their responses and disclosures were unsatisfactory; under our contract we have a right to audit books, records and even source code for what is being deployed. As a result of that lack of disclosure, in our view, we filed a Freedom of Information Act request.

If you recall, this is the Lockport school district, and that deployment was funded by a bond from New York state, which allows us to use the Freedom of Information Act to ask Lockport, as a school district, to provide information. They responded to that request on January 9th and indicated they would provide information according to the guidelines. I would expect the information pertaining to that request to be at hand in the next few weeks. Once we get that information, we'll either go back to SN Tech and say, "Here's what we found, and here's the case we make that your dispute of the invoice is not valid," or we'll have to evaluate whether we move forward with some sort of legal action based on what we find. That decision will weigh the time, money and distraction involved against what we think we could harvest should we prevail in any legal action.

Again, we'll continue to update the market, and I expect we'll do that on a regular basis as we go through this process, after we've evaluated the response to the request and made some determination about how we move forward. OEM engagements with BrainChip Studio: this is really where our big leverage is. We've talked about these 26 trials; a lot of those are end users. You need those end users and successful trials as proof of life, proof of concept, so that when you work with OEMs, they have confidence that there are customers that have validated that the product has compelling advantages, whether it's for law enforcement, homeland security or, sometimes, visual inspection, and we'll talk about Saffron in a moment. Quantum was our first OEM engagement. They continue to market BrainChip Studio to their customers. They have two primary markets, both of which are very large and both of which can benefit significantly from BrainChip Studio being incorporated into Quantum's core StorNext system.

The first is media and entertainment, and this is everything from high-quality production video and movies to a lot of sporting activity, whether it's the NBA, NASCAR, soccer or, as you would call it here, football. They're really doing a great deal of activity in media and entertainment, and they're becoming a major influence in the surveillance market. They're out with us, in fact, jointly visiting customers that are the major production companies in media and entertainment. When I say major: when you watch a movie and at the beginning it comes up and tells you who the production studio is, or what network it's on, those are the names that you should think about. In surveillance, with the joint sales effort, I think the nearest-term opportunity is a smart city project in Asia that will be a proof of life, a first deal for the Quantum sales force to build success upon success. So the engagement with Quantum has gone quite well, and we expect some results in the near term.

I'm trying to touch on a lot of the questions about Quantum, and questions about Veritone and Veritone's integration of BrainChip Studio. We've talked about this for a couple of quarters; there was what's called a Docker container built in order to incorporate or integrate into their system. This is their aiWARE platform, a cloud-based video storage product. It's got video analytic tools; it's like a supermarket. Users can upload their video to the cloud, and they have a menu of video analytic tools that they can select. It's a bit of a different business model. In the case of something like Quantum, we would sell licenses and get a maintenance fee. In the case of Veritone, we would get dollars or pennies per hour of video searched. Whether that's hundreds of thousands of hours, or millions, or tens of millions of hours really depends on the use cases and their customer base. But we would get paid based on how much of the video used our particular video analytic tool.
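The usage-based model described above, payment per hour of video searched rather than per licence, can be sketched with made-up numbers. The per-hour rate and the hour counts below are hypothetical illustrations, not disclosed commercial terms:

```python
# Sketch of a usage-based revenue model: paid per hour of video
# searched, rather than a per-licence fee. The rate and hour counts
# are hypothetical illustrations, not disclosed commercial terms.
def usage_revenue(hours_searched: float, rate_per_hour: float) -> float:
    """Revenue scales linearly with hours of video actually searched."""
    return hours_searched * rate_per_hour

# e.g. a hypothetical $0.05/hour rate across different adoption levels
for hours in (100_000, 1_000_000, 10_000_000):
    print(f"{hours:>10,} hours -> ${usage_revenue(hours, 0.05):>12,.2f}")
```

The point of the sketch is the contrast with the licence model: revenue tracks usage volume, so the range of outcomes depends entirely on how many hours of customer video run through the tool.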

Veritone is quite excited about this, and I expect that there'll be some rollout in the near future of BrainChip Studio having been integrated with the aiWARE platform. And then of course we'll report as much as we can, on a regular basis, on how that is going with their end customers. Saffron, just a quick update, and I know this seems like a bit of a paradox. We've been selling licenses to Saffron in small volume, and they've integrated BrainChip Studio into their manufacturing inspection, or visual inspection, system. In part because they have tight government ties, it was necessary for them, as they move into larger-scale deployment, to put a tender out. That tender, frankly, was in great part designed around the specifics of our solution. So it was a process that they had to go through, and it appears that that tender has been completed.

Should we be selected, which has a high probability given the specification that was written as well as the trials that have been done (more than trials, actually; it is in use in their inspection flow), we would generate revenue from the licenses that we sell. Again, that would be licenses with an ongoing maintenance fee. We'll give you an update as things move forward with Saffron, but it's a nice opportunity for us to get beyond law enforcement, anti-terrorism and homeland security and into the industrial sector, with object recognition and pattern recognition in their vision inspection system. Video management system engagements: a VMS is where all the wires come back from all the cameras. You can imagine thousands of cameras in a downtown or city environment; they come back to some central location, and the first thing that they hit is a VMS, something that can manage them.

If I want to look at camera number one, or number 300, or number 5,000, all of that video comes through a VMS. There are three, four or five large providers in the world. We do have an agreement that's moving through a process with one of the large suppliers on a global basis, and while that agreement hasn't yet been signed, they're currently marketing BrainChip Studio in conjunction with their VMS for video analytics, particularly in surveillance systems, primarily at this point in North America. I think we can talk publicly in the very near term about the name of the company; if they're out marketing BrainChip Studio, it's in the public domain, and so we're pushing to have a bit more ability to move outside of the non-disclosure agreement. We also entered into a partnership agreement with Milestone. Milestone is really one of the leading, if not the leading, global VMS providers.

That partnership agreement really enables the integration of BrainChip Studio into Milestone's VMS system. That's a major step forward for us, again with a leader on a global basis in VMS. And just to put this in context, there are something well over 500 million surveillance cameras in the civil surveillance domain. This isn't consumer surveillance in your home; 500-plus million cameras, and they all go through some VMS before they get stored. We expect we will engage with the storage manufacturers, the storage suppliers, and the VMS providers. It keeps us from having to, again, build a sales force to call on every surveillance system installer in Europe, North America, Asia and Australia. We'll have a handful of customers that provide the VMS and have integrated BrainChip Studio, as well as potentially the storage suppliers, where the video goes from the VMS into storage.

So, two great points of leverage for us, both with respect to BrainChip Studio and the Accelerator, for large-scale deployments through OEMs. Rather than, again, trying to boil the ocean and go to every end user, we use OEMs, both at the VMS level and the storage level, in order to gain that leverage. End user engagements, again, this is where we have proof of life. When we talk to the OEMs we can show them the 26 trials that are going on and the successful trials that have been completed, and really garner their attention. The sales cycle is longer than we expected; as I've said before, no one is probably more disappointed about cash receipts in the last few quarters than I am. But these end user trials are coming to fruition. These trials have gone quite well, a bit longer than we would've expected, and then the deployment goes through the budgeting process and a whole bunch of red tape before things can get deployed. But there's nothing about the trials that are going on that undermines our confidence in BrainChip Studio and Accelerator.

Back in 2017 we were very France-centric, primarily because the engagements came on the back of the acquisition of Spikenet, which is located in Toulouse, and that team worked France very hard. With our current team focused on end user engagements, we're now in Denmark, Spain, France and the UK, as well as with many law enforcement agencies in North America, including California, New York and New England. And this is a very summarised shortlist of the engagements that are going on within those 26 trials, as well as many active opportunities that could turn into trials very quickly. I think everyone here knows that, with respect to BrainChip Studio in the surveillance market, it's really about suspects of interest, homeland security and anti-terrorism. Unfortunately it also includes investigation of child exploitation; those are really unfortunate circumstances that we're trying to address, and we would like to be successful there.

So, one big leg of BrainChip Studio is surveillance, and as we discussed with respect to Saffron, its pattern and object recognition capability applies to visual inspection for manufacturing. We'll talk about Akida in a few minutes, which really takes that visual capability to the next level and into an even wider array of arenas. So, here we are on Akida. Now we're moving on from BrainChip Studio, which was really an offshoot of the acquisition of Spikenet. When we acquired Spikenet, the primary goal was to get a team which really had the competency for video analytics and vision systems, and to get the intellectual property; we currently have, I think, 14 people there, and I'll talk about that in a moment. We're exploiting that acquisition by introducing BrainChip Studio, which is a software product; of course the Accelerator is a hardware product that accelerates the software.

But the primary or core vision of BrainChip (ASX:BRN) has always been to build an integrated circuit, a chip, which really focuses on the compelling advantages that a spiking neural network can bring to artificial intelligence. That is Akida. Previously we talked about SNAP64, which was really a test chip that demonstrated that what Peter had in his head, put on paper and then reduced to practice in an FPGA actually worked. You take that, you combine it with the learning, particularly for video analytics, that we acquired through the Spikenet acquisition, and that is where Akida is going. We'll talk about the development schedule in a few minutes, but Akida is, as we've talked about before, what we call a neuromorphic system-on-chip. It is basically a system on a chip in digital logic that mirrors how the brain processes information. What's unique about the definition of Akida is that we can do both inference, which you can use as a synonym for deep learning, as well as autonomous learning, which was really what was demonstrated in SNAP64.

We can do both inference and autonomous learning in a single low-power, low-latency and low-cost IC. What that allows us to do is bring artificial intelligence from the data centre or the cloud to the edge. The edge is where the transducer is: where's the camera, where's the ultrasound, or the lidar or radar in an automotive application? Where are the smart transducers that are measuring pressure, temperature and flow in the industrial environment, on a Caterpillar tractor or some other device that's in the field but has smart transducers? Bringing intelligence to the edge allows you to make decisions at the edge, or at least produce actionable data at the edge, rather than taking all the camera data and all the transducer data, sending it back to a data centre or up to the cloud, and having to process that data, with the latency, the hogging of the bus and the CPU or GPU processing power that entails. We move that to the edge, whether it's in a surveillance camera or in a smart transducer.
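As a conceptual aside, the "spiking" in a spiking neural network can be illustrated with a textbook leaky integrate-and-fire neuron. This is a generic teaching model, not Akida's actual architecture (which is not public): the neuron accumulates weighted input, leaks charge over time, and emits a discrete event only when a threshold is crossed, which is where the low-power, event-driven argument comes from.

```python
# Textbook leaky integrate-and-fire (LIF) neuron: a conceptual sketch
# of spiking computation, NOT a model of Akida's non-public design.
# The neuron integrates input with a leak and emits a discrete spike
# only when its membrane potential crosses a threshold.
def lif_run(inputs, leak=0.9, threshold=1.0):
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:
            spikes.append(1)   # fire a discrete event (a "spike")
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # no event: nothing downstream to process
    return spikes

# Sparse output: events occur only when enough input has accumulated.
print(lif_run([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
```

Because downstream work happens only on the sparse spike events rather than on every continuous activation, this style of computation lends itself to the low-power edge use cases described above.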

Our primary targets for Akida, and we're already engaged in significant dialogue, are primarily in vision applications. That's what really dominates AI at the edge at this point, although I do think the internet of things and smart transducers are a close second. Advanced driver assistance systems, what's called ADAS: autonomous vehicles would be nice, but autonomous vehicles have a long latency before you'd see meaningful production volumes. ADAS, however, is here, and ADAS comes in levels one through five. Levels one, two and three are fully deployed; almost any car you buy today will have some backup camera, and it will have something in the bumper, whether it's radar, primarily radar, or eventually lidar. We have the ability to provide a solution that goes to the edge for ADAS, and goes to the edge for vision-guided robotics. Robotics is an interesting and very large opportunity, and this is not robots that look like people providing services to you.

I recently saw a good video of robots that basically look like they are the size of a shoebox, running around manufacturing floors and distributing product in warehouses, thousands and thousands of these things on a warehouse floor. Some are big enough and powerful enough, and have the hydraulics, that you can actually move very large pallets with hundreds of kilograms of weight on them. So, don't think of robotics the way I've previously misled myself to think about it, as robots that look like people doing household tasks. These are really industrial vision-guided robotics. Similarly with drones, it's not the consumer drone; these are things being used by industry for inspection of infrastructure and for search and rescue operations. And certainly surveillance cameras: moving AI to the edge, into the camera, rather than taking all the data off the camera and either going back to a data centre or up to the cloud to make decisions. We can analyse the video at the camera and send back what we call metadata, send back only actionable data, to wherever the host processing element is.

Smart transducers are also a very large opportunity, where you are measuring pressure, temperature and flow. Smart transducers are really starting to proliferate in many, many end markets. In some respects you could call it IoT, but I think the layman thinks of IoT as consumer applications; these are smart transducers which do use the internet to move data around. Again, moving AI to the edge, into the transducer, alleviates a whole bunch of problems with latency and clogging a bus, and brings actionable decisions back to the central processing arena. I'm going to try something here, because this is a request that has come in many times. As we look at Akida, we're going to focus here on vision. There are industry benchmarks of performance: MNIST, which we've done, and done in the native spiking domain, and this one happens to be what's called CIFAR-10. CIFAR-10 has 60,000 images in total: 50,000 that you train the network on and 10,000 test images that you run against 10 classes.

So, if you look at the image to the left, it's a very grainy, very low-resolution image, and that's not because this is a poor presentation; that is the image that's generated for the test. Looking at that image, I'll tell you, I would not guess that's a dog. But Akida, using the 10 classes, asks: is it a plane? Is it a car? Is it a bird, a cat, a deer, a dog, a frog, a horse, a ship or a truck? Based on how the neurons in the network fire in reaction to this image, Akida is able to determine that this is a dog. So I'm going to let this thing play, and I'll stop it a few times. There is a link at the bottom of the page of the PDF that will be loaded on our website. In a PDF you can't get a live video, but there will be a link so you can see this on our website. But if I can get this to work, you can see these images.

I mean, looking at this image in particular, I don't know that I would guess it's a frog. Akida accurately determines that that is a frog. We move forward a bit more. Okay, here is the truck. And again, this is one of 10,000 test images being run against 10 classes. So I'm just going to let the video run, and I can tell you that at the end of this small demonstration, we're going to come up with something better than 90 per cent accuracy with Akida. Myself, looking at these images: one, I can't keep up with them, and second, I would probably come up with 30 per cent or 40 per cent accuracy trying to determine that's a plane, that's a dog, that's a cat, that's a frog. This is one of many industry-standard benchmarks; beyond CIFAR-10 there's ImageNet, there's ResNet, there's Mobile Faces. So what we're doing now with Akida is looking across the benchmarks and determining what our accuracy is.

As you can see, this is moving along; well over 200 images have been analysed now and we're seeing 92-plus per cent accuracy on very low-resolution images. This is an example of how this would be introduced into a driver assistance system. It could be looking for a person, a child, a ball, whatever it might be in front of the car; it could be looking at street signs, whatever the data set is. Now we're up to 350, 380, and we're still clocking along with the ability to recognise what those images are at 92-plus per cent accuracy. I'm going to stop it here, because I don't want to bore everybody with this, but the full video will be put up on our website at the link that you see below. So, let's talk about Akida and early customer engagements; Akida is in development.
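To make the benchmark concrete: the score quoted above is just top-1 accuracy over the ten CIFAR-10 classes listed earlier. A generic sketch of how any classifier is scored on such a benchmark (the predictions below are made-up stand-ins, not Akida's output):

```python
# Generic sketch of CIFAR-10 style scoring: top-1 accuracy against the
# ten classes named on the call. The "predictions" are made-up stand-ins,
# not Akida output; real evaluation runs over all 10,000 test images.
CLASSES = ["plane", "car", "bird", "cat", "deer",
           "dog", "frog", "horse", "ship", "truck"]

def accuracy(predicted, actual):
    """Fraction of images whose top predicted class matches the label."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Toy example: 9 of 10 labels correct -> 90% top-1 accuracy.
truth = ["dog", "frog", "truck", "ship", "cat",
         "deer", "horse", "plane", "car", "bird"]
guess = ["dog", "frog", "truck", "ship", "cat",
         "deer", "horse", "plane", "car", "cat"]
print(f"Top-1 accuracy: {accuracy(guess, truth):.0%}")
```

The 92-plus per cent figure quoted in the demonstration is this same calculation carried out over the running stream of test images.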

When we put out the architectural announcement and the development environment announcement, we had a raft of very interested, high-value customers come in and say, "Get us the development environment; we can see the value of moving artificial intelligence to the edge." I don't cover a lot about competition in this particular deck, but there is not a lot of competition here. There is a lot of money being invested in AI hardware, in integrated circuits; look at companies like Groq, look at companies like Graphcore and several others, even Intel and IBM, but they're all focused on accelerating AI at the data centre or in the cloud. We have a leading solution in our mind, and probably a significant time advantage, because the spiking neural network is low power. You have to be low power at the edge: you're not going to be plugged into a wall, and you're not going to have cooling, whether it's air cooling or water cooling. So the Akida instance at the edge, at this point, we believe is a leading solution, with little competition that's really focused here.

But in large volume, power constrained applications, where our advantages really lie, I think we're really well received. Vision sensor manufacturers, again, are a way for us to get leverage. When you think about all the cameras that are out there, whether it's in ADAS or autonomous vehicles or driver assisted vehicles, vision guided robotics, drones, the vision sensor manufacturers are very limited; there's a couple that really dominate the arena. Our engagement and dialogue with them, to incorporate Akida into the module where they package their sensor with an image signal processor and now with artificial intelligence, would allow us to cover the landscape of, let's stick with automotive, everybody from the larger European automobile manufacturers to the large North American automobile manufacturers, as well as maybe India and China, because the image sensor market is at this point really dominated by two major companies.

So an engagement with one or both of them would bring us into the first tier suppliers into the automotive business, and allow us to get to market more quickly than trying to work with every individual automobile manufacturer. The Akida development environment is, as we've talked about before, a fully simulated Akida device that runs in software. It has all the learning rules, it has all the neuron fabric, but running in software. It allows customers, potential customers, to completely simulate what they will see when Akida comes out as a piece of hardware. That gives us the ability to have waiting and ready customers when the chip comes out, rather than introducing the chip and then really starting the process. Having the development environment in the hands of meaningful and strategic customers at this point, as we move through the next few months and quarters, really allows us to gain traction before Akida hits the marketplace.

We've also got engagements, and I just mention this tangentially, with manufacturers of laptop computers and cell phones. They are kind of like edge devices in that they're not connected to the data centre other than through a local area network or a Wi-Fi connection. Being able to bring artificial intelligence, in the way of an SNN, which is low power, low size, low cost, into a cell phone or into a laptop would likely be an intellectual property sale, in that they're not going to put another chip in a cell phone; they would license our intellectual property. We give them the design, they put it into their own larger system on a chip. Our chip is, we call it a system on a chip and it is, but in a cell phone it would be a dedicated function. They're building a dedicated applications processor, which could be 10 times the real estate that our system on a chip is.

But that intellectual property sale would carry an immediate licence fee and an ongoing royalty. This again is one of those areas where, long before we get the chip out, we could be generating revenue on the Akida intellectual property. With the Akida development environment, we may have an opportunity to put a wrapper around it for specific use cases, whether it's cybersecurity or, you know, something in the IoT space, and be able to generate revenue off the development environment as a custom solution for a very specific use case, or as intellectual property for integration into a larger SoC. These are important transactions, again, because they could generate revenue from Akida in advance of actually releasing the neuromorphic system on a chip.

Just an update on the product development itself. We announced the architecture as well as the development environment in late 2018, as I mentioned, with lots of interest and lots of press coverage. I think in the September quarter update I actually included a last slide which had 16 or 17 of the premier articles that were written, but there's still continued coverage; if you go to the website, you can find a list of the recent publications. We've talked about the ADE, the development environment, which fully simulates Akida well in advance of product availability. That's out in the public domain now. We're selectively releasing it to customers and have got a lot of good feedback, including some enhancements that were recommended by customers. But we're being very, very selective, really trying to focus on those OEMs where we get a single point of leverage into large volume markets, well in advance, again, of when the chip comes out.

The chip is expected to be available in the second half of 2019. The typical flow of semiconductor development is we'll get our first silicon back, we'll have engineering samples that again will be selectively distributed, we'll get feedback, we'll do our own quality assurance, and then there'll be a full production release on the back of those engineering samples. A little update on human resources. Again, this ties back to looking to control expenses and cash; as I said, cash is king. Our headcount as of December 2018 was 36, and approximately 80 per cent of the workforce is engineering development and research. The design centre in Aliso Viejo, California has 16 people. The design centre in Toulouse, France has 14 people, again dominated by either research or engineering development. The balance of the workforce is either the executive staff or field sales and marketing.

So between Aliso Viejo and Toulouse you've got 30, and then we've got a few people in the field and a few people on the executive staff. We do have a serious priority on hiring a senior finance executive; I think most of you know our CFO left a couple of quarters ago. I think we are getting very close and I would hope to have an announcement on that very shortly. The senior financial executive is going to be responsible for financial planning, reporting, general accounting, treasury, tax, audit and some part of investor relations, which I'll touch on again in a moment. It's an important position. We're a public company and we have the overhead that goes with being in a public market, and it's a very important position for letting the rest of us get out and visit customers and deal with strategic relationships.

We also, as we put out in a note, I think maybe just before we did the January update or slightly thereafter, have a non-executive chairman search ongoing. We're interviewing candidates and talking to quite a few people. As we put out in the release, we're really focused on the non-executive chairman being Australia based, which really helps us in understanding the ebb and flow of how investors think, as well as ASX compliance. We hope to appoint the non-executive chairman, as well as a non-executive director, because we did appoint an interim director at the last AGM. We certainly would like to get that done not just before the AGM but before the notice of meeting goes out. Frankly I would like to see that happen immediately, but we're going through a process, and it is a process: vetting candidates, looking for the skill set that we think could provide the diversity of experience and the industry experience, and making sure that we really have the best and most robust skill set on our board.

A bit about our organisation and how we communicate. I think we've taken to heart input from investors and shareholders about how we communicate to the marketplace. The stock register is closely held in some respects, but there is the free float. I do think we can do a better job, and we're taking some action to make sure that is the case. We are looking at separating sales and marketing. Bob Beachley runs business development and marketing. That's a big job when you're introducing something like Akida: making sure you have the right product definition and that you're really targeting the right strategic customers. And it's a different mindset than closing an order. I use an expression which some might find distasteful, but salespeople have a mindset of "they eat what they kill". They book an order, they feel good about it. They don't book an order, they feel really bad about it and they start to starve. It's a different sort of DNA than marketing, which is a methodical process.

I'm talking about product marketing, not necessarily communications, and product marketing is a very detailed, very technical part of a semiconductor company. So we're looking at the separation of sales and marketing, and frankly we're also looking at the separation of what we call applied and advanced research. Peter runs the research team and Neil runs the development team, and Neil is constantly trying to get the solution to a problem out of the research team and moving it to a production environment. That's really applied research, where you're applying what you know from research to product development. In the best of worlds, Peter focuses his attention on what's over the horizon a bit: what is Akida 2.0 going to look like, and what are the requirements at the edge that we could solve with some of the innovations that, as he says, have been banging around in his mind for the last couple of years?

So, it's something we're entertaining. We are transitioning from an end user to an OEM and system integrator sales model. The end users were absolutely necessary for us to prove to ourselves, and to have proof of life for OEM partners, that the use cases were valuable. Again, the trials that we've either completed or have in process allow us now to step back and say, "Okay, we've got proof of life." We'll generate revenue, but move those relationships to our OEM partners, system integrators and value added resellers, so that we can have a very cost effective and highly leveraged sales organisation. We are at the point now with Akida where we are going to enhance our branding campaign. We've spent a lot of time with the engineering community; now it's time to take a step up as well and start to globally brand Akida in the more general marketplace, whether it's Forbes magazine, Fortune magazine or Wired magazine, and get that brand out there.

That's in addition to what we've done, I think very successfully, in the engineering community. There is a lot of internal focus, and I hope you all understand we take it to heart, on an improved communication strategy for investor relations. We take the time and effort to do these conference calls, which we do four or five times a year. We've done a couple of tech conferences in Australia as well as in the US in the last year. But I think we can do a better job here, and we're looking for resources that are more familiar with the ebb and flow in Australia, but also able to capitalise on investor relations in North America and Europe. So I just assure you that we're taking that to heart. We are also evaluating locations for what at this point we'll call an Akida innovation centre. That is someplace where customers can come in, where there can be workstations they can play with, rather than us sending people out to every customer location. It could be in Silicon Valley for an innovation centre, which is design oriented.

It could also be a research centre for Akida innovation, where we have some centralised group, again having potentially split applied research and advanced research, wherever the best location is for us to have a gene pool of people that can really bring Akida forward to the next level. So, in summary, and I'm trying to stay on time here: BrainChip Studio revenue opportunities continue to grow. I'm disappointed in cash receipts, but again, I don't look at that as a reflection on the product. It is nothing but a good product which will be successful, with lots of field trials moving through to design wins, a bit slower, maybe significantly slower, than I would have liked or expected. But nothing dissuades me from seeing BrainChip Studio as anything other than a successful product. We are allocating resources to support closure on current engagements, and we are focused on generating, call it, cash rather than revenue, because revenue has revenue recognition implications. Generating cash on an ongoing basis from those opportunities is our dominant theme right now.

We talked about Akida development moving through design. There is a question that I think looms about the FPGA, and I think it's worth taking a moment on it. The Akida Development Environment, which is a fully simulated Akida execution engine and all that's necessary around it, is really the pre-eminent way to look at what Akida can do in an end application. We will use an FPGA internally to validate the logic design of Akida. But an FPGA is a validation tool; it doesn't have enough gate count. I'll call it gate count in layman's terms, so to speak; FPGAs aren't really measured in gate count, but you can only do about one hundredth in an FPGA of what we'll do in final silicon for Akida. So internally, the FPGA absolutely is right here, right now, as we move through logic design. The best solution for customers to understand Akida and its full capacity really is the development environment. So the FPGA is not off the table, and it's not delayed in any way, but it is an internal tool for us to validate the logic design.

The Akida Development Environment is the most important thing for customers to get comfort, whether they're doing an SNN conversion with our help or whether they're trying to develop in a native SNN environment; they will get more exposure and more understanding through the development environment. Our internal resources, our design guys, need the FPGA to validate their logic design. So I hope that brings some clarity to the development flow: the development environment in the hands of customers, which we also use extensively internally for use cases like the one I demonstrated with CIFAR10, and the FPGA in hand in order to validate the logic design as we move through the design process. And then eventually, of course, we get silicon out, and we do expect to have silicon in the second half of 2019. Nothing's moved on that front.
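To give a flavour of what "SNN conversion" refers to, here is a toy sketch of rate coding, one common way an analog activation from a conventional network can be re-expressed as a spike train. This is illustrative only, not BrainChip's actual conversion method or any Akida API; it just shows why spiking is low power: a zero activation produces zero spikes, and therefore almost no switching activity.

```python
def rate_encode(activation, timesteps):
    """Spread an activation in [0, 1] over `timesteps` as a binary spike
    train using a simple accumulate-and-fire rule: each step adds the
    activation to an accumulator, and a spike fires when it crosses 1.0."""
    spikes, accumulator = [], 0.0
    for _ in range(timesteps):
        accumulator += activation
        if accumulator >= 1.0:      # threshold reached: emit a spike
            spikes.append(1)
            accumulator -= 1.0      # reset by subtraction, keep remainder
        else:
            spikes.append(0)        # quiet timestep, no energy spent
    return spikes

# A stronger activation fires more often; zero never fires at all.
print(rate_encode(0.75, 8))   # [0, 1, 1, 1, 0, 1, 1, 1] -> 6 spikes
print(rate_encode(0.0, 8))    # [0, 0, 0, 0, 0, 0, 0, 0] -> 0 spikes
```

The spike count over the window approximates the original activation value (6/8 = 0.75 here), which is the basic intuition behind converting a trained CNN into a spiking equivalent.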

So I think that's it from my end. I hope I covered most of the questions that were sent in, as well as brought everybody up to date. I know some of this seems redundant; we just did a call back in January, and we put the quarterly update out yesterday. I know most of you know how to get in touch with us, and if you have any questions that I haven't covered, please feel free to send them to the update address. I think many of you know that I respond to those directly myself. But let's call it a day and we'll talk to you in about a quarter or so. Thank you very much.