Sophie Ehrlich 00:07
Thank you. Good afternoon, everyone. On the first of August this year, six weeks ago, the EU AI Act came into force: the world's first comprehensive legal framework for artificial intelligence. So six weeks in, we're going to discuss with our panelists what this means for startups, what it means for investors, and what it means in healthcare. The AI Act, for those of you who are less familiar with it, covers every industry, including healthcare. It's a risk-based mechanism that classifies AI systems by risk level, and its main aim is to ensure that AI is safe for European citizens.
Sophie Ehrlich 01:01
In healthcare, companies in this space are really going to be classified as high risk under the AI Act. In terms of timing, that means we have three years to comply with the new regulation, so until August 2027 to become compliant.
Sophie Ehrlich 01:26
For companies that don't meet this deadline, the penalties and the fines are high.
Sophie Ehrlich 01:34
Now in terms of the AI Act, what does it actually mean for companies working in the space? Is this really a game changer? Is this an additional layer to the MDR that we have in place today? Is it an addition to GDPR? And the answer is yes, it's an additional layer to MDR and to GDPR. And for companies that have all of those in place, it's a little bit smooth sailing in terms of the requirements that are in place today.
Sophie Ehrlich 02:09
Before we dive in: my name is Sophie Ehrlich, and I work at EIT Health, a public-private organization that sits between the European Commission and the European healthcare ecosystem. At EIT Health, we support early-stage healthcare startups with Horizon Europe EU grant funding and lots of other initiatives as well. I'm really delighted to be joined today by our panelists. We have two companies from Spain, one from Germany, and one from Austria. Two of the companies have medical devices plus AI, and two have software only, so medical AI alone. Anjany, would you like to introduce yourself?
Anjany Sekuboyina 03:02
I'm Anjany, CEO of Bonescreen.
Anjany Sekuboyina 03:06
I'll try to give a bit of context about what we do, so that all of us are on the same page. We are trying to build a platform for screening bone health.
Anjany Sekuboyina 03:17
What we focus on especially is extracting information that you cannot see in a normal radiological image: CT, MR, whatnot. If there is a bone in the image, we try to extract biomarkers from it. What's interesting here is that we don't look at structural changes; we try to extract functional biomarkers from these images, and we use AI for this. That is what I want to bring to the panel: the AI involved is more complicated, and it's not easy to demonstrate that it is working, so I want to talk about how we have to show the regulators that the AI is doing its job. Thanks to EIT support, we are very close to CE certification; we'll have the audit at the end of December. So that's a fresh perspective as well. We started our journey roughly when the EU AI Act passed, so the regulatory bodies had some information about what they were supposed to do, and there were probably some differences compared to the other startups on the panel. Let's see how that aligns with the rest of the startups. Thank you. Andreu?
Andreu Climent 05:00
Thank you very much. Well, it's a pleasure to be here. Thank you, Sophie; thank you, EIT Health. I'm Andreu Climent. I am an electronic engineer, but I was almost born in a cath lab, in the electrophysiology units. Most of the people here are already close to the age of having atrial fibrillation or other arrhythmias. If you work in a cath lab, you discover that for half of the patients who go there, we put cables inside their heart and burn part of it, but we fail daily, and we do not give clinicians enough tools to see what they are doing. That's why, after time in Germany and the States, we developed a technology that can image the whole heart quickly, in about 10 minutes: that is Corify Care, the company that I set up. We recently got the CE mark. The great advantage that makes us unique is that we are the only ones who can show clinicians the heart in real time during interventions, easily and really fast. But how did we do that? How do we make this possible in 10 minutes? Basically, the device is great, but there is also a lot of software inside, a lot of engineers working on making something unique, and much of that uniqueness is based on artificial intelligence and on databases that we need to be sure are good enough for what we are trying to help with. That's one of the important things in the regulation, and I think we will discuss it.
Andreas Schriefl 05:58
Hello there. I'm Andreas, Founder and CEO of eMurmur. We built auscultation AI. For those of you who don't know what auscultation is, it's when a provider listens to your heart and lungs using a stethoscope. What they're trying to do is hear something abnormal that should not be there, be it a crackling lung or a murmur in the heart.
Andreas Schriefl 06:19
What we have done is create an AI that can analyze heart and lung sound recordings. We got our first FDA clearance in 2019, and CE marking as well under MDD. As of today, we hold six regulatory clearances, and number seven is on the way on the FDA side. So we are now MDR cleared and FDA cleared, and I look forward to discussing how the AI Act is going to impact the current filings we have and the future filings we are working on.
Jordina Arcal 06:51
Hello, everyone. My name is Jordina, and I'm the Deputy CEO at MJN Neuroserveis. We have developed the first available device in the world that is able to predict epileptic seizures. It's a portable EEG that continuously records the brain's electrical activity through the ear canal and sends this information via Bluetooth to a mobile phone, where an embedded artificial intelligence algorithm continuously assesses the risk of suffering a seizure. That lets us alert patients about one minute before a seizure, allowing them to avoid accidents. We can provide peace of mind and improve quality of life, not only for patients but also for caregivers, and at the same time we provide valuable information to doctors so they can better manage their patients. We have the CE mark under MDR and the UKCA, and we have a contract for distribution in 32 European countries with pharmaceutical companies. So we are currently selling in Europe and trying to jump to the US with the FDA. Yes, happy to talk about the new AI Act and the new regulation, as we are struggling with the FDA. So that would be nice.
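As a rough illustration of the continuous risk-assessment loop Jordina describes (EEG windows streamed over Bluetooth to a phone, a model scoring seizure risk, an alert when risk crosses a threshold), here is a minimal hypothetical sketch. The class names, the model interface, and the threshold are assumptions for illustration only, not MJN's actual implementation.

```python
# Hypothetical sketch of a continuous seizure-risk alerting loop.
# The model interface, threshold, and data source are illustrative only.
from dataclasses import dataclass
from typing import Iterable, Protocol

@dataclass
class EegWindow:
    samples: list[float]   # a few seconds of ear-canal EEG, already filtered
    timestamp: float       # seconds since the start of the recording

class RiskModel(Protocol):
    def predict_risk(self, window: EegWindow) -> float:
        """Return seizure risk in [0, 1] for one EEG window."""
        ...

ALERT_THRESHOLD = 0.8  # illustrative; a real device would validate this clinically

def monitor(windows: Iterable[EegWindow], model: RiskModel, notify) -> None:
    """Score each incoming EEG window and notify the patient when risk is high."""
    for window in windows:
        risk = model.predict_risk(window)
        if risk >= ALERT_THRESHOLD:
            notify(f"High seizure risk ({risk:.2f}) at t={window.timestamp:.0f}s")

# Tiny usage example with a dummy model that flags high-amplitude windows.
class DummyModel:
    def predict_risk(self, window: EegWindow) -> float:
        return min(1.0, max(abs(s) for s in window.samples) / 100.0)

monitor([EegWindow(samples=[5.0, 90.0], timestamp=12.0)], DummyModel(), notify=print)
```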
Sophie Ehrlich 08:13
Some of you in the audience may be thinking, you know, we're a company based in Boston; how does this apply to us? The AI Act applies to European companies and to any global company deploying AI in the European region. So it applies to everybody.
Sophie Ehrlich 08:30
In terms of the European Union's definition of AI, this is something I really recommend everybody take a look at, because what we might all call AI can be slightly different from what the European Union defines as AI. There may be companies here whose products do not fall under this new regulation. So that's certainly something to look into, and it's all publicly available online. In terms of what this means for you as companies: has this new regulation, which has been in effect for six weeks, changed anything that you're doing, any of the work you're working on? Over the past six weeks, or past few months, have you done anything? Are you doing anything differently in your companies? Andreas, maybe we'll start with you.
Andreas Schriefl 09:13
We started researching it, partly driven by the deadlines that are now coming up.
Andreas Schriefl 09:20
As a medical device company that has CE marking, the way the system works is that the AI Act is built on risk classifications, and automatically, if you are class IIa or IIb or higher in Europe, you are classified as a high-risk device under the AI Act, meaning it applies to you.
Andreas Schriefl 09:44
We looked at this broadly in terms of what that means, in terms of what we have to do, and when we have to do it by. So in terms of timeline, we have three years as medical device companies for this to be implemented.
Andreas Schriefl 10:00
Most of what's required is already part of being regulated under MDR, except it's not harmonized yet, meaning that our quality management system is built for MDR. We still have to show that we are compliant with the AI Act on top of it, and there's a set of rules that are not part of MDR that we'll have to add to what we're doing today. But overall, I'd say that as a medical device company with clearance we're in a pretty good spot, because a lot of the big work, setting up the quality management system and the whole risk management, is already in place, versus other companies that might be affected and don't have that yet today.
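To make the classification rule Andreas describes concrete (an AI-enabled medical device at MDR class IIa or above lands in the AI Act's high-risk category), here is a deliberately simplified, hypothetical helper. The enum and the rule are a sketch of that one statement, not legal advice or a complete reading of the Act.

```python
# Hypothetical, simplified illustration of the mapping Andreas describes:
# AI-enabled devices that need notified-body assessment under MDR
# (class IIa and up) fall into the AI Act's high-risk category. Not legal advice.
from enum import IntEnum

class MdrClass(IntEnum):
    CLASS_I = 1
    CLASS_IIA = 2
    CLASS_IIB = 3
    CLASS_III = 4

def is_high_risk_under_ai_act(mdr_class: MdrClass, uses_ai: bool) -> bool:
    """True if an AI-enabled medical device would land in the high-risk bucket."""
    return uses_ai and mdr_class >= MdrClass.CLASS_IIA

# Example: an AI-based class IIa diagnostic tool counts as high risk.
assert is_high_risk_under_ai_act(MdrClass.CLASS_IIA, uses_ai=True)
```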
Sophie Ehrlich 10:49
Okay, super. Jordina, tell us a little bit about what you've been doing at MJN.
Jordina Arcal 10:53
So when it came out, I guess like every company, we said, "Oh no, another regulation that we have to go through." So the first thing we did was try to understand what it is about, and we talked with our regulatory experts to understand whether we fit in or not. Are we AI as the European Commission understands AI? Yes, we are. Then we said, "Okay, now let's go." So what do we need? There are lots of tools available that you can use to understand what work you need to do to be compliant with the AI Act. What we found out is what Andreas said: if you have a medical device under MDR, you are something like 93% compliant with the AI Act. This doesn't mean there is no work to do; it means we have a lot in place, but we still have to work to become compliant. So what we have done so far is understand it, get to know it a little. As you said, there are no experts on this yet, so we just tried to understand it, understand what we should do, and understand how much time we have to do it. As we say, start the engines, be prepared, and be one of the first, because we foresee that this will be a huge thing and regulatory bodies will be overwhelmed, so we have to be first in line; otherwise, it will take a long time. So that's what we did. So, ahead of the game.
Anjany Sekuboyina 12:21
I think we were lucky for two reasons.
Anjany Sekuboyina 12:25
We have been hearing about the Act for almost a year now, right? Or probably more than that. I think we started our regulatory journey right about when we started hearing about these things, so we started preparing for it. The second point is that we are based in Germany, and we love regulations. There is a working group of all the notified bodies in Germany that came up with this very long list, what they call the AI guideline checklist. It covers different parts of different regulations: it brings in the MDR, IEC 62304, risk management, and all that. It adds a lot of questions in the form of a checklist: do you have this? Do you have that? And so on. It took us almost, I don't know, a week to fill in this checklist, just saying, "Okay, this is here. This is here." Because we worked with this checklist, I think we were already pretty prepared, so I'd imagine we still have some work to do, but definitely not as much.
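A small hypothetical sketch of how a team might track an IG-NB-style questionnaire internally: each requirement gets a status and a pointer to evidence, so you can see at a glance how much of the checklist is already covered by existing MDR and IEC 62304 documentation. The item texts, statuses, and document references are made up for illustration; they are not the actual checklist content.

```python
# Hypothetical internal tracker for an IG-NB-style AI questionnaire.
# Item texts, statuses, and evidence references are illustrative only.
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    question: str
    status: str          # "covered", "gap", or "n/a"
    evidence: str = ""   # e.g. a reference to a QMS document

items = [
    ChecklistItem("Is the training data documented and versioned?", "covered", "QMS-DOC-012"),
    ChecklistItem("Is model performance monitored post-market?", "gap"),
    ChecklistItem("Is there a risk analysis for AI-specific failure modes?", "covered", "RM-FILE-03"),
]

covered = sum(item.status == "covered" for item in items)
applicable = sum(item.status != "n/a" for item in items)
print(f"Checklist readiness: {covered}/{applicable} applicable items covered")
```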
Sophie Ehrlich 14:12
And for anybody here, this AI checklist is actually publicly available. It's from IG-NB, the German notified body association; look for the latest version. Even if you're not a German company, it's still quite a helpful checklist for seeing whether your AI fits these criteria or not. If you do seem to align with most of it, that should be pretty positive, but it may also highlight areas where more work needs to be done. It's certainly a good source to take a look at, and I'm happy to send you the link if anybody wants to reach out to me afterward.
Sophie Ehrlich 14:47
So it sounds as though it's not a huge change; it's an additional layer, I guess, the strawberries on the cheesecake. It sounds as though you very quickly looked into this and worked out what you need to look for and what is more or less relevant. Because this is a new act, there is no single expert yet who can guide a company through everything that's needed; it's really a group effort. Law firms have tremendous knowledge; notified bodies have tremendous knowledge. At EIT Health, we can of course provide additional support, but it's also about the ecosystem, companies speaking to other companies to try to understand what this new regulation means. Especially in the next 12 months, it's going to be the power of the ecosystem that helps us all navigate together what this actually means, because it is new, and as we all understand in innovation, being first is exciting, but it also has challenges, and nobody has really been there or done that before. I think there are many more questions than answers today, and it's a matter of understanding this over the months to come. One thing to highlight in terms of notified bodies: they can apply from November this year, so it's going to take time before notified bodies can actually support companies with the approvals. In addition, each European member state has 12 months to set up its national governing bodies to be able to enforce this regulation. So we're really talking about months, especially in this coming year, before the setup becomes more established.
Sophie Ehrlich 16:32
If we think about fundraising: you're all fundraising today. Have investors asked you about this in their diligence? Has this topic come up yet? It is new, but just out of interest, is this something you're being asked about? Maybe, Anjany, I'll start with you.
Anjany Sekuboyina 16:59
We are looking for a seed round, around one or two million, and we have not been asked anything specific. I think until now they are just more interested in CE and MDR. So yeah, probably it will change the more investors we speak to, or at least as time passes, because it has just been six weeks, right? So, yeah, it's early. We'll have to see.
Andreu Climent 17:38
In our case, we recently closed a Series A, and it wasn't one of the important questions. We are a technology-driven company, so our algorithms are one of our important assets, although we are a medical device with a disposable. They had some questions, but mainly during due diligence: they went deep into cybersecurity, into how the software works under MDR, and into the checklist around AI, and more or less they wanted to know, "Okay, how do you plan to be compliant with this new regulation, and how much is it going to cost?" Those two questions came up, and we tried to answer them.
Andreas Schriefl 18:11
Okay.
Jordina Arcal 18:14
We are currently doing a 2.5 million Series A round, and I also have not been asked about it by investors. But I do make sure to bring it up, because in our case I think it's advantageous for two main reasons. Since we already have the CE mark under MDR, the investment needed to build a system like this from the ground up has already happened; the ongoing addition, the extra audit, is a bit more expensive, but that's part of the regulatory budget anyway. The other reason is that our cleared AI models have already been trained, validated, and tested, and the requirements around that part of the framework in the AI Act are quite different than under MDR. If I had to start today with the AI Act in place, the investment needed to be compliant would be much higher than it was for us at the time.
Anjany Sekuboyina 19:16
Yeah, I absolutely agree with you. I mean, we are now raising a 4 million round, and I only had one investor ask about it, but I have to say it was in a meeting where they were talking about the AI Act, so it made sense that this person asked about it. With the other investors I've been talking to, I'm making sure to make them aware: hey, we know about this, we are under MDR, and so on. But I think it's normal that they are not asking about it yet, because it's super new. I mean, it came into force six weeks ago, and as you said, no one is an expert, and it will take the coming year for us to understand it. So it's a topic that, when we close a round and go through due diligence, I will make sure everything is ready for this new AI Act. As you said, we are one step ahead because we have the CE mark under MDR, and this is, as you said, the strawberries on the cheesecake. It's another layer that will have to be put in place, but it won't be as hard as if we were starting from scratch.
Sophie Ehrlich 20:27
Interesting, interesting. So in terms of fundraising, I understand it's not yet part of the full due diligence. It will be; it should be. Regulatory aspects are certainly something investors dive into as part of diligence, and this will be a part of it. It's certainly a good indication for investors to start asking about it, to understand whether the companies they're meeting with know where they stand with this or are starting to look into it themselves. In terms of costs, as we all know, another regulatory approval is going to take your time and your team's time, and there might be additional costs for working with external advisors.
Sophie Ehrlich 21:18
How are you thinking about these costs? Are you fundraising a little more than you originally planned to compensate for this? Just to give you a little insight into the estimated cost: the European Commission has estimated that the cost for an SME to become compliant with the AI Act, and this isn't healthcare specific, it's AI in general, may be up to 400,000 euros. Now, that assumes they don't have a quality management system in place, that they're starting from day one, and that they don't have other approvals with some elements of this already covered. So it's a high number, but that's the figure from an impact assessment of the costs of becoming compliant. At the end of the day, we'll all know how much this actually cost three years from now, calculating backwards to see how much was really spent on it. But what are your thoughts on this as you look at your cash runway and your fundraising? Share that with me.
Anjany Sekuboyina 22:30
I think audits are expensive, and the more regulations we have, the more money we have to pay. That's my view on it. On top of this, as Jordina and Andreas were saying, we are not starting from scratch. This is budget we already put in; we already accounted for it. Because we are already doing this, the extra cost is incremental. The more experienced companies probably have a different perspective, but we have already accounted for this. And look at it as an investment, let's put it that way. We are making an effort to become compliant with a regulation that is trying to make patients safer, and that makes our companies a little more prepared for what is coming. Everyone will have to pay for it sooner or later, so the sooner we do it, the easier it will probably be, because we will learn; we will make the path. The first companies that had to fight with it found it difficult, but later they were the ones that succeeded. I don't think it's going to be that much more expensive, and it's worth it: it's an additional layer, not something brand new.
Andreu Climent 24:18
Exactly. I don't think it's something that fully changes the picture. We are software companies, and the latest software regulation already made it much more complex to explain all of the software to the notified bodies; we already had to do that. This is another layer. Now you have to explain the databases, now you have to explain how you trained the model, now you have all these kinds of things. It makes sense. To some extent, it will take money, of course.
Andreas Schriefl 24:45
I agree. On the cost question, if I step back from what we do at my company, it depends on your product, and it's important to understand what qualifies as AI. I was surprised to learn it also covers logic-based systems: if your product uses some decision tree as part of the software, it can qualify as AI. For us, it doesn't make a difference, because we do machine learning anyway. But if I had a product that used some decision tree, it would not have been regulated under MDR the way ours is, and now it would have to adhere to the AI Act as well, which would be, I think, much more expensive than the number quoted.
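To illustrate Andreas's point that even simple rule-based logic can fall under the Act's broad notion of an AI system, here is a hypothetical toy decision tree with no machine learning in it. The rules are invented for illustration and are not clinical guidance; whether any particular product actually qualifies under the Act is a legal question that a sketch like this cannot settle.

```python
# Hypothetical toy example of purely rule-based logic (a tiny decision tree).
# Andreas's point: systems like this may still count as "AI systems" under the
# AI Act's broad definition, even though no machine learning is involved.
# The rules below are illustrative only, not clinical guidance.
def triage_murmur(age_years: int, murmur_intensity_grade: int) -> str:
    """Return a follow-up recommendation from two hand-written rules."""
    if murmur_intensity_grade >= 3:
        return "refer to cardiology"
    if age_years < 1:
        return "repeat auscultation at next visit"
    return "no follow-up needed"

print(triage_murmur(age_years=0, murmur_intensity_grade=2))
```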
Anjany Sekuboyina 25:00
So for us, on a regulatory basis, the increase is going to be there, but it's not going to be too significant. Of course, if you look back at what the audit cost under MDD, then about three times that under MDR, and now the AI Act on top next year, it does add up over time, that's for sure, especially for smaller companies. It's what Anjany said: more audits, more money; more regulatory work, more money we have to spend. So we are always thinking about that. As a medical device company, we are very used to spending a lot of money on regulatory. This is what we've chosen; this is the environment we work in, and we are very used to it. As I said, we were scared at the beginning, and then we understood it's another layer in the cake. Compare it with the UKCA: first, with the MDR, it was one audit, one day. Now we have a day and a half, because we have the audit for the MDR and the audit for the UKCA. Okay, now it will maybe be two and a half days. We know we have to work; we know we have to save money for that. It will be more money. As you said, with the MDD it was this much, with the MDR it's this much, with the UKCA it's this much, and now with the AI Act it's this much.
Andreas Schriefl 26:15
But on the other hand, it's a stamp of quality, and it's a competitive edge. We can see it as a value for our product, a value for our services.
Sophie Ehrlich 26:30
It's a double-edged sword, let's say: it's good in one way, but it increases costs. That said, I'm not super scared of this 400k. For us, and I speak as a company with a class II medical device, it's not going to be 400k; it cannot be. I'm sure that companies that only have software will spend more than 400k, because if I look at the cost of getting the MDR, it was much more than 400k, so if this is the equivalent, they'll spend a lot of money on it. To be clear, the estimated cost is not for healthcare but for AI compliance in general for SMEs, so it's a kind of average number. But I guess, from what I'm hearing, this number doesn't really resonate, and you think it's going to be a lot lower, given where you're at and what you've done so far on this path. But Jordina, you were mentioning that you see this as a positive, giving you a competitive edge. How do you all see it, let's say three years from now, once you've got the AI approval? How do you see this as you're scaling your companies?
Jordina Arcal 27:49
For me, it has benefits. As I said, it's a competitive edge, another barrier for competitors, which for us is good. It's also a mark of quality, so it will make professionals feel safer. Now, when you go to a physician, they ask you, "Do you have the CE mark?" Yes? Okay, so I trust your technology. They will ask, "Do you have the AI Act approval?" Yes? So now I trust your technology; it's a quality stamp. I think it will help us enter new markets. We are going through the FDA; it will help us with the FDA, and when we go to China, it will help there too. In three years, or let me say in five years, it will be a good thing for the companies that went through this process.
Andreu Climent 28:34
I agree. I think it adds a moat, another layer around the product: regulatory is one, the Act is another. Some of the details are still open. Harmonization is supposed to come next year, around May, I think; don't quote me on that, but I think around May is when the first harmonized standards are going to come out, which will be important for understanding the impact. As a company with three CE marks under the hood, and with the AI Act now being law, from a competitive-advantage standpoint it is significant, because it's one thing to say I can create an algorithm that can analyze heart and lung sounds, and here is my sensitivity and specificity. It's another thing to be able to file this and show, going back to the beginning, that you have adhered to all the regulations, how your data was collected, and so on. So I think it's a positive once you're at the point where we are.
Andreas Schriefl 29:42
When you think about regulation, it's always difficult to say, okay, this is positive; your first reaction is more likely to be that this is going to be complex, and it's scary. But at the same time, when you think about artificial intelligence, we are just at the beginning, and all of us are a little bit scared. We need this kind of regulation. The European Union has made a real innovation with this regulation. Regulations, in many respects, are drivers of innovation, because they put up the walls but they also put in the doors, and we will learn before other regions of the world where the doors and the windows are, how to take our companies and our devices through these kinds of regulations, and how to succeed at that. Of course, it will be a little harder at the beginning, but at the same time it will probably train us to be competitive internationally, and that's how I want to think about it.
Anjany Sekuboyina 30:46
Touching on what the others said, the moat is a very important point. For me, it's also about perception. You will always have people, in our case doctors, and there will be a group of doctors who will not like AI. The hope is that regulations like this will bring them into the fold: "Okay, this is regulated. The EU has looked at this. There's a stamp of quality. Now I'll trust this a bit more than I would have earlier." That is the hope, and also that this will generalize to other countries. We already see from our initial investigations that there are some similarities between the FDA's machine-learning filings and the AI Act's requirements. The hope is that they become much more harmonized, so that the work we have to do afterward is mostly replication.
Sophie Ehrlich 31:45
Just a step towards a better future as well.
Sophie Ehrlich 31:50
I think the potential of AI to transform healthcare is huge. As you mentioned, the trust of hospitals, clinicians, and patients alike, and the safety aspect: becoming compliant and getting this new AI approval will help mitigate some of that concern and hopefully help more AI healthcare products be deployed, both in the European region and globally once other regions launch their own AI regulations, which we're already seeing; other regions are working on this, and it's all coming. So this is the first, which means you're the brave ones working to get the approval here and to figure this out together, with our support, notified bodies, law firms, and the power of the ecosystem.
Sophie Ehrlich 32:56
Hopefully, it will open doors in lots of other regions as you scale afterward. But in summary, for the other innovators in the audience, what would you suggest, from your initial experience so far, that companies do if they haven't looked at this yet?
Anjany Sekuboyina 33:18
I would say, don't procrastinate; start working. I mean, it's a complex thing, and I understand it can be scary, but you have to understand it. I would suggest everyone first just sit down with it. For example, there are great free workshops on the AI Act that explain what it is. So first understand it, then understand whether your company falls under it, and then understand what needs to be done. Not waiting out these three years would, I think, help companies a lot.
Andreas Schriefl 33:56
Yeah, I would say the same. I see how the FDA handles this and how Europe handles this, and I find some of the things the FDA does very forward-thinking, and I'm hoping that will apply here as well. With AI, there's this concept of locked algorithms, right? The way it used to be a few years back, when we got our first clearance, was that we had one locked model to find a pathologic murmur in a patient. If we made an improvement, we weren't allowed to just release it; it's a new model, so you had to go back and file another 510(k). But now they have introduced the PCCP, a predetermined change control plan, where you can tell the FDA, "Here is my plan for how I'm going to validate internally that my improvement is safe." Once they approve that PCCP, you can release improvements without a new filing each time you improve the algorithm. That doesn't exist on the European side yet, and that's where I'm hoping Europe can learn a few things from the FDA; it seems to be a bit more industry-driven there versus a top-down approach here. The last thing I want to add, which we haven't discussed today, is cybersecurity, another consideration, maybe especially for young companies. On the FDA side, the new cybersecurity requirements are significant. We are going through our third FDA filing now, and the work required to be compliant was massive, really massive for a small company. That is something we have not had to do to that extent on the EU side; even with GDPR, we were nowhere near that, it was just not that much work. So I'd say, as a European company, I'd still go for Europe first, but make sure to think about the FDA side when you build up your system, because I'd say there is an 80% overlap, and the rest you can design smartly from the beginning so you don't have to redo the work for each region later.
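A hypothetical sketch of the kind of pre-agreed release gate a PCCP describes: the acceptance criteria are fixed in advance, and a retrained model version is only released if it meets them on a locked validation set. The metric names and thresholds below are invented for illustration; they are not FDA requirements or eMurmur's actual plan.

```python
# Hypothetical illustration of a PCCP-style release gate: acceptance criteria
# are agreed up front; a retrained model ships only if it meets them on a
# locked validation set. Metrics and thresholds here are made up.
from dataclasses import dataclass

@dataclass
class ValidationResult:
    sensitivity: float
    specificity: float

# Criteria fixed in advance in the (hypothetical) predetermined change control plan.
PCCP_CRITERIA = {"sensitivity": 0.90, "specificity": 0.85}

def approve_release(result: ValidationResult) -> bool:
    """Return True only if the new model meets every pre-specified criterion."""
    return (result.sensitivity >= PCCP_CRITERIA["sensitivity"]
            and result.specificity >= PCCP_CRITERIA["specificity"])

# Example: a candidate model evaluated on the locked test set.
candidate = ValidationResult(sensitivity=0.93, specificity=0.88)
print("Release approved:", approve_release(candidate))
```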
Anjany Sekuboyina 35:58
Fully agree. I think we are just at the beginning. Some of us have more experience, but most companies are quite new to these technologies, and most devices that use artificial intelligence are not yet really on the market or not yet globally exposed. This is going to be like the point when computers started: it's going to be everywhere, and every single company will have it. You may be thinking right now, "Okay, I'm not really using AI," or be tempted to avoid AI because this regulation is something to be afraid of; some companies may say, "We won't go in that direction." I would say exactly the opposite. This is the way to go. We will need AI for sure in any software company, and in almost any medical device. So let's use this to start as soon as possible and be the ones that are ready. New models will come, new technologies will come, and we will need to be ready to turn them into products and get them onto the market.
Andreas Schriefl 37:05
Advice for new companies: I think I'm still a bit green behind the ears, but if you're passionate about your solution, this is just one more hurdle, and if you want users to actually use it, I think you can push through. There is an ecosystem that will support this: EIT, as you mentioned, investors who know about this, grants you can apply for. There is effort, for sure, and there is a financial burden. But if you account for it, and if you really want to see your solution being used in the real world, I think you should go for it. These landscapes will keep changing; they have been changing forever. That should not stop you. I've heard a lot of startups say this. The first time I tell them I'm in med tech, in healthcare, they say, "Oh, the regulations must be a pain." Yeah, of course, but I still want to build that one solution. And this AI regulation is for all industries anyway.
Sophie Ehrlich 38:39
Yeah, true, it's not just for healthcare. I think, really, for companies: stop procrastinating. Yes, there are three years, and there is time, but there isn't really. Just understanding whether your AI fits the definition is certainly a first step. Speak to your notified body to understand whether they will be working on the AI approvals or not; if not, you're going to have to rethink which organizations you work with. Speak to your law firms; speak to everybody who has some insight on this. For investors, your portfolio companies should start thinking about this, and when you're looking at new investments, it's probably good to add this to the due diligence. I do think it's exciting. I think AI can transform and improve healthcare tremendously, and I'm looking forward to a safer and more trusting environment. Thank you very much. Thank you. [Applause]
Sophie Ehrlich 00:07
I thank you. Good afternoon, everyone. On August the first of this year, six weeks ago, the European Union launched the first regulatory framework and the first legal AI Act globally. So six weeks in, we're going to discuss today with our panelists what this means for startups, what does it mean for investors, and what does it mean in healthcare? So the AI Act, for those of you that are less familiar, covers every industry, including healthcare. It's a risk-based mechanism that looks at different risk levels of AI, and the main aim is to ensure that AI is safe for European citizens.
Sophie Ehrlich 01:01
In healthcare, companies that are in this space are really going to be classified as high risk under this AI Act. And what it actually means in terms of the timing is that we have three years in order to comply with this new regulation. So we've got until August 2027 to become compliant.
Sophie Ehrlich 01:26
For companies that don't meet this deadline, the penalties and the fines are high.
Sophie Ehrlich 01:34
Now in terms of the AI Act, what does it actually mean for companies working in the space? Is this really a game changer? Is this an additional layer to the MDR that we have in place today? Is it an addition to GDPR? And the answer is yes, it's an additional layer to MDR and to GDPR. And for companies that have all of those in place, it's a little bit smooth sailing in terms of the requirements that are in place today.
Sophie Ehrlich 02:09
What I'd love to do, my name is Sophie Ehrlich, I work at EIT Health, which is a public-private organization that sits in between the European Commission and the European healthcare ecosystem. At EIT Health, we support early-stage healthcare startups with Horizon Europe EU grant funding and lots of other initiatives that we support the companies with as well. I'm really delighted to be joined today by our panelists. Today we have two companies from Spain, one company from Germany, and one from Austria. Two of the companies have medical devices plus AI, and two of the companies have software only, so medical AI alone. Anjany, if you'd like to introduce yourself, sure.
Anjany Sekuboyina 03:02
I'm Anjany, CEO of Bonescreen.
Anjany Sekuboyina 03:06
I'll try to give a bit of context about what we do, so that all of us are on the same page. We are trying to build a platform for screening bone health.
Anjany Sekuboyina 03:17
What we focus on especially is to extract information that you cannot see from a normal radiological image, CT, MR, whatnot. If you see a bone in there, we try to extract biomarkers from there. What's interesting here is we don't look at structural changes; we try to extract functional biomarkers from these images, and we use AI for this. That is what I want to bring to the panel. The AI in there is more complicated, and it's not very easy to show that it is working. I want to see or how to put it. I want to show how we have to show to the regulators how the AI is doing its job. Thanks to EIT support, we are very close to CE certification. We'll have the audit at the end of December. So that's a fresh perspective as well. We started our journey roughly when the EU AI Act passed, so the regulatory bodies had some information about what they are supposed to do, and there were some changes probably compared to the other startups on the panel. Let's see how that aligns with the rest of the startups. Thank you, Andreu.
Andreu Climent 05:00
Thank you very much. Well, it's a pleasure to be here. Thank you, Sophie, thank you, EIT Health. I'm Andreu Climent. I am an electronic engineer, but I was almost born in a cath lab in the electrophysiology units. Most of the people who are here are already close to the age to have atrial fibrillation or any other arrhythmias. If you work in a cath lab, you discover that half of the patients that go there, we put cables inside of their heart. We burn part of the heart, but we fail daily, and we do not have enough solutions for the clinicians to see what they do. That's why I was in Germany and the States, and we developed a technology that is able to see the heart fully globally in a fast way in 10 minutes, which is Corify Care. That's the company that I set up three times ago. We recently got the C mark. The great advantage that makes us unique is that we are the only ones that can show clinicians the heart in real time during interventions, easily and really fast. But how did we do that? How do we make this possible in 10 minutes? Basically because the device is great, but also because there is a lot of software inside, a lot of engineers working on making something unique, and most of that unique is based sometimes on artificial intelligence and databases that we need to be sure are good enough for what we are going to help. That's one of the important things of the regulation, and I think we will discuss it.
Andreas Schriefl 05:58
Hello there. I'm Andreas, Founder and CEO of eMurmur. We built auscultation AI. For those of you who don't know what auscultation is, it's when a provider listens to your heart and lungs using a stethoscope. What they're trying to do is to hear something that's abnormal that should not be there, via we saw a crackling lung or a murmur in the heart.
Andreas Schriefl 06:19
What we have done is we have created an AI that can analyze heart or lung sound recordings. We got our first FDA clearance in 2019, and CE marking as well under MDD. At the time today, we hold six regulatory clearances. Number seven is on the way on the FDA side. So we are MDR cleared now and FDA cleared, and I look forward to discussing how the AI Act is going to impact current filings we have and future filings we are working on.
Jordina Arcal 06:51
Hello, everyone. My name is Jordina, and I'm the Deputy CEO at MJN Neuroserveis. We have developed the first available device in the world that is able to predict tabletop access seizures. It's a portable EEG that continuously records the electric brain activity through the ear canal, and it sends this information via Bluetooth to a mobile phone, where we embed an artificial intelligence algorithm that is able to continuously assess the risk of suffering a seizure, so we can alert patients within one minute of a seizure, allowing us to avoid accidents. We can provide peace of mind, improving the quality of life, not only for patients but also for caregivers. At the same time, we are providing valuable information to doctors in order to better manage their patients. We have the CE mark, MDR, UKCA, and we have a contract for distribution in Europe in 32 countries with pharmaceutical companies. So we are currently selling in Europe and trying to jump to the US with the FDA. Yes, happy to talk about the new AI Act, new regulation, as we are struggling with the FDA. So that would be nice.
Sophie Ehrlich 08:13
Some of you may be sitting in the audience thinking, you know, we're a company based in Boston. How does this apply to us? The AI Act applies to either European or any global companies that are deploying AI in the European region. So it applies to everybody.
Sophie Ehrlich 08:30
In terms of the AI, the European Union's definition of AI, this is something that I really recommend everybody to take a look at because what we can all call AI might be slightly different from what the European Union defines as being AI. There may be companies here that may not need to apply for this new regulation. So that's certainly something to take a look into, and that's all publicly available online. In terms of what this means for you as companies, has this new regulation that's been in effect for six weeks changed anything that you're doing, any of the work that you're working on? How's this in the past six weeks or past few months? Have you done anything? Are you doing anything differently in your companies? Andreas, maybe we'll start with you.
Andreas Schriefl 09:13
We started researching it, partially driven by the deadline that came up today.
Andreas Schriefl 09:20
As a medical device company that has CE marking, the way the system works is that the AI Act is built on risk classifications, and automatically, if you are a class IIa to B or higher in Europe, you classify as a high-risk device under the AI Act, meaning it applies.
Andreas Schriefl 09:44
We looked at this broadly in terms of what that means, in terms of what we have to do, and when we have to do it by. So in terms of timeline, we have three years as medical device companies for this to be implemented.
Andreas Schriefl 10:00
Most of what's required is already part of being regulated under MDR, except it's not harmonized yet, meaning that within our quality management system, it's built for MDR. We still have to show that we are compliant now with the AI Act on top of it, and there's a set of rules that are not part of MDR that we'll have to add in addition to what we're doing today. But overall, I'd say, as a medical device company with clearance, we're in a pretty good spot because a lot of the big work we need to do setting up the quality management system and the whole risk management is already in place, versus other companies that might be affected. They don't have that yet today.
Sophie Ehrlich 10:49
Okay, super. Jordina, tell us a little bit about what you've been doing at MJN.
Jordina Arcal 10:53
So when it came out, I guess as every company, we said, "Oh no, another regulation that we have to go through." So the first thing we did was try to understand what this is about, and we talked with our regulatory experts so we understood if we fit in or not. Are we AI, as the European Commission understands AI? Yes, we are. Then we said, "Okay, now let's go." So what do we need? There are lots of tools available that you can use in order to understand what work you need to do to be compliant with this AI Act. What we found out is what Andreas said: if you have the MDR with a medical device, you are like 93% compliant with the AI Act. This doesn't mean that we don't have to work; it means that we have lots of things in place, but we still have to work to be compliant. So what we did so far is to understand it, get to know a little bit about it. As you said, there are no experts on that, so we just managed to understand it and to understand what we should do and how many times we have to do it. Just as we say, start the engines to be prepared and to be one of the first ones because we foresee that this will be a huge thing, and regulatory bodies will be overwhelmed, so we have to be the first ones in line because if not, it will take a long time. So that's what we did. So ahead of the game.
Anjany Sekuboyina 12:21
I think we were lucky for two reasons.
Anjany Sekuboyina 12:25
We have been hearing about the Act for almost a year now, right? Or probably more than that. I think we started our regulatory journey right about when we started hearing about these things. So we started preparing for it. The second point is we are based in Germany, and we love regulations. There is a working body of all the notified bodies in Germany that came up with this very long list of what they call the AI guideline checklist. It covers different parts of different regulations. It brings in MDR 62304, the risk management, and all that. It adds a lot of questions in the form of a checklist: do you have this? Do you have this? And so on. It took us almost, I don't know, a week to fill this checklist of just saying, "Okay, this is here. This is here." Because we worked with this checklist, I think we were already pretty prepared for this, so I'd imagine we still have some work to do, but definitely not as much.
Sophie Ehrlich 14:12
And for anybody here, this AI checklist is actually publicly available. It's IG-NB, and if you look for the latest version, it's been put together by the German notified body association. Even if you're not a German company, it's still quite a helpful checklist to take a look at to see whether your AI fits this criteria or not. If you do seem to align with most of it, it should be pretty positive, but it may also highlight areas where maybe more work needs to be done. It's certainly a good source to take a look at, and I'm happy to send you the link if anybody wants to reach out to me afterward.
Sophie Ehrlich 14:47
So it seems, it sounds as though, you know, it's not a huge change. It's not a big change; it's an additional layer, I guess, the strawberries on the cheesecake. So it sounds as though you very quickly have been looking into this and understanding what you need to look for and what is more relevant and less relevant. To understand what the criteria is, because this is a new act, we don't yet have, you know, there is no one expert that can really guide a company to say everything that's needed, but I guess here we're looking at a group. Law firms have tremendous knowledge; notified bodies have tremendous knowledge. At EIT Health, we can, of course, provide additional support, but it's also within the ecosystem, companies speaking to other companies to try and understand what this new regulation means. I think especially in this next 12 months, it's going to be the power of the ecosystem to help us all navigate together what this actually means because it is new, and as we all understand in innovation, you know, being the first is exciting, but it also has challenges, and nobody's really been there or done that before. I think there are many more questions than answers today, and it's a question of understanding this through the months to come. One of the things to highlight in terms of notified bodies: notified bodies can apply to support companies from November this year, so it's going to take time for the notified bodies to actually be able to work on supporting companies with the approvals. In addition, each European member state, each country, has 12 months in order to set up their national governing bodies in order to be able to enforce this regulation as well. So we're really talking about, it's going to take months, especially in this coming year, for the setup of this to be kind of more established.
Sophie Ehrlich 16:32
If we think about fundraising, and if we think around, you know, you're all fundraising today. Have investors asked you about this in their diligence? Has this kind of topic come up yet? It is new, but just out of interest, is this something that you're being asked about? Maybe Anjany, I start with you.
Anjany Sekuboyina 16:59
We are looking for a seed round, around one or two million, and we have not been asked anything specific. I think until now they are just more interested in CE and MDR. So yeah, probably it will change the more we speak to or at least as time passes because it has just been six weeks, right? So, yeah, it's early. You have to see.
Andreu Climent 17:38
In your case, we recently closed a Series A, and it wasn't one of the important questions. We are a technology-driven company, so our algorithms are one of the important assets, although we are a medical device with a disposable. They had some questions, but mainly during the due diligence. As soon as they go deep into the cybersecurity, they go deep into how the MDR software is working, and they go deep into the checklist about AI and more or less they wanted to learn, "Okay, how do you plan to be compliant with this new regulation, and how much is it going to cost?" So probably these two questions came, and we tried to answer.
Andreas Schriefl 18:11
Okay.
Jordina Arcal 18:14
We are currently doing a 2.5 million Series A round, and I also have not been asked by investors. But I do mention, I do make sure that I bring it up because in our case, I think it's advantageous for two main reasons. Since we already have the CE mark and the MDR, the investment needed to build a system like this from the ground up has already happened. The ongoing addition, let's say you now have the audit, is a bit more expensive. That's part of the regulatory budget anyway. But the other reason is that our AI models that are cleared have been trained, validated, and tested. The requirements around that part of the framework within the AI Act are quite different than under MDR. If I had to start today with the AI Act in place, the investment needed to be compliant would be much higher than what it was for us at the time.
Anjany Sekuboyina 19:16
Yeah, I absolutely agree with you. I mean, we are now raising a 4 million round, and I only had one investor asking for it, but I have to say that it was in a meeting where they were talking about the AI Act, so it made sense that this person was asking about the AI Act. The other investors I've been talking with, I'm making sure I'm making it aware that, hey, we are aware of that, so we are in the MDR and blah, blah, blah. But I think it's normal they are not asking about it because it's super new. I mean, it launched six weeks ago, and as you said, no one is an expert, and it's one year to come that we will understand it. So it's a topic that I guess that when we close a round with the due diligence, I will make sure that everything is ready for this new AI Act. As you said, we are one step ahead because we have the CE mark under the MDR, and this is, as you said, the strawberries in the cheesecake. So it's another layer that it will have to be placed, but it won't be that hard as if we started from scratch.
Sophie Ehrlich 20:27
Interesting, interesting. So in terms of fundraising, I understand it's not yet in the full due diligence. It will be; it should be. Regulatory aspects are certainly something that investors dive into as part of the diligence, and it will be a part of it. It's certainly a good indication for investors to start asking about it, to understand if the companies that they're meeting with are aware of where they stand with this or if they're starting to look into this themselves. In terms of costs, as we all know, another regulatory approval is going to take time out of your time, your team's time; there might be additional costs working with external advisors.
Sophie Ehrlich 21:18
How are you thinking about these costs? Are you fundraising a little bit more than you originally planned to compensate for this? Just to give you a little bit of insight on the estimated cost, the European Commission has estimated that the costs for an SME to become compliant with the AI Act—this isn't healthcare specific; it's AI in general—may be up to 400,000 euros. Now that's assuming they don't have a quality management system in place. It's assuming they're starting from day one and that they don't have other approvals that have some elements of this in place. So it's really kind of a high number, but that's estimated in an impact report of the estimated costs of becoming compliant. At the end of the day, we're all going to know how much this cost three years from now, you know, calculating backwards to see really how much was spent on this. But what are your thoughts around this as you're looking at your cash runway, your fundraising? Share that with me.
Anjany Sekuboyina 22:30
I think audits are expensive, and the more regulations we have, the more money we have to pay. That's my view on it. On top of this, I think, as Jordina and Andreas were saying, we are not starting from scratch. This is some budget we already put in; we already accounted for it. Probably because we are already doing this, the extra cost is a differential. I guess probably the more experienced have a different perspective, but we have already accounted for this. For example, look at it as an investment. Let's tell it in that way. We are making an effort to become compliant with a regulation that is trying to make patients safer. That makes our companies a little bit more prepared for what is coming. Everyone will have to pay for it sooner or later, so the sooner we do it, probably the easier it will be because we will learn. We will make the path. The first companies that had to fight with it found it difficult, but later they were the ones that succeeded. I think it's not going to be as much more expensive as it's worth it. It's already in the air to get it because it's the additional layer; it's not something brand new.
Andreu Climent 24:18
Exactly. I don't think it's something that fully changes. We are software companies. The last regulation in software already made it much more complex to explain all the software to the notified bodies. We already had to do that. That's another layer. Now you have to explain the databases; now you have to explain how you train it; now you have all these kinds of things. It makes sense. To some extent, it will take money, of course.
Andreas Schriefl 24:45
I agree. I think the cost question, if I step back from what we do at my company, it depends on your product, and it's important to understand what is qualified as AI. I was surprised to learn it's also logic-based systems. If your product uses some decision tree as part of the software, you qualify as AI, which I assume AI and machine learning refine for us. It doesn't make the difference because we do machine learning. But if I had a product that utilizes some decision tree, I would not have been regulated under the MDR the way we are. But now I have to adhere to the AI Act, which would be, I think, much more expensive than the number quoted if.
Anjany Sekuboyina 25:00
Four. So for us, on a regulatory basis, the increase is going to be there, but it's not going to be too significant. I'd say, of course, if you look back at what the audit cost under MDD, MDR times three, and now, plus the AI Act next year, it does add up over time, that's for sure, for smaller companies, yes, of course. I mean, it's what Anjany said: more audits, more money, more regulatory, more money we have to spend. So we are always thinking about that. As a medical devices company, we are super used to spending lots of money on regulatory. This is what we've chosen; this is the environment we work in, and we are very used to that. So as I said, we were scared at the beginning. Then we understood it's another layer in the cake. We compare it with the UKCA. First, with the MDR, it was one audit, one day. Now we have one day and a half because we have to have the audit for the MDR and the audit for the UKCA. Okay, now it will be maybe today, two days and a half. We know we have to work. We know we have to save money for that. It will be more money. As you said, with the MDD, it was this money; with the MDR, it's this money; with the UKCA, it's this money. Now with the AI Act, it's this money.
Andreas Schriefl 26:15
But on the other hand, it's a stamp of quality, and it's a competitive edge. We can see it as a value for our product, a value for our services.
Sophie Ehrlich 26:30
It's a double-edged sword, let's say: it's good in one sense, but it increases costs. As we said, I'm not super scared of this 400k. For us, and I speak as a company with a class II medical device, it's not going to be 400k; it cannot be. I'm sure that companies that only have software will spend more than 400k, because if I look at what getting the MDR cost, it was much more than 400k, so if this is the equivalent, they will spend a lot on it. I'm sure the estimated cost is not specific to healthcare but is for AI compliance in general for SMEs, so it's a kind of average number. But from what I'm hearing, this number doesn't really resonate, and you think it's going to be a lot lower, given where you're at and what you've done so far on this path. Jordina, you were mentioning that you see this as a positive, giving you a competitive edge. How do you all see this, let's say three years from now, once you've got the AI approval and you're scaling your companies?
Jordina Arcal 27:49
For me, it has benefits. As I said, it's a competitive edge, another barrier for competitors, which is good for us. It also signals quality, so it will make professionals feel safer. Today when you go to a physician, they ask, "Do you have the CE mark?" Yes, okay, so I trust your technology. Soon they will ask, "Do you have the AI Act approval?" Yes, so now I trust your technology; it's a quality stamp. I think it will help us enter new markets. We are going through the FDA, and it will help us there, and when we go to China, it will help too. In three years, or let's say five, it will be a good thing for the companies that went through the process.
Andreu Climent 28:34
I agree. I think it adds a moat, another layer around the product: regulatory is one, and the Act is another. Some of the details are still open. Harmonization is supposed to come next year, around May, I think. Don't quote me on that, but around May is when the first harmonized standards are expected, and that will be important for understanding the impact. As a company with three CE marks under the hood, and with the AI Act now being law, the competitive advantage is significant, because it's one thing to say I can create an algorithm that analyzes heart and lung sounds, here is my sensitivity and specificity. It's another thing to be able to file it and show, going back to the beginning, that you have adhered to all the regulations, how your data was collected, and so on. So I think it's a positive once you're where we are.
Andreas Schriefl 29:42
When you think about regulation, it's always difficult to say, okay, this is positive. The first reaction is usually that this is going to be complex and a bit scary. But at the same time, when you think about artificial intelligence, we are just at the beginning, and all of us are a little scared; we need this kind of regulation. The European Union has done something quite innovative with this regulation. Regulations, in many respects, are drivers of innovation: they put up the walls, but they also put in the doors, and we will learn before other regions of the world where the doors and the windows are, and how to take our companies and our devices through these kinds of regulations and succeed. Of course, it will be a little harder at the beginning, but it will probably train us to be competitive internationally, and that's how I want to think about it.
Anjany Sekuboyina 30:46
On harmonization, and touching on everything the other panelists said, I think the moat is a very important point. For me, it's also about perception. You will always have people, in our case doctors, who will not like AI. The hope is that regulations like this will bring them into the fold: okay, this is regulated, the EU has looked at it, there's a stamp of quality, so I'll trust it a bit more than I would have earlier. That's the hope, and it should also generalize to other countries. In our initial investigations we already see some similarities between the FDA's machine learning filings and the AI Act's requirements. The hope is that they become a lot more harmonized, so that after the first one the work is mostly replication.
Sophie Ehrlich 31:45
Just a step towards a better future as well.
Sophie Ehrlich 31:50
I think the potential of AI to transform healthcare is huge. As you mentioned, there is the question of trust from hospitals, clinicians, and patients alike, and the safety aspect. Becoming compliant and getting this new AI approval will mitigate some of that concern and hopefully help more AI healthcare products be deployed, both in Europe and globally once other regions launch their own AI regulations, which we can already see them working on. This is the first, which means you're the brave ones working to get the approval here and to figure it out together, with our support, notified bodies, law firms, and the power of the ecosystem.
Sophie Ehrlich 32:56
Hopefully it will open doors in lots of other regions as you scale afterward. In summary, for the other innovators in the audience: based on your experience so far, what would you suggest companies do if they haven't looked at this yet?
Anjany Sekuboyina 33:18
I would say: don't procrastinate. Start working. It's a complex thing, and I understand it can be scary, but you have to understand it. I would suggest everyone first sit down with it; for example, there are great free workshops on the AI Act that explain what it is. So first understand it, then understand whether your company falls under it, and then understand what needs to be done. Not waiting out these three years would help companies a lot.
Andreas Schriefl 33:56
Yeah, I would say so too. I see how the FDA handles this and how Europe handles this, and some of the things the FDA does are very forward-thinking; I'm hoping that will apply here as well. With AI, there's this concept of locked algorithms. The way it used to be a few years back, when we got our first clearance, was that we had one model to find a pathologic murmur in a patient. If we made an improvement, we weren't allowed to just release it; it's a new model, and you have to go back and file another 510(k). But now they have introduced the PCCP, a predetermined change control plan, where you can tell the FDA, "Here is my plan for how I'm going to validate internally that my improvement is safe." Once they approve that PCCP, you can release improvements to the algorithm without a new filing each time. That doesn't exist on the European side yet, and that's where I'm hoping Europe can learn a few things from the FDA; it seems to be a bit more industry-driven there versus the top-down approach here. The last thing I want to add, which we haven't discussed today, is cybersecurity; that's another one for young companies to think about. On the FDA side, the new cybersecurity regulations are significant. We are going through our third FDA filing now, and the work required to be compliant was massive, really massive for a small company. That is something we have not had to do to that extent on the EU side; even GDPR was nowhere near that much work. So as a European company I'd still go for Europe first, but make sure to think about the FDA side when you build up your system, because I'd say there's about an 80% overlap, and the rest you can design smartly from the beginning to save yourself the parallelization work later.
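As a rough illustration of the PCCP idea Andreas describes, here is a hypothetical sketch; the names, thresholds, and structure are invented and not taken from any actual filing or plan. The point it shows is simply that an updated model is only released if it meets acceptance criteria that were fixed and agreed in advance.

```python
# Hypothetical sketch of a PCCP-style release gate: a retrained model is only
# released if it meets acceptance criteria pre-specified in the change control
# plan. All names and thresholds are illustrative, not from a real filing.
from dataclasses import dataclass


@dataclass
class PccpCriteria:
    min_sensitivity: float   # pre-agreed floor, e.g. 0.90
    min_specificity: float   # pre-agreed floor, e.g. 0.85


def sensitivity_specificity(predictions: list, labels: list) -> tuple:
    """Compute sensitivity and specificity on a locked validation set."""
    tp = sum(p and y for p, y in zip(predictions, labels))
    fn = sum((not p) and y for p, y in zip(predictions, labels))
    tn = sum((not p) and (not y) for p, y in zip(predictions, labels))
    fp = sum(p and (not y) for p, y in zip(predictions, labels))
    sens = tp / (tp + fn) if (tp + fn) else 0.0
    spec = tn / (tn + fp) if (tn + fp) else 0.0
    return sens, spec


def release_allowed(predictions: list, labels: list, criteria: PccpCriteria) -> bool:
    """Gate the release of a retrained model on the pre-specified criteria."""
    sens, spec = sensitivity_specificity(predictions, labels)
    return sens >= criteria.min_sensitivity and spec >= criteria.min_specificity


if __name__ == "__main__":
    preds = [True, True, False, False, True]
    truth = [True, True, False, True, False]
    print(release_allowed(preds, truth, PccpCriteria(0.60, 0.50)))  # True
```

In practice a plan of this kind would cover far more than a threshold check, including data handling, re-validation protocols, and documentation; the gating logic only captures the core idea of pre-specified acceptance criteria.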
Anjany Sekuboyina 35:58
Fully agree. I think we are just at the beginning. Some companies have more experience, but most are quite new to these technologies, and most devices that use artificial intelligence are not yet in the real market or globally exposed. This is going to be like the moment computers arrived: it will be everywhere, and every single company will have it. You may be thinking right now, "Okay, I'm not really using AI," and be tempted to avoid it, to say we won't go in that direction because it's something to be afraid of. I would say exactly the opposite: this is the way to go. Any software company will need AI for sure, and almost any medical device will too. So let's use this to start as soon as possible and be the ones who are ready. New models and new technologies will come, and we will need to be ready to turn them into a product and bring it to market.
Andreas Schriefl 37:05
As advice for new companies, I'm probably still a bit green behind the ears, but if you're passionate about your solution, this is just one more hurdle, and if you care more about the solution and you want users to use it, you can push through. There is an ecosystem that will support you: EIT, as you mentioned, investors who know about this, grants you can apply for. There is effort, and for sure there is a financial burden, but if you account for it, and if you really want to see your solution used in the real world, you should go for it. All these landscapes have been changing forever, and that should not stop you. I've heard a lot of startups say, the first time I tell them I'm in med tech, in healthcare, "Oh, the regulations must be a pain." Yeah, of course, but I still want to build that one solution. And this AI regulation is for all industries, not just ours.
Sophie Ehrlich 38:39
Yeah, true, it's not just for healthcare. Really, for companies: stop procrastinating. Yes, there are three years and there is time, but there isn't really. Understanding whether your AI fits the definition is certainly a first step. Speak to your notified body to understand whether they will be handling the AI approvals or not; if not, you're going to have to rethink which organizations you're working with. Speak to your law firms; speak to everybody who has some insight on this. For investors: your portfolio companies should start thinking about this, and when you're looking at new investments, it's probably good to add this to the due diligence. I do think it's exciting. AI can transform and improve healthcare tremendously, and I'm looking forward to a safer and more trusted environment. Thank you very much. Thank you. [Applause]