Wednesday, June 2, 2021

YTD US Biotech VC $19B Through May

Biotech VC is on a record pace, with $19 billion invested through the end of May.  Deal sizes in Boston, SF, San Diego, and New York have been averaging $50 million compared to $25 million in smaller markets.




Thursday, November 5, 2020

Gene Silencing vs. Gene Editing


CRISPR and Gene Editing are grabbing the headlines, but Gene Silencing and RNAi continue to keep investors interested.


The recent Nobel Prize awarded to CRISPR pioneers Jennifer Doudna and Emmanuelle Charpentier has brought a renewed round of interest in the gene editing technology.  In addition to CRISPR-designed plants, trees, and therapeutics, gene editing is even being evaluated as a way to diagnose and test for COVID-19.


But editing genes is just one way to use genetic information to treat disease.  Gene silencing often doesn't get the press that CRISPR or even gene therapy does, but its first FDA-approved product hit the market in 2018, and a second followed late last year.  Additionally, Takeda and Arrowhead Pharmaceuticals recently announced a deal worth up to $1 billion to use the technology to treat a rare liver disease, RNAi developer Sirnaomics has completed a $105 million Series D round, and leading RNAi supplier Alnylam's two FDA-approved RNAi therapies are on course to surpass $300 million in annual revenue this year, with a third therapeutic potentially approved by the FDA by early next year.


Gene silencing, also known as RNA interference, or RNAi, leaves the existing DNA in the cell nucleus in place.  Rather than editing DNA like CRISPR or replacing it like gene therapy, it works by breaking up the messenger RNA carrying DNA instructions on its way to the cell's ribosomes, where it would otherwise produce proteins.  This could be particularly helpful in stopping cancer cells or other harmful cells from producing proteins and ultimately halting their growth.



RNA "sliced and diced" in a cell, Source: NIH



It should be noted that RNAi is very different from mRNA therapies like the one Moderna is trialing for COVID-19.  Those therapies cause cells to produce new, healthy proteins, whereas RNAi inhibits the production of disease-causing ones.


One of the main benefits of RNAi is that there’s no potential health risk from changing the DNA.  The messenger RNA is intercepted and destroyed post-transcription, but pre-translation, so the impact is not permanent.   


RNAi is more of a defensive technique against disease, while CRISPR can be seen as an offensive one.  The interfering RNA binds to the messenger RNA (mRNA) traveling through the cell in order to break it up.  But like a linebacker tackling a much smaller quarterback, RNA is fragile and can come apart easily.  This has both benefits and drawbacks in how effective it can be at preventing the unwanted effects of a particular gene's expression.


RNAi depends on two variants of the nucleic acid: single-stranded microRNA, or miRNA, and double-stranded small interfering RNA, or siRNA.  siRNA works by loading into an RNA-induced silencing complex, or RISC, which attacks the target mRNA.  The complex sheds one of the siRNA's two strands, keeping the single strand that is complementary to the messenger RNA, and uses it to find the target and halt its progression.  miRNA, unlike siRNA, only has to be partially complementary to the messenger RNA, which also means it can reach a wider range of targets.  A third variant, shRNA, or short hairpin RNA, functions similarly to siRNA, but has the added benefit of being expressed from DNA delivered to the cell nucleus, so the interfering RNA is produced continuously, allowing for more than a one-time treatment.  The "hairpin" refers to its structure, which connects the ends of the sequence with a loop, not unlike the end of a paper clip.
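
To make the complementarity distinction concrete, here is a toy Python sketch contrasting siRNA-style full complementarity with miRNA-style seed matching.  The transcript and guide sequences are hypothetical, chosen only for illustration, not taken from any real gene.

```python
# Toy illustration of siRNA vs. miRNA target recognition.
# The transcript sequence below is hypothetical, chosen only for readability.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    """Return the reverse complement of an RNA sequence."""
    return "".join(COMPLEMENT[b] for b in reversed(rna))

def perfect_match(guide: str, mrna: str) -> bool:
    """siRNA-style recognition: the guide's full complement must appear in the target."""
    return reverse_complement(guide) in mrna

def seed_match(guide: str, mrna: str, seed_len: int = 7) -> bool:
    """miRNA-style recognition: only a short 'seed' (guide positions 2-8) needs to pair."""
    return reverse_complement(guide[1:1 + seed_len]) in mrna

mrna = "AUGGCUUACGGAUCCGUAAGCUAGCUUAACGGAUUACG"  # hypothetical transcript fragment

# An siRNA guide is fully complementary to a 21-base window of the target.
sirna_guide = reverse_complement(mrna[5:26])

# A miRNA-like guide shares the seed region but diverges elsewhere.
mirna_guide = sirna_guide[:8] + "AAAAAAAAAAAAA"

print(perfect_match(sirna_guide, mrna))  # True  -> full complementarity, siRNA-like
print(perfect_match(mirna_guide, mrna))  # False -> not fully complementary
print(seed_match(mirna_guide, mrna))     # True  -> partial pairing, miRNA-like
```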


With RNA being much more fragile than DNA, delivering RNAi therapies has proven far more challenging than simply encapsulating them in viral vectors, as is typically done with DNA therapies.  Additionally, the RNA needs a shell that can penetrate the cell membrane even though both the RNA and the membrane are negatively charged.  As a result, drug developers are increasingly looking to LNPs, or lipid nanoparticles, which can maintain the structure and efficacy of the RNA and offer lower overall risk than injecting viruses into the body.


mRNA Overview, Source: NIH


LNPs are tiny lipid bubbles that can encapsulate nucleic acids as well as proteins.  Their main drawback traditionally has been their interaction with immune cells: their small size makes them easy to sweep out of the body during an immune attack.  However, the body does not develop antibodies against them as it does against traditional viral vectors.  Moreover, their use was validated after Alnylam's Onpattro (patisiran) became the first FDA-approved RNAi therapeutic in 2018.


While LNPs can be used to package DNA as well, interest in using them to deliver both RNAi and RNA therapies like Moderna’s COVID-19 vaccine has been growing dramatically since Onpattro was approved.  Moderna recently fought and lost an important patent battle against Arbutus over the LNP it could have been using in its COVID vaccine that is currently in Phase III trials.  Moderna has since said it has advanced its LNP technology beyond what it licensed from Arbutus, but the key point is that the delivery mechanism is a major economic and therapeutic component of the vaccine, not just the mRNA sequence.


The LNP research going into mRNA COVID vaccines could ultimately stimulate further advancements in their use for RNAi.  Delivery, which has traditionally been a challenge for RNAi, is actually turning into a strength as more drug producers look to LNPs as a safer alternative to viral vectors for encapsulating nucleic acids.


Existing RNAi therapies have been targeted at rare diseases, but advances in LNP vectors, accelerated by the COVID vaccines, could help open up RNAi to wider-scale uses in oncology and other areas where delivery has been uncertain and less tested.  While gene editing might continue to grab the headlines, there's a promising future developing for gene silencing.


Thursday, October 29, 2020

What's the Future for AAVs in Gene Therapy?

 AAVs are the most common vector for delivering in vivo gene therapies, but accommodating growing demand will require manufacturing innovation.

Source: FDA

Gene Replacement vs. Gene Editing

Since its advent in the 90s, gene therapy has held promise as a way to defeat genetic diseases. Unlike CRISPR, which edits the DNA, gene therapy leaves the existing DNA sequence in place and supplies a healthy copy of the gene, delivered either in vivo or ex vivo. For in vivo gene therapies, this sequence is typically delivered through an AAV (Adeno-Associated Virus) vector. Common targets for in vivo gene therapies include hemophilia, muscular dystrophy, and cystic fibrosis.

In total, the FDA has now approved nearly twenty gene and cell therapy products, with an emphasis on blood and hemophilia-related disorders. But while further research can address lingering safety concerns with AAVs, new manufacturing processes will be needed to scale up production of these viral vectors in order to meet growing demand.

Viruses Transport DNA, but Don’t Make Proteins

Before getting into how they’re made, it’s important to look at why viruses are used to deliver “replacement” genes. Viruses can carry DNA, but they cannot produce proteins from that DNA like a traditional cell can through transcription and translation. They’re great at transporting DNA, but need your cells to turn that DNA into a protein. In some ways, they’re like freight trucks that can bring raw materials into a factory, but lack the factory’s ability to transform those raw materials into a finished product.

Viruses engineered to carry repaired DNA for gene therapy can also be manipulated not to replicate beyond the target cells. This prevents the virus from sickening the patient. But it still means you can develop immunity to the viral vector, which is why gene therapy is often used as a one time treatment.


Ex Vivo and In Vivo Gene Therapy Overview, Source: FDA


Why AAVs?

There are a few viral vectors that can be used to deliver gene therapy, but the AAV has become one of the most common because it can infect both dividing and non-dividing cells, and few people have been exposed to it so they don’t have immunity against it. Moreover, AAVs generally have low immunogenicity, that is to say they don’t produce a strong immune response from the body after insertion, which helps maintain their effectiveness as delivery mechanisms for new DNA.

There are some inherent challenges with AAVs, notably that their DNA payload is limited to roughly 4,700 base pairs, only part of which is available for the therapeutic cassette once the ITRs are accounted for. Any sequence longer than that simply won't fit. Longer strands can be accommodated by adenoviral vectors, but those introduce an additional set of safety and quality challenges.
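
As a rough illustration of that constraint, the back-of-the-envelope check below adds up the elements of a hypothetical expression cassette against the commonly cited ~4.7 kb packaging limit. The element sizes are placeholders for illustration, not figures from any real construct.

```python
# Back-of-the-envelope AAV payload check.
# All element sizes below are illustrative placeholders, not real construct data.

AAV_CAPACITY_BP = 4700   # commonly cited packaging limit for recombinant AAV
ITR_BP = 145             # each inverted terminal repeat, one at each end

cassette = {
    "promoter": 600,         # hypothetical promoter
    "transgene_cds": 3200,   # hypothetical coding sequence
    "polyA_signal": 225,     # hypothetical polyadenylation signal
}

payload = 2 * ITR_BP + sum(cassette.values())
headroom = AAV_CAPACITY_BP - payload

print(f"Total payload: {payload} bp")
if headroom >= 0:
    print(f"Fits with {headroom} bp to spare")
else:
    print(f"Over capacity by {-headroom} bp; a dual-vector or adenoviral approach may be needed")
```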

Are they Safe?

In addition to limited production volumes, AAVs and gene therapy still have safety concerns to overcome, particularly with high dosages. Two children died earlier this year in trials for a high-dose AAV-delivered gene therapy for X-Linked Myotubular Myopathy (XLMTM). The particular capsid used for this trial, AAV8, has been used without any safety issues in 14 other trials, highlighting how sensitive AAVs can be to shifts in their dosage and payloads.

Additional safety concerns have arisen around the potential to cause cancer observed in longitudinal studies of both dogs and humans. However, the data here is inconclusive. Nonetheless, the safety concerns obviously raise the risk for both gene therapy and its viral vectors, and for the time being have constrained their use in certain clinical trials.



Gene Therapy at the Cellular Level, Source: Genome.gov

Safety is the Top Priority, but Better Production Efficiency will be Essential to Meeting Future Demand

While additional research will be needed to address safety concerns, new manufacturing processes will also be required so that AAVs can be produced more efficiently and economically than current methods allow. This will be essential to meeting the growing demand for gene therapies.

Requiring substrates to which they can adhere, AAVs don't scale up like mAbs (monoclonal antibodies) in a bioreactor. To grow, they ultimately require a tremendous number of plasmids, which both pushes up materials costs and creates potential quality risks. However, AAVs to date have faced limited pressure to scale up due to the low volumes and few approved gene therapies on the market. German drug developer Cevec claims it can deliver a cell line up front, similar to a master cell bank, from which AAVs can be drawn, not unlike how mAb cell lines are stored. On top of that, the company claims it can support standard 2,000 L bioreactors, allowing a manufacturing scale not available with plasmid-intensive AAV processes that struggle to sustain batches greater than 200 L.

Scaling Up Production

The main barrier to scaling up AAVs is putting their pieces together. There's the "cap," the process of building the capsid that encapsulates the virus; the "rep," which allows the DNA to replicate; a helper virus needed to manage replication; and the payload, i.e. the nucleotides inside that actually deliver the therapy. By bundling it all together, Cevec believes it can overcome the volume limitations of existing approaches. But it remains unclear how quality and cost will be controlled when creating the cells. The upfront cost of doing this could still be significantly higher, made economical only through very high volumes that might be beyond current demand for existing gene therapies.

In contrast to Cevec’s approach, AGTC (Applied Genetics Technology Corporation) is proposing to accelerate scale up by using Herpes Simplex Viruses as “helpers” through a medium of BHK (Baby Hamster Kidney) cells. AGTC has been working on this approach for years, but like other AAV delivery methodologies, has been waiting for a gene therapy that could offer the volume to take advantage of it.

With just around twenty gene and cell therapy products approved by the FDA, AAVs' primary use could remain in clinical trials for another few years, allowing some time for more cost-effective, higher-volume production techniques to develop. Either way, continued innovation in their manufacturing processes will be as important as innovation in the gene therapies they carry if they are to serve a larger market.


Wednesday, October 28, 2020

COVID Vaccine Production Could Accelerate the Use of AI in Biomanufacturing

 Discussions about using machine learning in biomanufacturing have increased recently as COVID vaccine production gets closer to its anticipated scale up. Technology providers see a huge opportunity to bring AI products to a market that might have sat in development had there not been the immediate need to get a COVID vaccine out the door.

Illustration © Visual Generation | Dreamstime.com
                                                          

Different suppliers have different approaches. While some are looking to machine learning to detect patterns in treatments, others are looking at measuring tools needed to monitor biologic production. Either way, biomanufacturing and traditional pharma offer plenty of opportunities to reduce cost, to improve quality, or to increase speed. Moreover, many observers believe biomanufacturing processes hold the greatest near-term promise for AI within the larger biopharma industry outside of drug discovery. But machine learning in this field looks very different than it does in other industries. Not surprisingly, detecting the behavioral patterns of bacteria requires very different approaches than detecting the behavioral patterns of people.

Measuring Cell Cultures isn't like Measuring Website Visitors

More advanced analysis of cell cultures is now taking place in academic research in order to increase the quality of the cultures used to produce biologic drugs. Computational biologists are applying algorithms to improve yields of E. coli by optimizing batch quantities without monitoring actual cells. This approach isn't based on the predictive analytics that are often used in AI-based customer analysis in marketing and website optimization, but rather trial-and-error analysis of data from a "virtual" cell culture. Essentially, the algorithms provide rapid pattern recognition on activities in the culture that would take people much longer to assess. The next step, though, is to move out of the virtual world and bring the techniques to real bioreactors.
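
The flavor of that trial-and-error approach can be sketched with a toy example: a made-up "virtual culture" yield function explored by random search. The yield curve and parameter ranges below are hypothetical stand-ins for a real kinetic model, not an actual E. coli simulation.

```python
import random

# Toy "virtual culture": a made-up yield function standing in for a real
# kinetic model of a fed-batch process. All parameters are hypothetical.

def virtual_yield(feed_rate: float, induction_hour: float) -> float:
    """Pretend titer (g/L) as a function of two process parameters."""
    return (
        5.0
        - 0.8 * (feed_rate - 1.2) ** 2         # yield falls off away from an optimum feed rate
        - 0.05 * (induction_hour - 18.0) ** 2  # and away from an optimal induction time
        + random.gauss(0, 0.05)                # batch-to-batch noise
    )

# Random search: evaluate many simulated "batches" and keep the best settings.
random.seed(0)
best = max(
    ((random.uniform(0.5, 2.0), random.uniform(8, 30)) for _ in range(500)),
    key=lambda p: virtual_yield(*p),
)
print(f"Best settings found: feed_rate={best[0]:.2f}, induction_hour={best[1]:.1f}")
```
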
AI in biomanufacturing is not only heading down a different path than the heavily software-based approaches used in marketing and website analytics, it is also heading into an environment where algorithms have to learn continuously from ongoing manufacturing processes, not just by sifting through massive databases. Measuring cell cultures also requires monitoring hardware that does not contaminate the dividing cells.

Hardware Plays a Crucial Role in Biomanufacturing Machine Learning

Taking advantage of photonic products used to monitor and manage fiber optic networks in telecommunications, Raman spectroscopy is one technology that can measure product and process attributes using lasers and infrared wavelengths. Cultures of Chinese Hamster Ovary cells, which are commonly used to grow antibody therapies, scatter that laser light in characteristic ways during bioprocessing, and a Raman spectrometer can read those signatures without interfering with production. While these devices add to the capex needed to build out a facility, they can reduce opex through labor savings and more precise quality measurements during both upstream and downstream processing.
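
In practice, Raman spectra are usually related to culture attributes such as glucose or titer with chemometric models like partial least squares. The minimal sketch below assumes NumPy and scikit-learn are available and uses synthetic spectra rather than real measurements; it only shows the general shape of that calibration step.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Minimal chemometrics sketch: relate Raman-like spectra to a culture attribute
# (e.g., glucose concentration) with partial least squares.
# The "spectra" below are synthetic placeholders, not real measurements.

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 40, 300

glucose = rng.uniform(1.0, 6.0, size=n_samples)             # g/L, hypothetical
peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 120) / 8) ** 2)
spectra = (
    glucose[:, None] * peak[None, :]                         # signal proportional to glucose
    + rng.normal(0, 0.05, size=(n_samples, n_wavenumbers))   # detector noise
)

model = PLSRegression(n_components=2)
model.fit(spectra[:30], glucose[:30])                        # calibrate on 30 batches
predicted = model.predict(spectra[30:]).ravel()              # predict held-out batches
print(np.round(predicted, 2))
print(np.round(glucose[30:], 2))
```
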
Beyond Raman spectroscopy, many other IoT (Internet of Things) sensors are built for higher volume manufacturing environments with less sensitivity. Chemical manufacturers and oil refineries use these devices to a large extent, but these same IoT sensors can’t be placed in a bioreactor without impacting the cell culture. Outside of basic environmental monitoring in the space surrounding the bioreactor, customizing industrial sensors for biomanufacturing would be very expensive for the device suppliers, and leave them with limited volumes over which to cover their fixed development costs. As a result, these types of devices are not used as widely in biomanufacturing.
Sensors and spectrometers are needed on the hardware side; on the software side, there is growing use of AI and analytic tools to help automate biomanufacturing. While the additional data can be useful in places, it is still faster and easier to adopt in less regulated industries. Biomanufacturing process analytics can certainly take advantage of more frequently sampled data, but development of Process Analytical Technology must be coordinated with the FDA.
Dependent on learning through experience, and needing to conform to the demands of a regulated industry, biomanufacturing AI will likely see its growth accelerated by COVID vaccine scale-up in the next 12 months, but perhaps not at the torrid pace some are hoping for. Nonetheless, there could be greater uses a few years down the road as biomanufacturing faces new unit cost, quality, and scale-up challenges with the higher volumes brought about by biosimilars, cell and gene therapy, and even gene editing. COVID might just be a stop along the way to a deeper industry embrace of AI.


Monday, October 12, 2020

DNA Reads are Cheap, What About DNA Writes?



Thanks to large gains in compute power, the costs to read DNA have dropped exponentially (23andMe, Ancestry, and the like are the result of this), but the costs to write DNA have only dropped linearly. However, there is a lot of work going on in DNA write technologies to address this.
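
The gap is easy to see with a toy comparison of the two cost trajectories. The curves below are normalized and purely illustrative; they are not actual sequencing or synthesis prices.

```python
# Why reads got cheap faster than writes: a cost that halves every year
# (roughly exponential, as with sequencing) diverges quickly from one that
# drops by a fixed amount each year (roughly linear, as with synthesis).
# Both curves are normalized to 1.0 at year 0 and are purely illustrative.

years = range(0, 11)
read_cost = [1.0 * (0.5 ** y) for y in years]                # halves every year
write_cost = [max(1.0 - 0.08 * y, 0.05) for y in years]      # fixed drop per year, with a floor

for y in years:
    print(f"year {y:>2}: read {read_cost[y]:.3f}  write {write_cost[y]:.3f}  "
          f"write/read ratio {write_cost[y] / read_cost[y]:,.0f}x")
```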

Most synthetic DNA has been made the same way since the early 80s, using a serial process known as phosphoramidite synthesis.  This method attaches each nucleotide to the growing strand one at a time, and it limits synthetic DNA strands to about 100 bases in total length before errors, time, and cost challenges set in. However, a series of companies funded over the last couple of years is seeking to change this and improve the reliability and economics of writing DNA.
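
The length ceiling follows from simple arithmetic: every coupling step succeeds only some fraction of the time, so the yield of full-length strands decays exponentially with length. A quick sketch, assuming an illustrative 99% per-step coupling efficiency in the range usually quoted for phosphoramidite chemistry:

```python
# Full-length yield of a chemically synthesized strand decays exponentially
# with length, since every coupling step must succeed.
# 0.99 is an illustrative per-step coupling efficiency, not a vendor figure.

coupling_efficiency = 0.99

for length in (50, 100, 200, 300, 500):
    full_length_yield = coupling_efficiency ** (length - 1)
    print(f"{length:>4} bases: {full_length_yield:6.1%} of strands are full length")
```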


Enzymatic DNA is being proposed as a way to overcome the limits of traditional synthetic DNA construction.  By using an enzyme, in this case terminal deoxynucleotidyl transferase (TdT), new nucleotides can be added without the harsh chemistry needed to engineer each additional attachment to an existing strand.


To date, enzymatic DNA hasn't gone much further than traditionally synthesized DNA, and has been running at a functional limit of around 300 base pairs.  But a series of recently funded companies is looking to expand that limit and reduce the cost of synthetic DNA from the 9-15 cents per base range it has been in for the last few years.


Among these companies, DNA Script and Nuclera are developing enzymatic DNA printers that are planned to hit the market in the next 6-18 months.  Both are following the business strategy Illumina has set in DNA sequencing, selling hardware to companies and labs.  DNA Script, based in France with a US headquarters near San Francisco, raised $50 million in July.

San Diego-based Molecular Assemblies raised a $12 million Series A last year and has partnered with protein engineering firm Codexis to license enzymatic DNA technology to other companies.  In a recent Nature Biotechnology article, the company described its strategy as providing the ink, not the printer.


Bay Area-based Ansa Biotech recently raised $8 million to become an enzymatic DNA service provider, similar to what publicly traded Twist Bioscience currently does with traditional synthetic DNA.  Twist itself is also using traditional synthesis to archive digital data with DNA, something recently funded Kern Systems is looking to do as well with enzymatic DNA.


Enzymatic DNA will likely need to get to 500-1000 base pair strands to start to push costs down for synthetic DNA.  In turn, lower costs could stimulate demand for new applications, much as lower costs did for DNA reads and sequencing.  The emerging enzymatic DNA industry is structuring itself for that growth, and within a few years we could see strong businesses develop in writing DNA, not just reading it.


A Biotechnology Definition for the Non-Scientist

To a non-scientist, I would say biotechnology is three things: building physical goods with living organisms, modifying living organisms to improve their health or utility, or developing information products using biological data.   

Living organisms like yeast can ferment grain into beer, which is obviously an old practice.  But since the 1980s, living organisms have been used increasingly in drugs, which used to be only chemically produced pills.  A biotechnology drug like growth hormone is made by living organisms, namely bacteria, while your bottle of Tylenol is all chemicals.

Living organisms can also have their genes modified to improve their health or utility, which is the second category of biotechnology in the definition above.  This is the part of biotechnology that produces "genetically modified" crops.  It also includes ethanol made with genetically modified microbes and used as a cleaner fuel.  Gene modification therapies are still fairly experimental in humans, but there are FDA-approved drugs now that allow for this, and there will likely be more in the future.

The third category of biotechnology is information products based on biological data.  That biological data is typically DNA, of which you have 3 billion base pairs in each of the tens of trillions of cells in your body, so the amount of data you as a human being carry is exceptionally high.  Advances in computing technology have made it much cheaper to read this data and have led to services like 23andMe, Ancestry, and Invitae, which can read your DNA to trace your roots or test you for genetic diseases.  This is also the part of biotechnology that covers the DNA testing used to solve crimes and settle paternity cases.

Because biotechnology can provide information about people more cheaply than it could even 10 years ago, and also treat many diseases more effectively than chemically-produced drugs, its use is likely to expand significantly over the next decade. 

Saturday, October 10, 2020

What is Biotechnology? A Biotechonomics Definition

Biotechnology is the use of living organisms to build physical items. While often associated with therapeutics, there are a wide range of products in agriculture, food, energy, plastics, even electronics where biotechnology components can serve as raw materials. 

Biotechnology can also be used to build information-based products, particularly in genomics, but these are often used as inputs to select therapies. 23andMe, Ancestry, and personal DNA sequencing and genotyping are probably the most widespread information-based biotech products.

Many well known biotechnology products are therapeutics. Genentech received approval in 1985 for the first recombinant-DNA therapeutic marketed by a biotech company itself, and since then biotech-based therapies such as Humira, Remicade, and Rituxan have each generated more than $1 billion in annual revenue.

While fermentation has long been used to make foods, a new generation of biotech-based foods has come to market replacing animal-based dietary items. Most notably, Beyond Meat produces hamburger patties using a mixture of plant extracts and proteins.

Sometimes overlooked, electronics and digital hardware are a growing area for biotechnology. Zymergen, for example, is developing microbial-based films that can be used on printed circuit boards and ultimately produce thinner screens and more power-efficient electronic components. Their work builds on the success of OLEDs (Organic Light Emitting Diodes), which are now commonly used in flat panel TVs and iPhones. Another highly advanced use of biotechnology in electronic hardware is Twist Bioscience's DNA storage platform, which can be used to archive digital data by coding digital bits into the DNA bases adenine, thymine, cytosine, and guanine. Netflix's series Biohackers was recently backed up using this technology.
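
The core idea behind DNA data storage, mapping two bits to each base, can be sketched in a few lines. Real systems such as Twist's add error correction and avoid problematic sequences like long homopolymer runs; the minimal version looks like this:

```python
# Minimal sketch of encoding digital data as DNA: two bits per base.
# Real DNA storage systems layer on error correction and sequence constraints;
# this only shows the core bit-to-base mapping.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Biotech")
print(strand)          # 28 bases, 4 per input byte
print(decode(strand))  # b'Biotech'
```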

With biotechnology permeating such a wide variety of applications now, it is poised to move well beyond therapeutics, and could well become a dominant raw material for all sorts of goods in the future.

Thursday, September 24, 2020

Can Process Intensification Reduce Biomanufacturing Costs?

Rentschler Biopharma, a global CDMO, claims to have sped up its upstream cell culture process by 35% by seeding a bioreactor with a greater number of cells.  The objective was to reduce the time needed for cells to grow in a typical fed batch process.   By increasing the seeding, more biomass was created, which in turn accelerated the proliferation of cells within the culture.

With the expectation of reduced time to peak cell production, monitoring the culture took a higher priority.  To support this, the company invested in spectroscopy devices that could automatically probe the bioreactor without having to take human measurements.  This also reduced the risk of contamination because there was no need to have a person pull out samples.

Not only does this process save time, it reduces the number of steps needed to get to the “N” bioreactor phase.  Only the N-1 bioreactor step is needed before putting the culture into production.  Therefore, this approach fits into the category of “N-1 Perfusion” many in the industry are investigating as a means to better cost efficiency.

The article provided little detail on how time savings translated into cost savings, and while it mentioned the process was cGMP-compliant, little data was offered to compare quality metrics.  Another major item missing from the story was the cost of raw materials.  There's clearly a financial trade-off here, and it would have been useful to quantify it.  Moreover, in the crowded CDMO market, Rentschler, which entered the US in 2019 when it acquired a manufacturing facility near Boston, has been trying to distinguish itself as an innovator in process intensification for both upstream and downstream manufacturing.  This requires it to make investments elsewhere, and the trade-off might not pay off.

Rentschler believes that moderate intensification, reducing the time in the bioreactor from 16 days to 12, minimizes costs.  Reducing the time further drove up material costs to the point that they exceeded the financial benefits of a shorter scale-up.
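
That trade-off can be framed as a simple cost curve: facility and labor costs scale with days in the bioreactor, while media and seed-train costs climb as the process is compressed. The figures below are hypothetical placeholders chosen only to show the shape of the argument, not Rentschler's actual economics.

```python
# Toy cost model for process intensification: shorter bioreactor occupancy
# saves facility and labor cost, but compressing the process drives media
# and seed-train costs up faster and faster.
# All figures are hypothetical placeholders, not Rentschler's actual numbers.

FACILITY_COST_PER_DAY = 40_000   # hypothetical $ per day of bioreactor occupancy
BASE_MATERIALS_COST = 150_000    # hypothetical materials cost for a 16-day process
COMPRESSION_PENALTY = 5_000      # hypothetical extra materials $ per (day saved)^2

def batch_cost(days_in_bioreactor: int, baseline_days: int = 16) -> int:
    days_saved = max(baseline_days - days_in_bioreactor, 0)
    materials = BASE_MATERIALS_COST + COMPRESSION_PENALTY * days_saved ** 2
    return FACILITY_COST_PER_DAY * days_in_bioreactor + materials

for days in (16, 14, 12, 10, 8):
    print(f"{days} days in the bioreactor: ${batch_cost(days):,} per batch")
# Under these made-up parameters the minimum lands at 12 days, mirroring
# the moderate-intensification sweet spot described above.
```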

Beyond materials savings, other CDMOs are implementing their own approaches to intensification as they prepare to increase production to accommodate biosimilars.  Samsung Biologics, Repligen, and others have developed N-1 perfusion schemes aimed at achieving similar productivity gains.

While specific approaches may be proprietary, potential cost savings could make N-1 perfusion more standardized across CDMOs as they aim to reduce costs.  However, it appears that better optimization of raw materials will be essential to making process intensification as cost efficient as it is time efficient. 

Wednesday, September 23, 2020

Scale Up vs. Scale Out

As the biotech industry grows and production volumes expand, biomanufacturers continue to look for new ways to reduce cost while accommodating this demand.  One way is to keep scaling up through larger, 10,000 L+ stainless steel bioreactors that have the potential to reduce cost per gram through larger cultures.  An alternative is to scale out, using 2,000 L and smaller single-use bioreactors that can run in parallel, or in different geographical locations, in order to reduce distribution costs and reach patients across the world.

In practice, many biomanufacturers are looking at combinations of scale up and scale out that can meet various demand levels.  However, with CDMOs taking a larger share of the industry’s manufacturing volume, they need to sustain processes for far more drugs than firms that only manufacture  products developed internally.  In turn, they are looking at scale out designs that allow them to produce a greater range of products than traditional scale up systems.


Single use bioreactors reduce the risk of cross contamination from products sharing a facility, and that plus their flexibility makes them ideal for scaling out production.  Additionally, multiple lines create redundancy should something go wrong with any particular batch.  


The growth of scale out systems does not mean scale up will disappear.  Low titer products like antibodies are often sold in high doses and at high volumes, which still fits well with large, reusable bioreactors.  Similarly, scale out is seen by some as being linked to specific products, particularly autologous therapies like CAR-T.  Due to the sensitivities of these therapeutics, which come from and go back into the patient, scale out can preserve quality better than scale up in some cases.  However, for more traditional allogeneic therapies, scale up can still offer not just lower cost but greater consistency.


While sometimes presented as alternatives to each other, it appears that scale up and scale out will both have applications to serve for years to come.  The decision to use one or the other will ultimately come down to the type of biologic being produced, not an overarching philosophy about how to manufacture therapeutics.


Tuesday, September 22, 2020

Bioprocessing 4.0 Will Require More Than Analytics


Industry 4.0 is a global effort to bring the latest digital technologies to manufacturing facilities across industries.  It is intended to reduce costs, improve quality, and deliver a host of other "good" things.  It grew out of European efforts in the early 2010s to update factories, and within the biomanufacturing world it is often referred to as "bioprocessing 4.0."  However, biotech manufacturers have been fairly slow to implement many of its technologies, and its benefits remain a little unclear.


One area where bioprocessing 4.0 is showing promise within biologics production is process analytics.  Collecting data during cycles, as opposed to only after batch completion or after individual steps within batch production, could return faster quality indicators.  However, unlike many other industries, biomanufacturing needs new measurement tools, not just sensors and computers, due to the very sensitive nature of the product.


Taking advantage of photonic products used to monitor and manage fiber optic networks in telecommunications, Raman spectroscopy is one technology that can measure product and process attributes using lasers and infrared wavelengths.  CHO cell cultures scatter that laser light in characteristic ways during bioprocessing, and a Raman spectrometer can read those signatures without interfering with production.  While these devices add to the capex needed to build out a facility, they can reduce opex through labor savings and more precise quality measurements during both upstream and downstream processing.


Beyond Raman spectroscopy, most other IoT (Internet of Things) sensors are built for higher volume manufacturing environments with less sensitivity. Chemical manufacturers and oil refineries use these devices to a large extent, but the same IoT sensors can't be placed in a bioreactor without impacting the cell culture.  Outside of basic environmental monitoring around the bioreactor, customizing industrial sensors for biomanufacturing would be very expensive for the device suppliers, and would leave them with limited volumes over which to cover their fixed development costs.  As a result, the devices are not used as widely in biomanufacturing.


Sensors and spectrometers are needed on the hardware side; on the software side, many consultants are promoting "AI" and analytic tools to help automate biomanufacturing.  While this data can be useful in places, it is more widely used, and easier to adopt, in less regulated industries.  Biomanufacturing process analytics can take advantage of additional data points, but development of Process Analytical Technology must be coordinated with the FDA. This is unlike algorithms that can find usage patterns among logs of website visitors, a common application for AI.


While bioprocessing/industry 4.0 could deliver some benefits for biomanufacturing, particularly on the hardware side in difficult-to-measure mammalian cell cultures, it seems like it’s still a long way from being widely accepted in this heavily regulated sector of the economy.