Cell Culture Dish Podcast
Author: Brandy Sargent
Description
The Cell Culture Dish (CCD) podcast covers areas important to the research, discovery, development, and manufacture of disease and biologic therapeutics. Key industry coverage areas include: drug discovery and development, stem cell research, cell and gene therapy, recombinant antibodies, vaccines, and emerging therapeutic modalities.
84 Episodes
In this podcast, we spoke with Troy Ostreng, Senior Product Manager, and David Burdge, Director of Cell and Gene Therapy, at CPC about the development of the MicroCNX® aseptic micro-connectors and how they’re helping biopharma teams streamline closed-system operations for cell and gene therapies. What unfolded was a detailed and forward-looking conversation that touched on CPC’s 47-year legacy, the technical demands of advanced therapies, and the company’s plans to drive the future of automation and sterility in manufacturing.
A Legacy That Positioned CPC for Today’s Advanced Therapy Boom
When asked how CPC’s long history in biologics and hospital environments prepared the company for today’s cell and gene therapy landscape, David took us back to CPC’s roots. “CPC was founded in 1978, so that’s 47 years of innovation within connection technologies,” he said. “The first biologic was released in 1982, synthetic insulin, and we were there supporting the industry with open-format connectors on single-use bags.”
From the early development of biologics through the shift to single-use and the rise of stainless-steel/single-use hybrid systems, CPC continuously evolved its connection technologies. They launched steam-through connectors as bioprocessing grew more complex, released their first aseptic connector in 2009, and introduced their first connector specifically targeted for the cell and gene therapy market in 2017.
David explained how that history matters today: “Biologics has about a 35-year head start on advanced therapies. So the question becomes, what lessons can we transfer from biologics to cell and gene therapy as that industry grows at three to four times the rate biologics did in its first decade?”
That perspective, combining biological manufacturing experience with the needs of new therapy modalities, forms the foundation for CPC’s MicroCNX platform.
MicroCNX: The First Aseptic Connector Built for Small-Format Tubing
As cell and gene therapy developers began scaling up manufacturing, they quickly discovered a problem: the connectors used for biologics were not designed for small-volume, patient-specific therapies. Troy described it plainly: “Several years ago, we started hearing rumblings that current connectors weren’t meeting what cell and gene therapy required.”
CPC responded with a deep Voice of Customer (VOC) initiative, interviewing process engineers, operators, manufacturing leaders, and platform developers. Over and over, the same needs emerged.
Operators wanted something simple. “Ease of use was the number one requirement,” Troy said. “Operators needed a product that was easy to use so they could make sterile connections in a short amount of time.”
Processes demanded robustness. “Customers needed a connection they could trust—no contamination, no failures, no weak spots in the connection process,” he added.
Small-volume precise applications required connectors actually designed for them. With autologous therapies, he noted, “We aren’t talking about 1,000 liters; we’re talking about 250 milliliters. And if there’s a mishap, that could mean the difference between life and death for a patient.”
All of this laid the groundwork for MicroCNX, which became the first aseptic connector engineered for small-format tubing.
The “Pinch-Click-Pull” Process: Sterility Meets Speed
One of the standout features of MicroCNX is its elegantly simple pinch-click-pull operation. Troy explained how this simplicity came directly from user feedback. “As operators walked us through their pain points, what they needed was clear: a connector they could learn immediately. So MicroCNX has a three-step process—pinch, click, pull. You can literally do it as fast as I say it.” He continued, “Once someone does it one time, they’re basically an expert. That ease of use dramatically reduces operator error.” For an industry where operator variability remains one of the biggest sources of risk and batch loss, eliminating complexity is critical.
Cryogenic Challenges Call for Cryo-Rated Solutions
As the conversation shifted to cryopreservation, a critical component of cell therapy manufacturing, Troy introduced the MicroCNX® ULT and MicroCNX® Nano variants. “These were really developed because therapies were being frozen to –150°C, even –190°C. You need a connector that can be frozen to those temperatures, thawed, and still be as robust as it was before.”
The ULT and Nano were engineered with:
Low-profile geometries to fit inside freezing cassettes
Specialized materials to withstand thermal stress
Chemical compatibility with DMSO and other cryoprotectants
Enhanced durability to survive impacts while frozen
Troy emphasized how critical it was to get the materials right: “We searched extensively for a material that could handle those harsh chemicals and temperatures. What we landed on was PPSU—polyphenylsulfone. It’s chemically sound, and it’s incredibly impact-resistant at very low temperatures.” CPC built these connectors because customers repeatedly told them: existing solutions were cracking, leaking, or becoming brittle. MicroCNX was engineered to overcome all of that.
True Closed Systems vs. Functionally Closed Systems: Why the Difference Matters
A substantial part of the conversation focused on the differences between closed, functionally closed, and open systems—distinctions that are often overlooked but critically important.
Troy broke down the differences clearly: “An open system is exposed at some point. A functionally closed system is inherently open but gets closed temporarily to let fluid transfer. In comparison, a closed system is never open at any point.”
Examples of functionally closed systems include:
Biosafety cabinets (BSCs)
Luer-based connections
Closed system transfer devices
These approaches require:
Sanitization
Careful environmental controls
Operator expertise
And, as Troy noted, “a mishap in one of these can mean losing a very valuable therapy.”
CPC’s sterile connectors, including MicroCNX, minimize these risks: “Our connectors allow the system to remain closed 100% of the time. That greatly reduces contamination risk.”
This distinction isn’t merely academic—it has direct regulatory implications as well. David added, “In Annex 1, they refer to intrinsically sterile connection devices—like sterile connectors and tube welders—that allow operations normally requiring Grade A or B to occur in a Grade C or D environment.” That ability to operate safely in lower-grade spaces is increasingly critical as the industry tries to overcome facility and labor bottlenecks.
Why Teams Are Moving Away from Tube Welding
Tube welding has been part of bioprocessing for decades, but David explained why its era may be ending for CGT. “Tube welding was born out of the blood banking industry when no other solution existed. But sterile connectors don’t require capital investment. They’re faster. They eliminate issues like tubing alignment or pinhole leaks. They’re simply more reliable.” As biologics manufacturers have already done, CGT teams are now transitioning toward connectors like MicroCNX® that provide sterile, consistent, low-burden operations.
The MicroCNX® Luer Variant: Supporting Transitional Workflows
Not all workflows are ready to move away from luer-based devices. That’s where the MicroCNX Luer variant fits in.
Troy described how it works. “You connect a syringe or bag with a luer inside the BSC, but then because the MicroCNX® connector itself is sterile, you can take it outside the hood and make a sterile connection elsewhere.” This capability bridges legacy workflows and fully closed systems—critical during process development, technology transfer, or when working with specific devices.
Co-Development: The Heart of CPC’s Innovation Process
As the conversation returned to CPC’s broader philosophy, David highlighted how important customer collaboration is. “It’s all about the customer for CPC,” he said. “We start with Voice of Customer. Our business and applications managers are out in the field understanding real applications and guiding them to the right products.”
This feedback fuels CPC’s two major development tracks:
Catalog product development (platforms like MicroCNX)
Custom-engineered solutions for unique applications
David added: “We maintain a full new product introduction roadmap. Some products will be released broadly. Others will be developed specifically for one customer. But both are driven by real application requirements.” This process ensures CPC’s products evolve in lockstep with the needs of advanced therapy teams.
Looking Ahead: Designing Connectors for Robotics and Automation
Toward the end of the conversation, David turned to one of CPC’s biggest focus areas: the future of automation. “The ultimate customer in this industry is the patient,” he said. “And right now we face barriers—capacity, speed, accessibility, cost. Process automation can significantly reduce those barriers.”
Automation requires connectors designed not just for human hands but for robotics:
Predictable geometries
Features optimized for machine vision
Forces and actuation steps compatible with robotic grippers
Designs intended for automated loading and unloading
David summarized CPC’s future direction: “We’re taking a fresh look at our connectors, reimagining them as something designed for robotic manipulation. It’s a high priority for us.” Troy echoed the sentiment: “Our connectors are awesomely designed for humans. But automation is coming, and we’re focused on the features robots need.”
A Future Built on Innovation and Patient Impact
The interview closed with both guests reflecting on CPC’s mission.
“We’re incredibly passionate about innovation and meeting the needs of our customers through thoughtful product development,” Troy said.
Surfactants are indispensable in the production of biologics, vaccines, and cell therapies. Yet for years, they’ve posed a persistent challenge: they are notoriously difficult to monitor accurately and in real time. That challenge is now being addressed by Nirrin and its groundbreaking Atlas platform, a real-time spectroscopy solution that is reshaping how biomanufacturers measure and manage surfactants.
In this episode of The Cell Culture Dish podcast, Editor Brandy Sargent spoke with Bryan Hassell, Founder and CEO of Nirrin, and Hannah Furrelle, Analytical Scientist at the company, to discuss the science behind Atlas and its implications for bioprocessing.
Real-Time Data Without Compromise
At the core of Atlas’s innovation is its ability to provide high-quality quantitative data in under a minute—without any sample preparation. “The real breakthrough with Atlas is speed with confidence,” explained Hassell. “Time to market for biopharma is increasingly critical, yet a lot of critical decisions still rely on data from assays that take days or even months. Atlas changes that.”
Unlike traditional techniques, which often require significant sample manipulation and suffer from matrix interference, Atlas uses high-precision tunable laser spectroscopy to directly analyze samples in their native form. “What makes Atlas so powerful is that we’re looking at the sample without altering it,” Furrelle explained. “That means the data we get is true to the process—there’s no distortion from prep steps or artifacts introduced by the method.”
Moving Beyond PLS: A New Modeling Approach
One of the technological breakthroughs enabling this leap in performance is Nirrin’s move away from PLS models in favor of an iterative optimization framework. This approach eliminates the need for extensive training data, reducing model complexity while increasing robustness and flexibility.
“Where a PLS model might need 20 to 30 bioreactor runs to build a dataset, Atlas delivers data on the fly,” Hassell said. “It’s not only faster, it’s more robust, more compliant, and more versatile—especially for applications like scale-up or tech transfer, where traditional models often break down.”
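The contrast between the two approaches can be sketched in a few lines. Nirrin’s actual algorithm is proprietary, so the following is only an illustration of the general idea with synthetic data: rather than training a PLS model on dozens of bioreactor runs, each measured spectrum is fit directly as a combination of known pure-component reference spectra. All wavelengths, peak positions, and concentrations below are made-up placeholders.

```python
import numpy as np

# Synthetic NIR-style band; values are illustrative, not instrument specs.
wavelengths = np.linspace(1300, 1700, 200)

def gaussian(center, width):
    """Toy absorbance peak used as a stand-in for a reference spectrum."""
    return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

# Reference spectra for two hypothetical components
# (e.g. a surfactant and a protein), one column each.
refs = np.column_stack([gaussian(1400, 30), gaussian(1550, 40)])

# Simulate a measured mixture: 0.8 and 2.5 (arbitrary mg/mL) plus noise.
true_conc = np.array([0.8, 2.5])
noise = np.random.default_rng(0).normal(0, 0.01, len(wavelengths))
measured = refs @ true_conc + noise

# A direct least-squares fit recovers the concentrations from a single
# spectrum, with no training dataset of prior runs required.
est, *_ = np.linalg.lstsq(refs, measured, rcond=None)
print(est)  # close to [0.8, 2.5]
```

The point of the sketch is the data requirement: the fit needs only the reference spectra, whereas a PLS calibration would need a library of completed runs before it could predict anything.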
Applications Across the Biomanufacturing Workflow
Atlas is already being integrated into real-world bioprocessing environments, including both batch and continuous manufacturing. In batch processes, manufacturers use Atlas to confirm critical parameters—like protein and excipient concentrations—before proceeding to the next unit operation. This enables earlier course corrections and helps prevent downstream failures.
“In the past, you either waited days for lab results or moved forward at risk,” said Hassell. “Atlas provides the immediate answers needed to make confident decisions in the moment.”
For continuous manufacturing, the value is equally profound. Atlas provides the real-time, quantitative feedback necessary for dynamic process control. “You can’t have continuous processing without real-time data,” he said. “Atlas gives you the insights needed to support real-time decisions at every step.”
Eliminating Risk with No-Prep Analysis
One of Atlas’s standout features is its ability to deliver no-prep analysis. This eliminates sources of variability that often arise during sample handling and processing.
“We’re scanning samples in their native form,” said Furrelle. “That means what we’re measuring reflects what’s actually in the process—without distortion from dilutions or centrifugation.”
This no-prep capability also speeds up workflows and eliminates risk by allowing operators to verify component concentrations instantly before committing to the next step in production.
Laying the Foundation for Smart Biomanufacturing
Nirrin sees Atlas not just as a data tool, but as a stepping stone to smart biomanufacturing. Although full automation isn’t yet widespread, Atlas is helping to lay the groundwork by delivering trustworthy real-time data, something most operations have historically lacked.
“Right now, we’re focused on validating the technology and educating the industry,” said Hassell. “Without sensors like ours, you can’t have smart manufacturing. But once real-time data becomes available, everything else, automation, digital twins, AI, can start to fall into place.”
Furrelle agreed, adding, “You can’t automate without sensors. Atlas gives you real-time insights that teams can actually use, not work around.”
From QC Tool to Strategic Platform
As teams adopt quality-by-design (QbD) approaches, Atlas is being used well beyond its initial QC role. It’s becoming a platform for optimizing surfactant levels, improving batch-to-batch consistency, and proactively preventing formulation issues.
“You can’t have quality if you don’t start from a place of quality,” Furrelle said. “Validating components before you use them is no longer just ideal, it’s achievable.”
Supporting Regulatory Compliance and GMP Readiness
With growing expectations around regulatory compliance, Atlas offers distinct advantages for GMP workflows. By capturing data directly from the process, without modification, it ensures traceability, eliminates method error, and delivers reproducible results.
“Getting data that wasn’t previously accessible is foundational to compliance,” said Furrelle. “Most of the time, we assume buffer concentration is correct based on scale readings and signatures. With Atlas, you can actually measure and prove it.”
Hassell added that Atlas’s direct-measurement capability enhances confidence in audits and inspections. “When you can take a sample straight from the line and analyze it without modification, you eliminate layers of error. That’s critical for compliance and it’s something our customers are really gravitating toward.”
Built for Sustainability and Greener Excipients
Atlas also aligns with the industry’s increasing focus on sustainability. Its optical system requires no consumables, a rare feature in today’s single-use-dominated industry. “We deliberately avoided a consumables-based model,” Hassell said. “Our customers want tools that align with their long-term goals and sustainability is a big part of that.”
The platform also supports real-time validation of novel excipients, including greener alternatives. Its broad spectral range is packed with chemical information, enabling confident measurement of even chemically distinct surfactants.
“We’ve had great success supporting teams developing new excipients,” said Hassell. “The system’s flexibility and depth of information make it future-proof.”
Real-World Impact: Surfacing Hidden Issues
Early use cases illustrate the power of the platform. In one of the first beta tests, a biopharma team used Atlas to trace a suspected surfactant stability issue. “They didn’t know where the problem started, but within minutes, we helped them pinpoint exactly where the formulation went off track,” said Hassell. “It was a powerful example of real-time debugging.”
Furrelle recalled a similarly impactful use case involving ultrafiltration and diafiltration (UF/DF). “Traditionally, you wouldn’t know if you lost or enriched an analyte until after the step was complete. Atlas allows you to monitor the process live and optimize it. That’s a huge advance.”
A Broader Role in Process Development
With increasing adoption, Atlas is quickly evolving into a comprehensive process platform. “I’ve always thought of it as more than QC,” said Furrelle. “It’s helping teams move from retrospective quality control to proactive process design.”
More teams are now asking not just whether a step worked, but how to improve it. Atlas provides the data needed to make those decisions in real time.
What’s Next for Nirrin and Atlas
As of now, Nirrin is focused on validating the technology and supporting its integration into real-world workflows. With backing from the NIH and a growing customer base, the company has no shortage of momentum.
While the current focus is on protein-based therapeutics, Nirrin is already preparing for broader applications. “We have line of sight into other modalities, like cell and gene therapies,” said Hassell. “But first, we want to solve the problems in front of us really well and help the industry move forward with confidence.”
Looking Ahead: The Future of Surfactant Analytics
Both Hassell and Furrelle are confident that Atlas will play a central role in the future of surfactant monitoring. “We want Atlas to be the go-to instrument for companies developing new formulations,” said Hassell.
Furrelle emphasized the importance of robust analytics during tech transfer. “Right now, that handoff is often based on assumptions and broad specs. With Atlas, you can measure exactly what’s happening and that changes everything.”
Conclusion
Atlas isn’t just solving an old problem, it’s setting a new standard. With its speed, accuracy, and no-prep design, it’s enabling smarter biomanufacturing, better compliance, and more sustainable practices. As the industry moves forward, Atlas stands poised to become an essential platform for quality, innovation, and transformation in bioprocessing.
See Atlas in Action: https://www.nirrin.tech/nirrin-atlas-evaluation-contact-us/
In this podcast, we spoke with Nainesh Shah, Sr. Application Engineer, Asahi Kasei Bioprocess America, about how inline buffer formulation and their MOTIV® system offer a more efficient, scalable, and cost-effective approach to buffer preparation. Traditional methods require large storage spaces, pose risks of leakage, and create inefficiencies that can disrupt production. In contrast, inline buffer formulation enables real-time mixing of concentrated ingredients, eliminating storage constraints and allowing for dynamic adjustments based on demand. With benefits like reduced waste, lower costs, and improved regulatory compliance, this technology is streamlining operations while ensuring precision and adaptability. As the industry shifts toward smarter manufacturing solutions, inline buffer formulation is paving the way for the future of pharmaceutical production.
How Inline Buffer Formulation is Changing the Industry
Nainesh, who has over 40 years in the pharmaceutical industry and six years at Asahi Kasei, highlights the evolution of buffer preparation. "Traditionally, buffer dilution involved a concentrate formulated in advance, which was then diluted with water to achieve the desired solution."
Modern inline buffer formulation transforms this process by enabling real-time mixing of individual components. "Instead of storing pre-made buffer solutions, MOTIV allows for real-time formulation using individual components. The system precisely combines these ingredients on demand, ensuring accuracy and eliminating storage-related inefficiencies," Shah explains.
Enhanced Efficiency, Cost Savings, and Waste Reduction
The advantages of MOTIV extend beyond storage and formulation flexibility. "With traditional methods, production can be delayed if pre-made buffers aren’t readily available. If a change in concentration or formulation is required, additional time is needed for sourcing and preparation," Shah notes. "With MOTIV, you can use a single concentrated solution to create multiple buffer variants by adjusting the dilution ratio. This eliminates the need for multiple pre-concentrated stocks, reducing storage space, waste and increasing efficiency."
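The dilution arithmetic behind this is simple mass balance (C1·V1 = C2·V2) applied to flow rates: one stock concentrate can feed several buffer variants just by changing the concentrate-to-water flow ratio. The sketch below is a hypothetical illustration; the stock strength, targets, and 1,200 L/h figure are examples, not MOTIV specifications.

```python
def inline_flows(stock_conc, target_conc, total_flow_l_per_h):
    """Split a total output flow into concentrate and water flows so the
    blended stream hits the target concentration (C1*V1 = C2*V2)."""
    if target_conc > stock_conc:
        raise ValueError("cannot dilute up to a higher concentration")
    conc_flow = total_flow_l_per_h * target_conc / stock_conc
    water_flow = total_flow_l_per_h - conc_flow
    return conc_flow, water_flow

# One 10x stock, three buffer variants at a 1,200 L/h output flow.
# Targets are expressed in multiples of final-use strength.
for target in (1.0, 0.5, 2.0):
    c, w = inline_flows(stock_conc=10.0, target_conc=target,
                        total_flow_l_per_h=1200.0)
    print(f"{target}x buffer: {c:.0f} L/h concentrate + {w:.0f} L/h water")
```

Because only the flow ratio changes between variants, switching buffers requires no new stock solution, which is the storage and waste advantage Shah describes.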
Cost efficiency is another crucial factor. "Return on investment (ROI) depends on whether the facility has an existing buffer preparation setup or is installing a fresh system. For existing setups, ROI typically takes around two years due to transition considerations. However, for new installations, ROI can be achieved within 1.5 years," Shah states. He adds that Asahi Kasei provides an easy-to-use ROI calculator to help companies assess their financial benefits.
Additionally, inline buffer formulation improves sustainability by minimizing waste and reducing the environmental impact of excess buffer storage. By eliminating the need for large buffer stockpiles, facilities can lower their material costs and optimize resource utilization.
Scalability and Customization for Diverse Production Needs
One of the standout advantages of the MOTIV inline buffer formulation system is its scalability. "Our smallest system supports up to 1,200 liters per hour with three inlets—one for water and two for concentrates like acid, base, or salt solutions. On the higher end, we can scale up to 5,000 or even 12,000 liters per hour, completely customizable with multiple inlets based on customer requirements," says Shah.
This flexibility is particularly valuable for pharmaceutical manufacturers with varying production demands. Facilities producing multiple types of buffers can benefit from the system’s adaptability, allowing them to switch formulations with minimal downtime. Instead of maintaining separate storage tanks for different buffer types, inline buffer formulation enables dynamic adjustments based on real-time requirements.
Addressing Complex Formulations and Space Constraints
MOTIV is particularly beneficial for high-volume buffer requirements and complex formulations. "As pharmaceutical processes advance, buffers require multiple ingredients, not just simple acid-base-salt combinations. MOTIV automates these complex formulations with precision, ensuring consistency while reducing human error," says Shah.
Moreover, these systems help optimize space usage. Traditional buffer preparation requires large storage tanks, which can be a logistical challenge for facilities with limited space. By replacing bulky storage units with a compact inline formulation system, pharmaceutical companies can free up valuable floor space for other critical operations, leading to improved overall facility efficiency.
Streamlining Global Operations and Regulatory Compliance
MOTIV enables pharmaceutical companies to streamline global operations. "A single facility’s buffer formulation data can be easily transferred to other locations, reducing the need for redundant validation and documentation. This simplifies global production expansion while maintaining quality and compliance," says Shah.
The ability to standardize buffer preparation across multiple facilities also aids in regulatory compliance. Pharmaceutical companies must adhere to stringent guidelines set by agencies such as the FDA and EMA. Inline buffer formulation ensures consistent production parameters, reducing variability and ensuring that each batch meets regulatory standards. By integrating real-time monitoring and control systems, manufacturers can also enhance traceability and data integrity, which are essential for compliance.
Overcoming Misconceptions and Ensuring Adaptability
Shah also addresses common misconceptions about inline buffer formulation. "Some worry that once the system is set up for a specific product, it can't accommodate new formulations. However, as long as the flow capacity and analytical requirements remain compatible, new buffers can be integrated seamlessly. Even additional analytical technologies can be incorporated if planned ahead."
Another concern among industry professionals is whether inline buffer formulation can maintain the same level of accuracy as traditional methods. Shah assures that modern inline systems are equipped with precise monitoring technology to ensure consistency. Real-time adjustments can be made to maintain optimal formulation parameters, eliminating the risk of deviations.
The Future of Buffer Preparation: Embracing Innovation
Asahi Kasei’s MOTIV system is setting a new standard in buffer preparation. As pharmaceutical manufacturing continues to evolve, adopting innovative solutions like inline buffer formulation is becoming essential. By offering flexibility, cost savings, and process optimization, inline buffer formulation provides a robust solution for modern pharmaceutical production.
"The industry is shifting towards efficiency and adaptability. With advanced inline buffer formulation, companies can future-proof their operations while ensuring consistent quality and regulatory compliance," Shah concludes.
In an era where precision and efficiency are paramount, inline buffer formulation stands out as a transformative technology. As industries evolve, solutions like these ensure that production remains agile, scalable, and ready to meet the demands of the future.
Upgrade to Better Buffer Management: Inline Buffer Formulation Systems | MOTIV®
In this podcast, we spoke with Dr. Jorge Escobar Ivirico, Product Manager, Bioprocess Solutions at Eppendorf, about the fascinating world of induced pluripotent stem cells (iPSCs), exploring their groundbreaking potential in regenerative medicine, personalized therapies, and drug development. Our guest explained how iPSCs, created by reprogramming adult somatic cells, can differentiate into virtually any cell type, making them invaluable for research and therapeutic applications. We delved into the importance of consistency, quality control, and reproducibility in iPSC production, alongside the challenges of culturing these cells, such as maintaining pluripotency and scaling production for clinical use. The discussion highlighted exciting advancements, including the development of organoids and universal T cells, as well as the ethical considerations distinguishing iPSCs from embryonic stem cells. Looking to the future, Jorge envisioned iPSCs becoming a cornerstone of standard medical practice, while acknowledging the need to address safety, scalability, and regulatory hurdles to fully realize their potential.
What are Induced Pluripotent Stem Cells (iPSCs)?
"Induced pluripotent stem cells are a type of stem cell created by reprogramming adult somatic cells, like skin or blood cells, back into an embryonic-like state," explains Jorge. This process involves introducing specific transcription factors, often called Yamanaka factors, to transform these cells into a versatile state. Once reprogrammed, iPSCs can differentiate into almost any cell type, making them invaluable tools for research, drug development, and potentially life-changing therapies.
The Growing Importance of iPSCs
iPSCs offer a range of advantages, particularly their ability to sidestep ethical concerns tied to embryonic stem cell use. “What makes iPSCs so important today,” Jorge notes, “is their versatility and potential applications. Researchers can create patient-specific cell lines, which are essential for drug screening, disease modeling, and personalized medicine.”
This technology is pivotal for regenerative medicine, offering hope for repairing damaged tissues and organs. “From neurodegenerative diseases to heart damage, iPSCs open the door to innovative treatment possibilities,” he adds.
Mastering the Production Process
Producing iPSCs is a meticulous endeavor. "Consistency is key," emphasizes Jorge. Researchers must ensure that each batch of cells meets strict criteria to avoid unpredictable outcomes, especially when precision is vital in both research and therapeutic applications.
Standardized protocols and quality control measures are essential to achieve consistency. These involve monitoring for contamination and verifying the cells' ability to differentiate into various cell types. “Imagine developing a therapy based on a specific batch of cells, only to find that subsequent batches behave differently,” he warns. “Such inconsistencies can jeopardize patient outcomes.”
Tackling Challenges in Culturing iPSCs
Culturing iPSCs presents its own set of challenges. High cell numbers are often needed for large-scale research or therapeutic applications, but scaling up production without compromising quality is no small feat. Maintaining the cells’ pluripotent state is another hurdle, as they can easily differentiate prematurely under certain culture conditions.
"Environmental parameters like temperature, pH, oxygen levels, and nutrient availability must be rigorously controlled," Jorge explains. “Even minor fluctuations can negatively impact cell health and their ability to remain pluripotent.”
Innovations Addressing Culturing Hurdles
To overcome these challenges, researchers are turning to advanced techniques like 3D culture systems and bioreactors. These provide a more natural growth environment for the cells, enhancing their viability and functionality. “By transitioning from traditional 2D cultures to 3D systems, we can better mimic the natural environment, significantly improving outcomes,” Jorge shares.
3D systems also increase growth surfaces for cell attachment and allow for higher cell density, which is critical for producing the large quantities required for therapies. "It’s not just about quantity," he notes. "These systems help maintain the quality of the cells, ensuring consistent results across labs and facilities worldwide."
Transformative Applications of iPSCs
One of the most exciting applications of iPSCs is in creating universal T cells for therapies like CAR-T cell treatments. “Traditional CAR-T therapies require harvesting a patient’s cells and modifying them—a time-intensive process that isn’t suitable for all patients,” Jorge explains. "With iPSCs, we can create a bank of genetically modified T cells that can be used across multiple patients, reducing risks like graft-versus-host disease and speeding up treatment timelines."
Balancing Potential with Risk
Despite their promise, iPSCs come with risks. “If iPSCs are not fully differentiated before transplantation, they can form teratomas—tumors containing various cell types,” Jorge cautions. Prolonged culturing can also lead to genetic instability, raising concerns about mutations that could affect therapy outcomes.
Additionally, advanced techniques like CRISPR gene editing, while revolutionary, carry risks of off-target effects that could lead to unintended consequences. “Ensuring the safety and efficacy of iPSC-derived therapies requires rigorous preclinical testing and long-term monitoring during clinical trials,” he emphasizes.
The Role of Embryonic Stem Cells
While iPSCs offer an ethical alternative, Jorge argues that embryonic stem cells still have a role in research. "They provide unique insights into early human development and serve as important controls for studying pluripotency and differentiation," he explains. "Each type of stem cell has its own advantages, and we should leverage them where they are most effective."
A Glimpse into the Future
Looking ahead, Jorge envisions significant advancements in iPSC-based therapies over the next 5-10 years. “We’re likely to see more approved treatments for conditions like neurodegenerative diseases, heart disease, and certain cancers,” he predicts. Improvements in manufacturing and scaling processes will make these therapies more accessible, potentially transforming them from experimental solutions to standard medical practices.
Remaining Challenges
Achieving these milestones will require addressing several hurdles. "Safety and standardized production protocols are critical," Jorge stresses. Regulatory frameworks also need to evolve to keep pace with these advancements. “The future of iPSCs is incredibly promising, but we must remain diligent in overcoming the scientific, logistical, and ethical challenges ahead.”
The field of iPSCs continues to inspire hope for groundbreaking medical treatments, making this an exciting time in regenerative medicine and stem cell research. “iPSCs are truly a testament to human ingenuity,” Jorge concludes.
In an era where industries are increasingly driven by data and automation, the bioprocessing sector is embracing digital transformation to streamline workflows and improve productivity. However, blending the complex and highly regulated world of bioprocessing with digitalization poses unique challenges. In this podcast, we talk to Dr. Simon Wieninger, Head of Portfolio and Applications at Eppendorf SE, about how the journey toward digital integration requires well-defined goals, user-centered design, cross-industry learning, and, crucially, trust.
Setting Clear Goals: Purpose-Driven Digitalization
“Digitalization shouldn’t happen for digitalization’s sake,” Dr. Wieninger advises. While the temptation to adopt cutting-edge technology is high, each digital tool or system must serve a specific purpose. For bioprocessing organizations, establishing these objectives upfront is critical to ensure that digital investments yield meaningful results. Whether the aim is to boost productivity in production facilities, refine R&D processes, or improve operational efficiency in support functions like HR, having clearly defined goals anchors digital efforts in purpose.
This intentional approach is especially significant for production and R&D sectors within bioprocessing. Here, digitalization can streamline processes such as real-time data monitoring, automated adjustments to culture environments, and improved reporting and compliance tracking. By aligning digital goals with broader business objectives, organizations can make more effective use of resources and ensure that digitalization contributes positively to organizational growth.
Bridging Skill Gaps and Building Trust: Making Digital Tools Accessible
A successful digital transformation relies on the people who will use these tools day-to-day. However, not everyone in bioprocessing has a background in software or programming. Simon points out that for digital tools to be effective, they must be intuitive and accessible to all team members, from scientists in the lab to technicians on the production floor. "We need to design solutions that everyone can use," he says, noting the importance of user-friendly interfaces that require minimal technical knowledge to operate.
Part of building an accessible digital framework is understanding the varying comfort levels with technology within the workforce. Some employees may be tech-savvy, while others are less familiar with digital tools. Recognizing and accommodating these differences is crucial to creating a smooth transition. Moreover, as Simon explains, trust is fundamental—not only trust in digital tools but also in the partnerships with vendors and technology providers who support this transformation. Organizations should leverage the expertise of these partners, building collaborative relationships to create solutions that meet specific needs and ultimately make bioprocess workflows more efficient.
Learning from Other Industries: Adopting Best Practices in Automation and Standards
The bioprocess industry has much to learn from sectors like automotive, finance, and telecommunications, which have long relied on automation and standardized processes to boost efficiency. In automotive manufacturing, for instance, high levels of automation allow for the production of thousands of vehicles with minimal human intervention. Bioprocessing, by contrast, has historically been more manual and labor-intensive, particularly in R&D and small-batch production.
According to Simon, one of the greatest opportunities for bioprocessing is to adopt industry standards that facilitate automation and improve interoperability across devices. One such example is the OPC (Open Platform Communications) standard, widely used in other sectors for seamless communication between devices. Applying such standards to bioprocessing could simplify data integration across lab instruments and production equipment, allowing researchers to capture and analyze critical information more efficiently. Additionally, industry leaders could set standards for digital protocols and automation practices, which would pave the way for faster digital adoption across the field.
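The interoperability benefit Simon describes can be illustrated in miniature. The toy sketch below is not actual OPC UA code; it simply shows why a shared data schema matters: once vendor-specific payloads are normalized into one common reading type, downstream monitoring and analysis code no longer cares which instrument produced the data. All device names and payload shapes here are hypothetical.

```python
from dataclasses import dataclass

# Toy illustration of standards-based interoperability (not actual OPC UA code).
# A shared schema lets readings from different instruments be handled uniformly.

@dataclass
class Reading:
    device: str
    parameter: str   # e.g. "pH", "temperature"
    value: float
    unit: str

def from_vendor_a(payload: dict) -> Reading:
    # Hypothetical Vendor A format: {"sensor": "pH", "val": 7.1}
    return Reading(device="bioreactor-A", parameter=payload["sensor"],
                   value=payload["val"],
                   unit="pH" if payload["sensor"] == "pH" else "au")

def from_vendor_b(payload: dict) -> Reading:
    # Hypothetical Vendor B format: {"name": "temperature", "reading": 37.0, "units": "C"}
    return Reading(device="bioreactor-B", parameter=payload["name"],
                   value=payload["reading"], unit=payload["units"])

readings = [from_vendor_a({"sensor": "pH", "val": 7.1}),
            from_vendor_b({"name": "temperature", "reading": 37.0, "units": "C"})]
# Downstream code now sees one uniform type regardless of instrument vendor.
```

In a real deployment, a standard such as OPC UA plays the role of this shared schema at the protocol level, so the normalization step is handled by the standard itself rather than per-vendor adapter code.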
Addressing Regulatory Challenges: Transforming Compliance through Digitalization
The bioprocess industry operates under stringent regulatory standards, which some view as an impediment to digital innovation. However, Simon argues that regulatory requirements shouldn’t be seen as obstacles but rather as factors that digital solutions can help address. In industries like finance and insurance, also highly regulated, companies have successfully incorporated digital tools to streamline compliance. Similarly, in bioprocessing, digital platforms could help standardize and organize the data needed for regulatory filings, potentially reducing the time and resources required for compliance.
Digital solutions, including cloud-based platforms and automated data management systems, could be particularly transformative in regulatory processes. By harmonizing data and ensuring accuracy across workflows, digital tools can make compliance not only faster but more consistent. Additionally, some regulatory bodies are already engaging in discussions about how to incorporate technologies like AI and machine learning into compliance filings. Although these innovations may require time to be fully accepted, regulatory agencies are increasingly open to exploring how new technologies can support safe and efficient bioprocessing.
The Power of Digital Twins: Small Steps Toward Big Goals
A popular concept in many industries, "digital twins" offer exciting potential for bioprocessing. These digital replicas of physical systems enable real-time monitoring, predictive maintenance, and even scenario modeling. Yet, as Simon notes, digital twin technology is still relatively new to bioprocessing, and its implementation will require gradual steps.
In the construction industry, for example, a similar concept known as BIM (Building Information Modeling) integrates all data related to a building project, from design specifications to material inventories, making it available to all stakeholders. While bioprocessing might not yet be ready for an all-encompassing digital twin platform, companies can start small—implementing digital twins for specific workflows or equipment and gradually expanding over time. This incremental approach can yield immediate benefits while setting the foundation for larger-scale digital initiatives.
Embracing Cloud Technology and Real-Time Monitoring: Moving to Industry 4.0
Cloud platforms and AI have been at the forefront of Industry 4.0 for years, but only recently has the bioprocessing industry begun to fully embrace these technologies. Cloud storage allows for centralized data management, enabling teams to access information from any device, anywhere. This flexibility has profound implications for researchers and technicians, who can now monitor processes remotely. “Imagine checking on a bioreactor’s status from your phone while you’re out with your family,” Simon explains. This real-time access not only enhances convenience but also provides peace of mind to scientists who can verify that their cultures are growing as expected.
AI-powered real-time monitoring and data analysis tools also have significant benefits for process efficiency and productivity. By automating data capture and analysis, these technologies enable faster decision-making and reduce the likelihood of errors. Over time, the integration of cloud-based tools and automation could lead to smarter, more efficient bioprocessing workflows. With every iteration, companies will gain a better understanding of how to refine and optimize these processes, driving the industry toward more consistent and scalable digital practices.
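The automated monitoring described above reduces, at its simplest, to comparing live readings against acceptable ranges and flagging deviations. The sketch below is a minimal illustration of that idea; the parameter names and setpoint ranges are hypothetical examples, not values from any specific Eppendorf product.

```python
# Minimal sketch of threshold-based real-time monitoring. Setpoints are
# hypothetical illustrations, not recommendations for any real process.

SETPOINTS = {
    "temperature": (36.5, 37.5),       # degrees C
    "pH": (6.9, 7.3),
    "dissolved_oxygen": (30.0, 60.0),  # % saturation
}

def check_reading(parameter: str, value: float) -> str:
    """Return "OK" if the value is within range, otherwise an alert string."""
    lo, hi = SETPOINTS[parameter]
    if value < lo:
        return f"ALERT: {parameter} low ({value} < {lo})"
    if value > hi:
        return f"ALERT: {parameter} high ({value} > {hi})"
    return "OK"
```

A production system would layer alarm routing, logging, and automated corrective actions on top of this check, but the core decision at each sampling interval is the same range comparison.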
The Future of Bioprocessing: A Hybrid of Digital and Analog
Looking ahead, Simon envisions a future where bioprocessing laboratories achieve a balance between digital and analog technologies. While complete digitalization may not be feasible—or even desirable—within the next five years, incremental advances in data processing and automation will help researchers and technicians become more efficient and productive. Automation, particularly in areas like sampling, data integration, and real-time monitoring, will streamline lab work, allowing scientists to focus on research rather than routine tasks.
Ultimately, the goal of digital transformation in bioprocessing is to bring life-saving therapies, such as cell and gene therapies, to market faster. By reducing time spent on data management and process optimization, digital solutions empower scientists to concentrate on breakthrough discoveries. As data standards and digital infrastructure improve, the industry may also see an increase in collaborative projects, as cloud platforms make it easier for teams from different locations, and even different companies, to share data securely.
A Call to Collaboration: Sharing Knowledge for a Digital Future
Closing with a call to action, Simon invites bioprocess professionals to reach out, share insights, and learn from each other’s experiences with digitalization. “Digital transformation has incredible potential to accelerate progress in the bioprocess industry, and I’m excited to be part of that journey,” he says. By fostering a collaborative spirit and supporting each other in the quest for effective digital solutions, bioprocess organizations can collectively improve human health outcomes and bring new treatments to the world more rapidly.
In the end, bioprocessing’s journey toward digital transformation is not just about adopting new tools—it’s about building an industry that is faster, more resilient, and better equipped to meet the challenges of tomorrow.
In this podcast, we spoke with Dr. George Wang, Vice President of Discovery and Preclinical Services at WuXi Biologics, about the importance of identifying potential manufacturing, stability, and scalability challenges early to mitigate risks, reduce costs, and streamline drug development timelines. By evaluating factors such as solubility, stability, and manufacturability during initial candidate screening, companies can avoid costly setbacks later in the process. Advanced tools like high-throughput assays, computational modeling, and AI-based predictions are now essential for these evaluations.
What Is Developability?
Dr. Wang began by defining developability as the assessment of whether a drug candidate possesses the necessary attributes to be scaled up for production during Chemistry, Manufacturing, and Controls (CMC) development and, ultimately, for clinical trials and commercialization. He explained, “It’s about identifying potential red flags early on—issues like aggregation, degradation, or manufacturing inefficiencies—that could derail a candidate further down the line.”
Why Focus on Developability During Discovery?
Traditionally, discovery efforts have focused on identifying antibodies with the highest efficacy and safety profiles. However, the increasing complexity of biologics, including bispecific antibodies and antibody-drug conjugates, has shifted industry focus. Dr. Wang emphasized the costly consequences of overlooking developability in the discovery phase. “Imagine investing millions into a molecule, only to discover insurmountable stability or manufacturability issues during development,” he said. “Performing these assessments early is like an insurance policy, mitigating risks and saving time and resources.”
The Economic Case for Early Developability Assessments
Dr. Wang highlighted the economic rationale for incorporating developability assessments during the initial discovery phase. “The cost of discovery is less than 1% of the total development cost. Spending a bit more upfront can save millions in reengineering or restarting development,” he noted. He also pointed out that superior developability attributes can provide a competitive edge, enabling faster clinical trial entry or product approval.
Key Challenges and Industry Solutions
Despite its benefits, the integration of developability assessments in discovery labs faces challenges. Labs often lack the tools, materials, and expertise required for systematic evaluations. “Developability attributes must be assessed using a robust combination of computational methods, analytical tools, and high-throughput assays, which many labs are not equipped to handle,” Dr. Wang explained.
Companies like WuXi Biologics have stepped in to bridge this gap. “Our Discovery unit collaborates closely with our CMC team to identify and address developability issues early on,” said Dr. Wang. WuXi’s WuXiDEEP™ platform has become a cornerstone of this work, helping to fix more than 50 problematic molecules and guiding hundreds of projects through the development pipeline.
A Stepwise Approach to Developability
Dr. Wang outlined a stepwise approach to developability assessments, starting with high-throughput evaluations during the initial screening of hundreds of candidates. “We use computational analysis to identify red flags such as post-translational modification hotspots or aggregation risks,” he explained. Promising candidates then undergo more detailed assessments, requiring larger material quantities and lower-throughput methods.
Even when issues arise, solutions like protein engineering can salvage candidates with strong biological functions. “It’s not about discarding problem molecules outright but addressing and optimizing their developability profiles,” Dr. Wang emphasized.
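The stepwise funnel Dr. Wang describes can be sketched as a simple triage: cheap computational flags eliminate clear red-flag candidates first, and a composite score then shortlists survivors for more material-intensive assays. The attribute names, thresholds, and scoring below are hypothetical illustrations, not WuXi’s actual criteria or platform logic.

```python
# Toy sketch of stepwise developability triage. All attributes, thresholds,
# and scores are invented for illustration only.

candidates = [
    {"id": "mAb-001", "aggregation_risk": 0.2, "ptm_hotspots": 0, "expression_titer": 3.1},
    {"id": "mAb-002", "aggregation_risk": 0.8, "ptm_hotspots": 3, "expression_titer": 1.2},
    {"id": "mAb-003", "aggregation_risk": 0.4, "ptm_hotspots": 1, "expression_titer": 2.4},
]

# Step 1: high-throughput in silico filter -- discard clear red flags
# (high predicted aggregation risk, too many post-translational modification hotspots).
screened = [c for c in candidates
            if c["aggregation_risk"] < 0.6 and c["ptm_hotspots"] <= 2]

# Step 2: rank survivors with a composite score; higher expression titer and
# lower aggregation risk rank better.
ranked = sorted(screened,
                key=lambda c: c["expression_titer"] - c["aggregation_risk"],
                reverse=True)
shortlist = [c["id"] for c in ranked]  # candidates advancing to detailed assays
```

The point of the funnel shape is economic: the inexpensive first pass runs on hundreds of candidates, while the costlier lower-throughput assessments are reserved for the shortlist.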
The Role of AI in Developability Assessments
Artificial intelligence (AI) is playing an increasingly significant role in drug discovery, and Dr. Wang underscored its potential in developability assessments. “AI-based modeling and prediction tools enable us to anticipate developability challenges early, optimizing the process even further,” he said.
Practical Impact and Case Studies
Dr. Wang discussed the webinar he gave, Developability Assessment and Optimization in Antibody Drug Discovery, in which he shared case studies that demonstrated the transformative power of early developability assessments. From rapid decision-making to optimized engineering solutions, these examples illustrated how WuXi Biologics helps clients navigate the complex landscape of biologics development.
Final Thoughts
As the biologics industry continues to evolve, developability assessments have become indispensable for reducing costs, saving time, and maintaining competitive advantages. Dr. Wang’s insights underscore the importance of integrating this approach during the discovery phase, laying a strong foundation for successful drug development.
To learn more, please listen to the full podcast and watch Dr. Wang’s webinar, Developability Assessment and Optimization in Antibody Drug Discovery, which provides a detailed guide to implementing developability assessments.
In this podcast, we spoke with Isha Dey, Senior Scientist, Cell Biology R&D, at Thermo Fisher Scientific, about the challenges researchers face in selecting appropriate cell culture conditions due to variability in cell lines, lack of standardized protocols, and inconsistent reagent quality. Thermo Fisher Scientific's new Cell Culture Select Tool was developed to address these challenges by providing specific recommendations for media, FBS, and cultureware for over 150 cell lines, backed by extensive R&D data.
Understanding the Challenges in Cell Culture Selection
Thermo Fisher Scientific's new Cell Culture Select Tool addresses a persistent challenge in laboratory science: identifying the appropriate cell culture conditions and selecting the right media, supplements, and reagents for different cell lines. The process is complicated by factors like cell line variability, lack of standardized protocols, and inconsistent reagent quality. These issues can introduce variability and impact experimental results, posing a challenge for scientists across labs.
“Different cell lines have unique requirements,” explained Isha. “It’s challenging to pinpoint optimal culture conditions due to variability in cell line responses. Additionally, there isn’t always a standardized protocol across labs or comprehensive information on specific culturing needs. This can make it difficult to select the most appropriate media, supplements, and other materials.”
Ensuring a consistent supply of high-quality products is essential for reproducibility in experiments. Thermo Fisher Scientific's trusted brands, such as Gibco, Nunc, and Invitrogen, are known for their quality, which is critical for minimizing variability in experimental readouts.
The Inspiration Behind the Cell Culture Select Tool
The idea for the Cell Culture Select Tool originated from an update to Thermo Fisher Scientific's online technical reference library. Previously, the website listed recommended media types segmented by cell line culture methods—adherent, semi-adherent, or suspension. While helpful, this list was lengthy and lacked interactive functionality.
Isha said, “We realized that we could streamline this information into a user-friendly tool. In our R&D labs, we culture over 150 cell lines using various media, supplements, and equipment. By making this data accessible to other researchers through an interactive tool, we hoped to eliminate the guesswork and enable reproducible cell culture success.”
The tool now provides recommendations for specific media, supplements, and cultureware for culturing, passaging, and freezing over 150 cell lines. With in-house data supporting 75% of these lines, researchers gain access to the resources and insights gathered from Thermo Fisher’s extensive R&D experience.
Selecting Cell Lines for the Tool
The team started with cell lines listed in their technical reference webpage and expanded the list based on the lines frequently cultured in their R&D labs. These labs conduct heavy cell culture work for various applications, including media development, fluorescence imaging, Western blotting, flow cytometry, transfection, transduction studies, and more.
“We wanted to make our R&D data available to researchers for convenience,” shared Isha. “This effort involved many scientists across R&D sites who contributed data and images showing how each cell line appears in recommended media.”
Quality and Verification in Thermo Fisher’s Labs
The tool’s data is backed by rigorous testing in Thermo Fisher’s R&D labs. Cells are grown in their respective media, culture plastics, and consumables over multiple passages to ensure accuracy. For cancer cell lines, STR profiling and mycoplasma testing are conducted regularly, while stem cell cultures are assessed for pluripotency and purity using imaging and flow cytometry.
“Representative images of cell lines, captured using our EVOS imaging system, are available in the tool to help users understand how cells should look in the recommended media,” said Isha. This visual support is particularly helpful for researchers new to specific cell lines, as it aids in verifying successful cultures and maintaining reproducibility.
Future Updates for the Tool
Thermo Fisher plans to continuously update the tool by adding new cell lines and products as they become available. Recent updates included new products like the Gibco CultureCEPT Supplement and Nunc cultureware with Nunclon Supra surface.
“The Gibco CultureCEPT Supplement, for example, reduces cellular stress and improves viability during handling and processing steps where cell damage and death could occur,” explained Isha. “Meanwhile, the Nunclon Supra surface simplifies work with complex cell lines and primary cells by enhancing attachment and morphology.”
These updates reflect Thermo Fisher’s commitment to innovation, responding to evolving research needs with products that simplify and improve cell culture work.
A Collaborative Effort with Researchers in Mind
The development of the Cell Culture Select Tool was a collaborative project involving many different teams at Thermo Fisher. The goal was to create a one-stop solution for researchers to understand cell culture requirements for any cell line, offering a resource that combines convenience and scientific rigor.
“We hope that researchers around the world find this tool useful,” said Isha. “We also welcome feedback to further improve the tool and ensure it continues to meet researchers’ needs.”
The Cell Culture Select Tool is available at thermofisher.com/cellcultureselect, where researchers can explore its features and contribute feedback for future enhancements.
Conclusion
Thermo Fisher’s Cell Culture Select Tool represents a significant advancement for researchers in the field, combining the company’s R&D expertise with an easy-to-use, interactive format. By addressing the challenges of cell culture variability and consistency, Thermo Fisher is enabling more reproducible research and helping scientists achieve better outcomes in their work.
To try the Cell Culture Select Tool for yourself, please see thermofisher.com/cellcultureselect
In this podcast, we spoke with Ryan Bernhardt, CEO of Biosero, and Jesse Mulcahy, Director and Head of Automation at Cellino, about the importance of utilizing automation in cell therapy research and production and the potential of these technologies to transform the healthcare landscape and improve patient access.
The Challenge of Accessibility in Cellular Therapy
The traditional methods of creating induced pluripotent stem cells (iPSCs) are notoriously laborious and expensive, often costing hundreds of thousands, if not millions, of dollars per patient. This high cost poses a substantial barrier to accessibility for many patients in need of personalized cell therapy treatments. Cellino is leveraging advanced automation, AI, and laser technology to dramatically redefine and improve on traditional production processes.
Advancing Automation in Cell Therapy
Cellino’s approach employs its innovative technology, known as NEBULA. This system utilizes self-contained units, referred to as cassettes, to cultivate personalized cell therapies directly in hospitals.
NEBULA uses AI to monitor cell growth while incorporating laser technology to selectively eliminate unhealthy cells. This level of automation has the potential to reduce the manufacturing costs of personalized stem cell therapies by at least tenfold, making treatments more accessible to a broader range of patients.
Supporting automation for Cellino is Biosero’s Green Button Go software suite, which plays a crucial role in automating the workflows of life science organizations. Ryan explained how their technology empowers life science organizations to automate essential scientific processes, facilitating the scheduling of workflows and direct communication with lab instruments. With the capability to run processes continuously—day or night—labs can maintain and cultivate cells without the constraints of a conventional workweek. This 24/7 operational capacity allows for the rigorous demands of cellular therapeutics to be met more efficiently.
Bridging Gaps with Integrated Automation
Ryan describes how lab automation can no longer be seen as merely robotic arms and conveyor belts; it integrates three key elements: physical, logical, and data. By orchestrating these components, automation streamlines and accelerates research across labs that were traditionally siloed and specialized in specific areas. This approach connects different labs, unifying knowledge, expertise, and data systems, enabling real-time decision-making and data-driven insights. Automation enhances workflows by eliminating delays and optimizing project timelines. It serves as a performance tool for scientists, improving efficiency, consistency, and the ability to address complex challenges, while also incorporating AI and machine learning for smarter, continuous processes.
Jesse emphasized the significance of Biosero’s orchestration software in improving efficiency by optimizing scheduling, reducing downtime, and maximizing throughput in cell therapy production. The Green Button Go orchestrator improves consistency by automating key steps and minimizing human intervention, ensuring reproducible results for quality control. The software is flexible and modular, allowing for easy adaptation of workflows as needs evolve, whether adding new instruments or changing protocols. This scalability is crucial for producing personalized cell therapies more efficiently and at a larger scale.
Addressing Pain Points and Future Trends
Despite the advancements, there are still hurdles to overcome in the biologics development landscape. Ryan notes that the field is evolving rapidly, with significant advancements in cell culturing, automation, and decision-making processes. Traditional cell culturing is being automated to assess key factors like cell viability, confluence, and other qualitative aspects, aiding decisions on feeding, splitting, and harvesting. While improvements are still needed, especially for various cell lines and techniques, the integration of cryopreservation and storage/retrieval is also advancing. Newer technologies like spheroids, organoids, and gene/protein therapies are driving automation efforts, with a focus on achieving robustness, cost-effectiveness, and higher quality outputs in life science organizations.
Looking ahead, both Ryan and Jesse envision a future of enhanced modularity and integration of AI in biologics development. As more laboratories adopt decentralized manufacturing systems, the potential for on-site production of personalized therapies will drastically reduce costs and time associated with transporting and scaling these treatments.
The integration of advanced imaging and robotics technologies will also allow for unprecedented precision in monitoring and manipulating cells, opening new avenues in cell and gene therapy.
A Bright Future for Automation in Biologics
The future of automation in biologics development appears promising, with the potential to transform the industry and enhance patient access to innovative therapies. Ryan’s parting words capture the excitement of this new era: “It’s hard to think about science being done in the absence of automation.” Meanwhile, Jesse advises organizations to approach automation strategically, starting small and focusing on areas that promise immediate benefits while ensuring team involvement to facilitate smooth transitions.
As these technologies continue to evolve, the collaboration between companies like Cellino and Biosero will be pivotal in shaping the future of healthcare, making groundbreaking treatments more accessible to patients in need.
To learn more, please see:
Cellino's website at https://www.cellinobio.com/
Biosero's website at https://biosero.com/
In this podcast, we spoke with Fran Gregory, Vice President of Emerging Therapies at Cardinal Health, about the cell and gene therapy landscape, innovative solutions to reduce cost, the regulatory environment, and reimbursement.
Fran Gregory brings extensive experience in the biologic drug sector. As a pharmacist, she has worked across various areas, including payer/PBM organizations, pharmaceutical manufacturing, and now Cardinal Health. Gregory oversees the advanced therapy solutions and biosimilars business units, which focus on cell and gene therapies and cost-saving biologics, respectively.
Cell and Gene Therapy Landscape
In the cell and gene therapy landscape, there were about 25 FDA-approved products in the U.S. at the time of the interview (now 38), split roughly into 35% cell therapies and 65% gene therapies. This field has rapidly evolved since the approval of the first CAR-T cell therapies in 2017, and the FDA continues to support innovation, with a pipeline of around 1,500 products under development. The agency aims to approve over 100 products by 2030, potentially benefiting more than 100,000 patients. Therapeutic areas include oncology, hematology, neurology, diabetes, and even conditions affecting vision and hearing.
Gregory notes the unique challenges in this field, such as conducting clinical trials with small patient populations, complex manufacturing processes, and stringent cold chain logistics for distribution. The high cost of these therapies also poses a challenge, as some treatments can cost millions. However, opportunities abound as the healthcare system innovates to improve regulatory processes, distribution methods, and patient experiences.
Reducing Cost
She explains that the high cost of cell and gene therapies is due to the intensive research, development, and manufacturing requirements, particularly for treatments targeting rare diseases. Although the upfront cost is high, these therapies can offer long-term savings by reducing ongoing medical expenses for patients. New payment models, such as outcomes-based agreements, annuities, and warranties, are being developed to increase patient access and manage costs.
One innovative approach is the Cell and Gene Therapy Access Model, inspired by President Biden’s 2022 executive order on lowering prescription costs. This model enables CMS to negotiate with manufacturers on behalf of states, enhancing access for patients and encouraging manufacturer participation. The first products under this model, aimed at sickle cell disease, are expected to launch in early 2025. Gregory expresses optimism about the future of these therapies and their potential to drive further healthcare innovation.
Regulatory Environment
The regulatory environment for cell and gene therapies is evolving quickly as the FDA is committed to expediting the market availability of these products. The agency offers pathways like accelerated approval, where manufacturers can bring products to market based on indicative clinical outcomes and continue gathering real-world evidence post-approval. The FDA’s regenerative medicine advanced therapy (RMAT) designation also addresses the small patient populations in cell and gene therapy trials, focusing less on traditional statistical significance. Looking ahead, the FDA will increasingly emphasize outcome measures and closely monitor manufacturing processes to ensure safety and efficiency as technologies evolve.
Reimbursement
Organizations like the Institute for Clinical and Economic Review (ICER) will also play a significant role in evaluating both clinical and economic outcomes, influencing pricing and reimbursement discussions between manufacturers, governments, and insurers. As the landscape grows, these evaluations will guide not only regulatory decisions but also payment models, ensuring that gene therapies offer value and affordability.
Cardinal Health is deeply involved in the commercialization of cell and gene therapies, providing end-to-end support from regulatory consulting and market access preparation to logistics and patient services. The company assists manufacturers with FDA processes and market readiness, including payer and provider engagement well before a product’s launch. Cardinal Health’s expertise in distribution, particularly with cold storage and just-in-time delivery, ensures that products reach the right place at the right time. They also offer financial risk management and patient support services, including benefits verification and travel logistics, ensuring that patients can access and adhere to their treatments seamlessly.
Biosimilars
In the biosimilars space, there are currently over 50 FDA-approved biosimilars (at the time of the interview, now 60), with a pipeline promising even more competition in coming years. Biosimilars are expected to deliver significant healthcare savings, estimated to reach over $180 billion by 2027. Cardinal Health emphasizes that optimizing the adoption of biosimilars is essential for reallocating resources to fund new, innovative therapies. Uptake has been strong in oncology but slower in other areas like immunology, mainly due to provider education gaps and pharmacy benefit manager (PBM) formularies.
Cardinal Health works to influence biosimilar adoption through provider and patient education, streamlining processes, and addressing regulatory complexities unique to the U.S. market, such as the interchangeability designation. The organization emphasizes the importance of PBM placement of biosimilars on formularies and the removal of reference products to accelerate adoption. By simplifying processes and ensuring healthcare providers understand biosimilars, Cardinal Health aims to expand access, reduce costs, and create opportunities for more innovative treatments.
Biosimilar adoption is crucial for the healthcare ecosystem, as it offers a rare opportunity to reduce costs, especially in biologics. By leveraging biosimilars, the healthcare system can reinvest savings into groundbreaking treatments like cell and gene therapies, ultimately improving patient access and outcomes while managing the economic burden.
In summary, optimizing biosimilar use and integrating innovative gene therapies are pivotal to the future of healthcare. Cardinal Health is positioned to facilitate both, ensuring the healthcare system remains sustainable and capable of accommodating new treatments as they emerge.
In this podcast, we spoke with Luca Alberici, Senior Vice President and General Manager, Milan Facility, AGC Biologics about the road to their recent EC and FDA approval to commercially manufacture Lenmeldy™ and their future plans in cell and gene therapy.
What is Lenmeldy?
We began the podcast by talking about AGC Biologics’ Milan site and their FDA approval to commercially manufacture Orchard Therapeutics’ Lenmeldy. Luca explained that Lenmeldy is a gene-modified cell therapy product for the treatment of metachromatic leukodystrophy (MLD), an ultra-rare hereditary disease characterized by an accumulation of fats that causes neurodegenerative symptoms. It is a pediatric disease, and patients generally die by the age of five.
With this therapy, the patient’s stem cells are collected and modified with a lentiviral vector to add a gene called ARSA, which encodes a functional form of the enzyme that is defective in these patients. The modified stem cells are then administered back to the patient, where they not only reconstitute the immune system but also cross-correct other cells by secreting the functional enzyme. After just a single dose of the therapy, patients’ condition improves and they develop normally, especially if treated in a pre-symptomatic phase. This is the power of gene therapy at its best.
The Pathway to FDA Approval for Commercial Manufacture
We then discussed the pathway for receiving approval to commercially manufacture this product and how the AGC Biologics Milan team navigated the process. Luca described it as quite a long journey. AGC Biologics had been manufacturing this product at the preclinical and clinical stages for roughly 15 years, working with a series of sponsors: it was developed by the San Raffaele Telethon Institute for Gene Therapy in Milan, Italy, and GSK then continued the clinical development before the product was acquired by Orchard Therapeutics. AGC Biologics remained the manufacturer throughout, and in 2020 received approval for commercial manufacture of the product in Europe. FDA requirements are different, however, so over the last two years they partnered with Orchard Therapeutics to meet the FDA requirements for approval.
Luca explained that approval required a great deal of work on the process, the analytics, the quality system, the supply chain, and raw materials. One of the most cross-cutting aspects of validating a product is getting it ready to be manufactured for the market, and it was great to go this last mile with a strong partner like Orchard Therapeutics. He also credits the infrastructure of AGC Biologics, a multi-site global organization that gave the Milan site strong support in terms of quality, standards, and procedures, and that had faced multiple FDA inspections before. The combination of all these factors carried them to the finish line; it required extensive teamwork, not only at the Milan site but across the entire organization.
The Only Site to Receive EC and FDA Commercial Manufacture Approval
I followed up by noting that, with approval from both the European Commission and the FDA, the AGC Biologics Milan facility is the only one in the cell and gene therapy industry to hold commercial manufacturing authorization from both agencies for LVV and cells. I asked Luca why so few CDMOs have achieved this and what makes the FDA and EMA approval processes so challenging. He explained that the Milan site was the first to receive clinical manufacturing approval for an ex vivo gene therapy, in 2003, 21 years ago, when cell and gene therapy was almost nonexistent. They were also the first facility to receive approval for commercial manufacturing of a marketed product in Europe, in 2015/2016, roughly 10 years ago. Now they are the only site that can do both viral vector and cell therapy, approved by the main authorities, the EMA and FDA.
Most of the time, these services are provided by separate facilities, but AGC Biologics handles everything internally. This approach dates back 28 years, when market capacity for such processes was lacking, prompting AGC to build internal capabilities for vector and cell supply across all geographies; they continue to benefit from that accumulated experience. Few contract manufacturing organizations (CMOs) are commercially authorized, and none other has been authorized for a decade and for multiple products. AGC Biologics manufactures Strimvelis and has four other filings under review (two with the EMA and two with the FDA). Additionally, they have completed three PPQ filings that are expected to be reviewed in late 2024 or early 2025.
Luca explained that there are currently only 15-20 approved cell and gene therapy products, and AGC Biologics manufactures 5-7 of them, handling about 30% of all late-stage and commercially approved assets. This is due to the newness and difficulty of cell and gene therapy, meaning only pioneering CDMOs have reached this stage. AGC Biologics is proud of their position, and customers continue to rely on them because they offer early innovation, strong manufacturing capacity for late stages, and the quality needed for commercial approval.
Keys to a Strong Partnership
Next I asked Luca what he thought made the partnership with Orchard Therapeutics so successful. He emphasized that mutual trust and transparency have been crucial. They began collaborating with Orchard Therapeutics around 2017-2018, building daily on transparency, technical expertise, and science. Orchard Therapeutics owns the clinical portion and the product but relies on AGC Biologics for the manufacturing of the viral vector and cell therapy. They work together as an integrated value chain, fitting perfectly into the process of collecting cells from a patient, processing them, and then returning them to patients.
Their partnership operates as a single team, continually innovating new solutions and processes. Transparency is key, with a "bad news first" policy to openly discuss any issues with their customers and find solutions together. This approach led to a realistic and ambitious plan that ensured they were well-prepared for regulatory inspections, resulting in their FDA approval. They aim to leverage this success for future approvals and product developments to treat more patients in need.
Advice for Developers in Cell and Gene Therapy
Regarding advice for developers working on cell and gene therapy products, Luca highlighted the importance of being realistic and pragmatic. Balancing the life-saving potential of these therapies with the need to maintain quality and effectiveness is crucial; an overly conservative approach might not be economically viable, or might be too slow to meet the needs of the therapy. Developers should rely on experienced partners who have demonstrated credibility in the market.
The field has seen significant ups and downs, often driven by unmet expectations and poorly executed ventures. Many new CDMOs have emerged, some overpromising on timelines and costs. Some developers may have assumed that cell therapy is easier than it is, or simply a copy-and-paste of other modalities, and some CDMOs found themselves with excess capacity and therefore bid aggressively on timelines and price. This has set back progress, underlining the importance of realistic expectations and of partnering with trusted, experienced organizations to navigate the complexities of cell and gene therapy development successfully. Partnerships are therefore essential, not just with CDMOs but with all players in the value chain who have proven their credibility.
The Future for AGC Biologics
I also asked what the future holds for AGC Biologics. Luca explained that in the coming year, they anticipate manufacturing 7-8 commercially approved products, significantly impacting the industry. Despite this growth, their focus remains on innovation and continuing to serve their early-stage customers, including academia, startups, and mid-sized biotechs in preclinical to Phase 1-2 stages. They are continuously innovating their proven processes in gene therapy (lentiviral and retroviral vectors) and cell therapy (stem cells and T cells), expanding their capabilities in AAV (adeno-associated virus) for in vivo gene therapy, and investing in emerging modalities like exosomes, regulatory T cells, and others. They enjoy partnering with customers and bringing them into commercial manufacturing, but they always start very early in the process, and this is where their innovation is crucial. Milan develops the platform processes and then transfers these technologies to their other sites in Colorado, USA, and their upcoming site in Yokohama, Japan.
AGC aims to establish manufacturing sites for cell therapy across three geographies (Europe, US, Japan) and for viral vectors in Europe and the US, with Milan as the innovation center. This strategy enhances their platform offerings and geographical reach, driving forward their comprehensive and strategic approach in the industry.
I closed the interview by asking if Luca had anything else he would like to add. He emphasized that AGC is more than just a manufacturer; they are a Contract Development Organization (CDO) focused on development. While manufacturing is a significant part of their operations and they excel at it, their true value lies in their development capabilities.
They can work with mature processes, adding value through suggestions, modifications, and scaling up. However, their strength is supporting clients with less mature processes. With 28 years of experience primarily in development, they have created processes from scratch for various cell types and developed comprehensive analytics. They have conducted over 160 tests, many of which are validated for market readiness.
This panel discussion was originally published in the eBook “Monoclonal Antibody Manufacturing Trends, Challenges, and Analytical Solutions to Eliminate Bioprocessing Bottlenecks”. You can download all the articles in the series by downloading the eBook.
Panel discussion members:
Carrie Mason – Associate Director, R&D at Lonza Biologics
Laura Madia – Independent Industry Consultant
Alan Opper – Director of HaLCon Sales at RedShiftBio
David Sloan, PhD – Senior Vice President, Life Sciences at RedShiftBio
Brandy Sargent – Editor-in-Chief, Cell Culture Dish and Downstream Column (Moderator)
In this panel discussion, we talked with industry experts about antibody process development and manufacturing, focusing on current antibody titer expectations, analytical challenges, and how real-time titer measurement is a game changer for bioproduction moving forward.
Where is the industry at today with titer expectations and what are the best practices for measuring titer?
Laura Madia
Over the years, we have seen a need for increased titer in upstream drug development. As an industry, we have moved from the 1980s, when titers were closer to 0.2 to 0.5 grams per liter, to the early 2000s, when titers rose to 3 to 5 grams per liter. Today we see a continued increase in titer concentrations, which creates a challenge: making sure you have technologies that can accurately measure titer concentration without introducing errors.
The other thing we have seen within the industry is the need for more data, not only to understand what is happening in the tank, but also to be able to make decisions about the product as the process is running, or shortly after.
Lastly, it is important to consider people and resources. Exacerbated by COVID, it has become difficult to find people to work in the industry, and there are fewer people within a production suite. This has helped drive the need for online and remote monitoring and for automation that makes it easier to get the necessary measurements.
David Sloan
To follow up on the lack of workers, one of the things we constantly hear from the customers we are working with is that training employees can be a real challenge and a very time-intensive process. Technologies that are easier to use and require less expertise help get people up and running and minimize errors amongst new users of a technology.
Laura Madia
As for current best practices for measuring titer, HPLC is the gold standard. But HPLC presents challenges, including training: it requires a highly skilled operator to get accurate results. There is a need for something simple and easy to use when it comes to measuring titer. You will still need HPLC results for approval and final decisions, but being able to monitor titer throughout the process is important.
What are the challenges associated with the way that titer is measured today and what can we do as an industry to improve?
Laura Madia
One of the challenges is that most of the assays available today are batch processes, which lends itself to a retrospective look and means that most people don’t run samples throughout the process. Most people save these tests until the end, when they can run a batch and make it more cost effective, and the time to result is typically so long that running the assay during the process isn’t helpful. Today’s systems are built for batch processing and are not set up for at-line measurement, unless you are lucky enough to have an HPLC dedicated to that tank.
Another challenge is speed and accuracy. Many of today’s offline techniques are longer assays because they’re run as a batch; you must wait for the entire batch, which means a long time to first result. That is where a system like the HaLCon helps: you can move it online and perform a simple one or two injections to get the concentration. Being able to move measurements online and closer to the tank with an easy-to-use system is critical.
David Sloan
One of the challenges with some technologies is that they simply don’t have the dynamic range required to cover from the low concentrations you see early in the tank to the high concentrations you’re getting later, or at the end of the production run. With a technology like HaLCon, you can measure from concentrations as low as 0.1 grams per liter all the way up to 8-10 grams per liter without needing any dilutions or serial dilutions.
What it really comes down to is accuracy: how closely does the technique agree with HPLC, and does it have the dynamic range required to cover the whole concentration range without multiple dilutions? As soon as you bring dilutions into the equation, you increase the chance of error and the sample-to-sample, run-to-run, and user-to-user variability. If a technique is user friendly, and better still automated to some extent, you minimize the human aspect and the potential error the lab analyst brings to the assay, providing a more reliable and reproducible result.
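The point about dilutions inflating error can be sketched numerically. The following is an illustrative calculation, not vendor code: the function name, the 2% assay CV, and the 3% per-dilution CV are assumptions chosen for the example, and errors are assumed to combine in quadrature.

```python
# Illustrative sketch: how serial dilutions inflate titer-measurement
# error. Assumes each pipetting step adds an independent relative error
# that combines in quadrature with the assay's own error (invented CVs).
import math

def total_relative_error(assay_cv: float, dilution_cv: float, n_dilutions: int) -> float:
    """Combined coefficient of variation after n serial dilution steps."""
    return math.sqrt(assay_cv**2 + n_dilutions * dilution_cv**2)

# Direct measurement across the full dynamic range: no dilutions needed.
direct = total_relative_error(assay_cv=0.02, dilution_cv=0.03, n_dilutions=0)

# High-titer sample diluted twice to fit a narrow-range assay.
diluted = total_relative_error(assay_cv=0.02, dilution_cv=0.03, n_dilutions=2)

print(f"direct:  {direct:.3f}")   # 2% CV
print(f"diluted: {diluted:.3f}")  # ~4.7% CV
```

Under these assumed numbers, two dilution steps more than double the overall variability, which is the panel's argument for a wide dynamic range.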
Alan Opper
I typically see that the instruments out there, except for HPLC, are not fit for purpose; they are used for a lot of different things. For example, some instruments can only measure titer after it’s purified, not prior to purification or in the upstream lab. Other instruments analyze dozens of metabolites and can measure titer, but they use methods such as turbidity, which is not accurate across the entire production run; results can be anywhere from 10 to 30% off the expected value.
Second, retrospective analysis. Why are we doing retrospective analysis? Why are people pulling samples at the end of the run to send to an analytical lab? Because the current methods besides HPLC, which takes a long time, are not very accurate, and end users don’t trust the results enough to make process-control decisions based on them.
Lastly, HPLC is a wonderful test, but it is complex to run. It’s often in a separate analytical lab, which can take anywhere from hours to days to weeks, depending on the company and whether it’s production or development. HPLC also requires a very well trained, well versed operator. The HaLCon, by contrast, is a small instrument that sits directly in the lab and takes 5 minutes to run using Protein A liquid chromatography. It is very accurate, and you don’t need to batch samples: you can take daily samples and rest assured that the results are accurate and reproducible.
What benefits can be gained from real time titer measurement?
Carrie Mason
In the development laboratory, a lot of the time you’ll do a batch study for bioreactors: you’re running small-scale bioreactors to optimize the process and taking samples. The process may run 10-15 days, and then you batch all the samples together and send them off for results.
One of the exciting aspects of real-time titer measurement with the HaLCon is that it really simplifies the end use: it allows the bioreactor operator to run a test on the HaLCon as easily as any of the other methodologies they use for monitoring a bioreactor. You get titer results within the day, so the researcher has an actual snapshot of what’s happening. They can start charting that information and see what their titer looks like in relation to all the other bioreactor parameters. That’s very powerful, because instead of having to wait until the end, when you gather up all your results and see whether bioreactor one’s conditions were better than bioreactor two’s, you can start understanding what’s happening and speed up the time to decide which conditions are best.
In a perfect world, you can use this information to start making changes in your bioreactors, tweaking media feeds or other conditions to get optimal titer. In the past, you were extrapolating from cell growth, assuming the best cell growth gave you the best titer, but that’s not always what we see in a bioreactor. So, in the development lab, this gives a lot of power to the end user, allowing them to see insights into their process and make faster decisions.
Looking at continuous manufacturing, you are not going to have the liberty of waiting for an HPLC result because your process is running continuously. We will have to have controls within that process to ensure we’re running within the set points we want the process to execute in. With technology like the HaLCon and its fast results, you could have the instrument sitting next to your continuous process and take small samples across a day. This would ensure that what is coming out of your continuous upstream process, and going into your downstream process, is exactly what you think it is, giving good process control for continuous manufacturing.
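The set-point idea described above can be sketched in a few lines. This is a hypothetical illustration, not HaLCon or process-control software: the function name, the set-point window, and the sample values are all invented for the example.

```python
# Hypothetical sketch of checking at-line titer samples from a continuous
# process against predefined set points (all values invented).

def within_set_points(titer_g_per_l: float, low: float, high: float) -> bool:
    """True if a titer sample falls inside the process set-point window."""
    return low <= titer_g_per_l <= high

# Small samples taken across a day from the continuous upstream process.
samples = [4.8, 5.1, 5.0, 6.3, 4.9]
flags = [within_set_points(s, low=4.5, high=5.5) for s in samples]
print(flags)  # [True, True, True, False, True]
```

A sample flagged `False` would prompt an adjustment before out-of-spec material reaches the downstream process.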
Alan Opper
Real-time monitoring and real-time data can lead to much more accurate, even real-time, process adjustments. They also increase your process understanding, which in development can save a tremendous amount of time, resources, and money in bringing the product to market.
On the flip side, when you get into CGMP production, typically they’ll close down the bioreactor and then take the sample and measure titer. Based on the titer, they will properly load their columns, because Protein A resin can be quite expensive. With the HaLCon,
In this podcast, we spoke with Nainesh Shah, Senior Application Engineer at Asahi Kasei Bioprocess about buffer management including the benefits of inline buffer formulation, and single use inline buffer formulation systems.
Buffer Management
We started the podcast by talking about how critical buffer management is to bioprocessing. Mr. Shah discussed how buffers are required in large quantities during biomanufacturing and how, traditionally, buffers were made in large tanks, stored, and used as needed. Now, however, real estate in the bioprocessing industry is at a premium, and companies are looking to new technologies that can reduce facility footprint. For buffer management, it makes sense to create buffer on demand and shrink the footprint that was previously dedicated to buffer production.
Inline buffer formulation is a hot topic among companies that require large quantities of buffer because it provides a way to create buffer on demand in a much smaller footprint. Interestingly, it is now a hot topic among small, R&D-scale buffer users as well. Inline buffer formulation systems are ideal for users who need 200 to 500 liters of buffer at a time: the system takes the concentrate and adds clean water to provide just the right amount of buffer on demand. Another benefit of inline buffer formulation is a quick process changeover; you can move on to the next buffer formulation without spending valuable time cleaning the tank, taking samples, and readjusting critical parameters.
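The concentrate-plus-water blending described above follows the standard dilution relation C1·V1 = C2·V2. The sketch below is purely illustrative; the function name and the 10x/500 L example are assumptions, not AKBA specifications.

```python
# Illustrative dilution math for inline buffer formulation: given a
# buffer concentrate and a target strength, compute how much concentrate
# and clean water to blend for a batch (example values, not AKBA specs).

def blend_ratio(conc_stock: float, conc_target: float, volume_out: float):
    """Return (concentrate_volume, water_volume) via C1*V1 = C2*V2."""
    if conc_target > conc_stock:
        raise ValueError("target concentration exceeds the concentrate")
    v_conc = conc_target * volume_out / conc_stock
    return v_conc, volume_out - v_conc

# A 10x concentrate diluted to 1x working strength, 500 L made on demand.
v_conc, v_water = blend_ratio(conc_stock=10.0, conc_target=1.0, volume_out=500.0)
print(v_conc, v_water)  # 50.0 450.0
```

In a real system these ratios are maintained dynamically by the pumps rather than computed once per batch, which is what lets the buffer be made on demand.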
Recently, new manufacturers, whether large or small scale, have tended to move into this style of buffer management by operating one or two Inline Buffer Formulation (IBF) systems like the MOTIV™, using them to make all the buffers needed for their various processes.
The MOTIV Family of Inline Buffer Formulation Systems
Next, I asked Nainesh if he could talk a bit more about the MOTIV family of inline buffer formulation and fluid management systems that Asahi Kasei Bioprocess America (AKBA) offers. He explained how the award-winning MOTIV family has evolved into a series of inline buffer formulation systems designed to help companies move past downstream bottlenecks by driving buffer productivity. The product family includes 3-pump, 5-pump, and custom IBF configurations that can fit almost any space, cost, or performance requirement. The MOTIV is a leader in buffer production, with scales ranging from 10 liters per minute up to 4,500 liters per hour to fit an entire range of volume requirements.
He went on to say that they have added a new feature where MOTIV can fill up bags with buffer and monitor the quantity in the bag to make buffer on demand even easier.
MOTIV SU
Then we talked about the new MOTIV SU, a single-use inline buffer formulation system built to produce complex buffers on demand, effectively and efficiently, all from one pump head and without the need for CIP/SIP procedures between batches. The innovative design modulates flow through control valves while simultaneously integrating and mixing buffer solutions. As with all MOTIV systems, OCELOT System Control ensures precise blends every time, controlled by pH and conductivity feedback or by flow.
The MOTIV SU is perfect for a biomanufacturer who does not want to spend time on cleaning and validation. It suits one-time use because no time is spent cleaning, validating, and verifying that the system is free of contaminants and residual buffers that could harm the next process. It is also a good fit when a biomanufacturer uses a buffer containing a chemical or ingredient that would be problematic for other processes and wants to eliminate any risk of cross-contamination.
Since the MOTIV SU has replaceable parts, which come as a pre-built unit, it is easy to replace the components and then the system is ready to run again. The MOTIV SU is on the same platform as the MOTIV and is an ideal solution for companies that don't want to disrupt their regular production lines, but still want to use the same production criteria that in the future could be added to production lines.
Nainesh went on to talk about why single use is such an important option. So much time, energy, and money are spent on cleaning and on validation, which sometimes must be reverified or revalidated later, that single use becomes a good option. If the process permits, a single-use assembly can be used for a long duration, provided you have backup data showing you are not introducing contamination. If you're using the same material repeatedly, it is still single use, a replaceable part, but that doesn't mean it is used once and then gone. It is just like the bioprocessing industry's use of 50-liter and 100-liter bags for holding buffers, which are all single use; this follows along the same line.
Single use versus stainless steel
Next he walked through when you would choose a single-use option versus a stainless steel option. He explained that a primary consideration is whether you are using anything corrosive. If a company is using something in their process and is not sure whether it is corrosive, with no documentation or test reports to show whether stainless steel will hold up, and they don't want to damage stainless steel equipment, it is good to have a single-use option as a failsafe. They can start bioprocessing studies in the single-use system and, in parallel, run studies on stainless steel to check for corrosion. If, after six months, they find no corrosion issues, they can switch to a stainless steel option. Another reason to go with single use is coloration: if a chemist or bioprocessing person knows a material will impart color or affect the stainless steel from the inside, that again is a quick argument for the single-use option.
Another area where single use is a good option is if you are considering whether to invest and want to prove that your technology works. You could start with the single use and then when you have a more permanent situation you can move into stainless steel.
I followed up by asking if there are any striking differences between the single-use and stainless steel options. He explained that a few things must be considered with a single-use system. First, because it uses only one pump, which also acts as the mixer and creates the suction force that draws the concentrates in, there are limits on the maximum and minimum flows and concentrations you can achieve. It is not good to use very concentrated material and then dilute it with only a small quantity per minute; that is a limitation the user should consider.
The MOTIV SU works on the same principle of a mixing feedback control loop for pH and conductivity, or for flow, and it can control on all three parameters. The user can enter tolerance values defining what is and is not acceptable. When the buffer is out of specification, the system diverts it to a waste outlet; when it comes back into specification, the system reverts the buffer to the product outlet or other designated outlet. Collection of the buffer then starts, and the system monitors how much has been collected. The user can define and set all these parameters in their method, which is easy and intuitive; a quick 15-20 minute walkthrough can get a team up and running.
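The divert-to-waste behavior described here can be sketched as a simple routing decision. This is an invented illustration, not OCELOT code: the function name, the tolerance windows, and the sensor readings are all assumptions for the example.

```python
# Minimal sketch (invented names, not OCELOT code) of divert-to-waste
# routing: while pH or conductivity is outside the user's tolerance
# window, buffer goes to waste; once back in spec, flow reverts to the
# product outlet.

def route(ph: float, cond: float, ph_win: tuple, cond_win: tuple) -> str:
    """Return 'product' when both readings are in tolerance, else 'waste'."""
    in_spec = ph_win[0] <= ph <= ph_win[1] and cond_win[0] <= cond <= cond_win[1]
    return "product" if in_spec else "waste"

# Successive readings as the blend comes into specification.
readings = [(6.8, 14.0), (7.1, 15.2), (7.2, 15.1)]  # (pH, conductivity mS/cm)
routes = [route(ph, c, ph_win=(7.0, 7.4), cond_win=(14.5, 15.5)) for ph, c in readings]
print(routes)  # ['waste', 'product', 'product']
```

The real system layers volume tracking on top of this, counting collected buffer only while flow is routed to the product outlet.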
Set up and operation
Next, I asked if he could talk specifically about the operating system for the unit, and whether an end user who wants to create a new buffer needs to come to AKBA. He said that the operating system runs on a Windows-based PC housed inside the electrical enclosure, with Allen-Bradley controls for all the control mechanisms. The user does not have to contact or work with AKBA to add or modify a buffer: they can make their own buffers, and AKBA teaches them how to build methods via a drop-down menu with fill-in-the-blank fields. They can then run the method, and the system is ready to go. Customers can always come to AKBA if they're having any issues with making a buffer, and AKBA is always there to help guide them through.
Lastly, I asked what setup is like for end users interested in implementing the system. He explained that the regular MOTIV and the MOTIV SU are all skid units mounted on wheels that can easily be wheeled in and out of an area as needed. As he said earlier, real estate is very important in the bioprocessing industry, and rooms are used for multiple purposes. Thanks to the mobility of the MOTIV family, the user simply connects an electrical supply of 110, 220, or 230 volts (in some higher-flow-rate cases a three-phase supply may be needed, but these are all plug-in connections), then connects the air supply that actuates the valves, then connects the concentrate containers, which can be bags, 1-cubic-meter totes, drums, etc. Lastly, they direct where they want the buffer to go, whether to a container sitting right next to the system or to a port that transfers the buffer into another room. That's it: very easy, with no special tools required for starting the system up or shutting it down. The floor-space requirement is about that of a typical office table.
It has a small HMI on top for monitoring how the process is moving, and all the pertinent data is displayed there. If there are alarms, it will show the user right there what is happening so the operator can take care of it right away.
For more information,
In this podcast, we spoke with Margherita Neri, Director of Vector Process Development, Milan Site at AGC Biologics, Andrew Laskowski, Global Product Manager Bioreactors at Cytiva and Andreia Pedregal, Upstream Applications Specialist Manager at Cytiva about large-scale viral vector manufacturing. Our conversation included discussions around scalability, AAV (adeno-associated virus) and lentivirus production platforms, adherent culture, and next generation bioreactor improvements.
I began the interview by asking Margherita about her work at AGC Biologics. She explained that as the Director of the Vector Process Development Unit, her team is responsible for process development of large scale viral vector production for gene therapy applications. Her team is also the first point of contact for new clients.
Next, we talked about the types of viral vector platforms that AGC Biologics operates. Margherita described that at their Milan site, they offer AAV (adeno associated virus) and lentiviral vector production platforms in adhesion and in suspension, at 50-to-200-liter scale with expansion planned for up to 1,000 liters.
I then asked her about some of the differences between adherent cell culture and suspension cell culture paths to commercial manufacturing. Margherita said that the first consideration is that most clinical trials in gene therapy have been sustained with vector produced from adherent cells, typically via processes performed using Cell Factory™ or Cell STACK®. Now that those gene therapy products are being commercialized, manufacturers need to increase scale and demonstrate comparability using a minimal comparability exercise. So, systems that allow adherent scale up are very useful in this process.
Suspension processes are appealing from a scalability point of view because historically they were used for traditional protein bioproductions which can be scaled up to 20,000 – 30,000 liters. Of course, this scale still needs to be demonstrated for vector production that is performed mainly using transient transfection at 200-500 liter scale for lentivirus and between 500-to-1,000-liter maximum scale for AAV. Margherita went on to say that another important aspect in comparability between adherent and suspension systems is quality of the vector in terms of impurity profiles. She said that with adherent processes, cells are attached to the growth support, and the levels of host cell protein and cell DNA are lower when compared to suspension processes. This is very important for lentiviral vector production that is used in vivo where the requirements for impurity levels are very challenging, especially considering that for lentiviral vectors there is currently no affinity step for purification.
I followed up by asking her how AGC Biologics can help customers that want to stay in adherent culture to scale up from current processes, for instance, from flatware to larger-scale production. She explained that when customers ask for a scale increase, they usually offer the iCELLis™ platform. First, they demonstrate at small scale the feasibility of the transition from flatware to the iCELLis bioreactor using the iCELLis Nano bioreactor. Using the iCELLis Nano bioreactor, AGC Biologics has developed a full upstream and downstream process that is highly representative of their process using the full-scale iCELLis bioreactor.
AGC Biologics can then propose that customers use the vector produced in the iCELLis scale-down model to perform a comparability study between a clinical vector and the future commercial or large-scale vector. This comparability should be based not only on the comparison of titers, residuals, and all the CQA, but also AGC Biologics suggests performing a test of cell transduction on the target cells (i.e. CD34 or T cells) and evaluation on these cells of transfection efficiency – vector copy number, residuals and functionality.
I followed up by asking Margherita whether the iCELLis system was scalable. She said that yes, in their experience the system is scalable, with some work required to perform the scale-up from the iCELLis Nano bioreactor to the iCELLis 500+ bioreactor. They did a lot of experiments to verify the scalability from the iCELLis Nano to the iCELLis 500+ bioreactor, and after these preliminary studies the productivity data in the full-scale runs was very comparable to the runs obtained in the iCELLis Nano.
Next, I asked Andrew about any recent improvements that had been made to the iCELLis bioreactor. He explained that the iCELLis bioreactors have been commercially available for a little over ten years, and the company has launched new products and improvements to this family on an almost annual basis. For the iCELLis 500 bioreactor specifically, they launched a significant hardware upgrade in 2017, the iCELLis 500+, which allows the system to be implemented into GMP clean rooms much more easily. They have also continually upgraded the single-use system (the vessel) for the iCELLis 500, beginning with genderless sterile Presto™ connectors on the vessel, also in 2017. In 2019, they integrated more stable and reliable single-use sensors, and most recently, in 2021, they launched an improved and more robust iCELLis 500+ bioreactor called the Generation R vessel.
For the iCELLis Nano, they launched the mPath™ control tower with a new docking station and Pall Link software back in 2018. Then they launched the second-generation vessel and weldable manifolds in 2020. They recently launched mPath Link software, which is their second-generation control software. It includes new features such as enhanced multiphase recipes, a mobile phone application that allows monitoring and control of the bioreactor from the convenience of your phone, as well as the ability to receive e-mail or text message alarm notifications remotely. Finally, they have validated the bioreactor to a higher standard of electromagnetic compatibility. The system has immunity against more severe electromagnetic interference events, which makes it a more stable and reliable system.
AGC Biologics was one of the first companies to receive the electromagnetic compatibility upgrade, so I asked Margherita about the improvements. She said that with the previous version of the iCELLis Nano they experienced some electrical issues that affected some runs, but after the upgrade they didn't experience any more events, and the upgrade improved their successful batch completion rate by over ten percent.
Next, I asked Margherita if she had any data that she could share on how the scaled-down iCELLis Nano is representative of the results in the iCELLis 500. She shared that they have completed more than 100 runs in the iCELLis Nano and about 10 runs in the iCELLis 500, both with LV and AAV vectors. The process development work was performed with GFP transgene as the gene of interest. They obtained very comparable titers in the iCELLis Nano and the iCELLis 500 in the bulk vector. For example, for AAV they used the 333 square meter packed-bed and they obtained 7×10 vector genomes per ml (AAV6) and 1×10 VG/mL (AAV8). For lentiviral vector production, they used the 133 square meter packed-bed and they produced an average of 4×10 per ml. This data was confirmed with different genes of interest from different clients.
Next, I asked Margherita what she found to be the biggest benefits of working with the iCELLis portfolio. She said that it was the first instrument that allowed them to scale up their gold standard process, previously run in Cell Factory vessels. They were able to scale up to 133 square meters very consistently, exploiting efficient control of gases and temperature that was of course absent in the traditional Cell Factory format. The iCELLis system also allowed them to move to a more compact and closed system suitable for GMP requirements, which is particularly important with a commercial product. She also emphasized that they have had a very good experience with Pall Life Sciences (now Cytiva) technical support, which helped them transition from the Cell Factory to the iCELLis bioreactor.
I followed up by asking Margherita about specific examples of the technical support she received. She said that she can give very good feedback for the technical support. From a scientific and engineering point of view, they were supported in the development of their platforms, in particular by Andreia who helped them understand the instrument and provided local technical support in Italy. This was a key factor in ensuring the speed and readiness of intervention in case there were any issues with the instrument.
Since Andreia was part of our panel, it was the perfect opportunity to ask her about the kind of support that she provides for iCELLis system customers overall. Andreia started by saying that she was representing the worldwide upstream field application team at Pall Life Sciences (now Cytiva), a team of scientists with bioreactor and bioprocess expertise. Their mission is to help customers successfully run their processes with iCELLis technology. The team listens to their customers to understand the process that has been developed. They then translate that process or help design a process that will be transferred to the iCELLis system. Then they come to the client site and conduct operator training to execute that experimental plan. She added that when they finish that training, the collaboration is only beginning as they plan follow up meetings to discuss the data generated and discuss process optimization and process scale-up.
I then asked Margherita about how they have integrated the iCELLis Nano and iCELLis 500+ bioreactors into their manufacturing platform. She explained that they have the AAV platform with the 333 square meter packed-bed that allowed them to obtain 480 liters of viral vector.
In this podcast, we talked with Dr. Ma Sha, Head of Bioprocess Applications at Eppendorf SE about advancements and challenges in cell and gene therapy production along with solutions for scale up and transition to stirred-tank bioreactor suspension culture.
We began the interview by talking about the biggest advancements in cell and gene therapy, including CAR T-cell therapy development, clinical results, and FDA approvals. Another area of great advancement is induced pluripotent stem cell (iPSC) culture technique. Dr. Sha explained that it used to be very difficult to culture iPSCs until it became possible to culture iPSC suspension spheres in stirred-tank bioreactors, which was a big breakthrough in the cell and gene therapy area.
I followed up by asking Dr. Sha what he sees as the major challenges in the development and production of cell and gene therapies that still need to be addressed. He said that one of the major breakthroughs has been the autologous therapies that have been approved, particularly CAR T-cell therapies. However, this has also been a major challenge, because the autologous model is not cost effective. As a result, there has been a shift toward developing allogeneic therapies, and building this production model will be a major challenge moving forward. Another challenge on the manufacturing side is ensuring compliance with Good Manufacturing Practice (GMP), which has been a common request from cell and gene therapy companies.
Next, I asked him about the move from 2D to 3D culture and his experiences with this transition. Dr. Sha shared that several of the projects that Eppendorf bioprocess works on start as 2D culture in flasks, a natural place to start for most cell lines since they are adherent cells. They must then be converted into suspension culture to enable 3D culture, since 2D culture significantly limits yield and productivity. He went on to say that if you look back at the evolution of antibody production, it was important to convert production to suspension cell culture, and this is also necessary for the cell and gene therapy field. Moving from 2D to 3D culture, especially using stirred-tank bioreactors, enables much higher yields. As it stands, the yield for cellular therapy cell production is fairly low, especially compared to the industry standard of CHO cells used for antibody production, so a lot of improvement needs to happen.
We then discussed stirred-tank bioreactors and their increased use in cell and gene therapy development and production. I asked Dr. Sha what key factors developers should consider when choosing stirred-tank bioreactors. He explained that stirred-tank bioreactors fit the model of allogeneic production; autologous models are not suitable for them. Developers need to keep in mind that if they want to move to a stirred-tank bioreactor platform, the production model needs to be allogeneic. In addition, it is important to consider the support available for scaling up and to leverage supplier experience. For example, Eppendorf bioprocess has over the years produced many application notes to help customers scale their manufacturing, and has even built model production systems in Eppendorf stirred-tank bioreactors. Their program called "Scale up Assist" allows customers to skip much of the difficult calculation required to achieve reproducible yields when moving from smaller to larger vessels.
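The details of Eppendorf's Scale up Assist program aren't described here, but the kind of calculation it spares users can be illustrated with a classic scale-up criterion: holding power input per volume (P/V) constant across geometrically similar vessels. The vessel dimensions and speeds below are made-up illustrative values.

```python
# Sketch of a common stirred-tank scale-up rule (not necessarily the
# one Scale up Assist uses): keep power input per volume constant.
# With geometric similarity, P ~ N^3 * D^5 and V ~ D^3, so
# N^3 * D^2 stays constant, giving N2 = N1 * (D1 / D2) ** (2/3).
# Impeller diameters and speeds below are illustrative assumptions.

def scale_up_speed(n1_rpm, d1_m, d2_m):
    """Agitation speed at the larger scale for constant P/V."""
    return n1_rpm * (d1_m / d2_m) ** (2.0 / 3.0)

# Example: a bench vessel with a 0.05 m impeller at 200 rpm,
# scaled to a vessel with a 0.15 m impeller.
n2 = scale_up_speed(200.0, 0.05, 0.15)
print(round(n2, 1))  # ~96.1 rpm
```

Other criteria (constant tip speed, constant mixing time, constant kLa) give different answers, which is exactly why vendor guidance and application notes are valuable when choosing which parameter to hold constant.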
Eppendorf has a very long history of working in protein-based therapeutic cell culture production and about ten years ago expanded to include cell and gene therapy. I asked Dr. Sha in his experiences, what are the most important takeaways in terms of areas that still need work and advancements on the horizon. For instance, what can we learn from protein-based therapy cell culture to apply to cell and gene therapy production? He said that he thought that allogeneic production is a great lesson learned because simplicity is king. The ability to produce one therapy for different patients from one source cell line would greatly simplify the workflow and cost model. He stressed that the one takeaway that must happen for the field to become more successful would be the allogeneic model. This model also fits perfectly with stirred-tank bioreactors. Another takeaway for stem cell therapy would be the need for more success stories, including more FDA approvals to build successful examples so companies can follow the same path. There are a lot of cell therapies in clinical trials, but the number of approved therapies is small, especially when compared to CAR T-cell therapy, for example.
I closed the interview by asking Dr. Sha if he had anything else to add. He said that in addition to Eppendorf's long history with stirred-tank bioreactors, they have acquired two bioprocess companies, previously known as New Brunswick Scientific and DASGIP GmbH, adding to their significant history and expertise in bioprocessing. In addition, Eppendorf has a lab in the United States and one in Germany where they are developing sophisticated cell and gene therapy applications, including iPSC suspension culture in their SciVario® twin bioprocess controller. Using this tool, they have developed a very high-density mesenchymal stem cell culture of around 500,000 to 1,000,000 cells per mL. In an upcoming application note, they will showcase their extremely high-yield 15 million cells per mL bioreactor solution. Eppendorf is committed to developing cutting-edge applications to support customers in the cell and gene therapy space.
To learn more, please see:
Eppendorf Bioprocess
Application notes links:
Stem Cell Exosome Production on the SciVario® twin, a Flexible Controller for Your Bioprocess Needs
The DASbox ® Mini Bioreactor System as a Tool for Process Development And Stem-Cell Derived Exosome Production in Standardized Culture Conditions
In this podcast, we talked with Nathalie Dubois-Stringfellow, Senior Vice President of Product Development and Management at Sangamo about Sangamo’s work in gene therapy and the latest data on Sangamo’s gene therapy product candidate for Fabry disease.
I began the interview by asking Nathalie if she could talk about Sangamo and the company’s pipeline. She explained that Sangamo is a genomic medicine company dedicated to translating groundbreaking science into medicine. Their technology includes gene therapy, genome editing, and cell therapy.
Their zinc finger nuclease platform allows them to edit genes by adding genes, deleting genes, repairing mutations, repressing the expression of a gene, or activating it. It is a versatile technology that they can apply to a variety of diseases.
Using their breakthrough technology, they were the first to edit human genes, treat patients with gene edited T cells, treat patients with in vivo genome editing, and treat patients with engineered T cells.
Their current clinical focus is on Fabry disease, a rare genetic disease, as well as Hemophilia A and sickle cell disease.
She then described their recent clinical data on ST-920, a gene therapy product candidate for Fabry disease, which continues to be generally well tolerated and shows sustained α-Gal A activity based on data from nine patients.
She said that they were extremely excited about the results of this Phase I-II clinical trial. Fabry disease is an inherited disorder caused by mutations in the galactosidase alpha (GLA) gene, which lead to deficient alpha-galactosidase A (α-Gal A) enzyme activity. This enzyme normally breaks down a fatty substance called globotriaosylceramide; without it, this fatty substance builds up in cells throughout the body, particularly in the skin, kidneys, heart, and nervous system.
The current standard of care for Fabry disease is an intravenous infusion of the missing enzyme, a treatment called enzyme replacement therapy, or ERT. This provides a high concentration of the missing enzyme for a very short time, and the treatment has to be repeated in those patients every two weeks. It's a very cumbersome infusion that can take several hours and typically needs to be done in the hospital, thus negatively impacting patient quality of life.
Sangamo's approach is a one-time therapy in which the gene for the missing enzyme is delivered to the liver cells of the patient, which then act as a cell factory producing the missing enzyme.
Please listen to our full interview using the player above, or download it via Apple Podcasts, Spotify, and more.
In this podcast, we talked with Dennis Hodgson and Phil Sanders from Agilitech about the benefits of single-use mixers, dealing with supply chain concerns, ensuring scalability, and tailoring a mixer to meet specific process needs.
Benefits of Single-use Mixers
We began the podcast by talking about the overall benefits of single-use technologies for mixing. Dennis explained that single-use mixers are very versatile and can be used to replace stainless steel vessels within the manufacturing area. Single-use mixers share the same advantages as other single-use components, such as arriving fully sterile and eliminating the need for steam-in-place and clean-in-place.
Dennis went on to say that another big advantage that single-use mixers have over stainless steel is the ability to customize. For example, a 500 L single-use mixer can be used with a virtually unlimited array of customized vessel configurations, including inlet and outlet port configurations, sampling ports, vent filters, and various process analytics that can be added.
Next, we talked about adoption of single-use technology for mixing and possible concerns that customers might have. Dennis shared that a big concern recently has been supply chain shortages that have created limited availability and long lead times for single-use consumables. He said that he has heard from some clients that they have had to skip planned production batches because the single use bags that they needed to process the batch were not available. Phil added that supply chain concerns have caused some of their clients to think about moving to stainless steel systems to avoid any production delays.
Single-use Technologies Supply Chain Challenges
I followed up by asking what could be done to address single use supply chain issues moving forward. Dennis explained that Agilitech has the luxury of not being tied to any one supplier, so they can source from multiple vendors. This allows them the flexibility to move between vendors and load projects based on their capacity and lead times. This also allows them to make sure that they are offering competitive pricing because vendors know that they're not the sole source of a component.
Ensuring Flexibility in Single-use Mixing
We then talked about mixers presenting unique challenges in that they are used for a variety of applications with many different demands. I asked how Agilitech can ensure that their single-use mixer has the flexibility needed for multiple applications. Dennis explained that because Agilitech isn’t tied to a single design, they are able to have conversations with the client to customize a solution for their needs. Their main goal is to make a product that meets the needs of the individual companies and their process. Additionally, they design their systems purposefully to handle many different capabilities such as sampling, analytical measurements, weight measurements, temperature control, etc. Because they use standard control hardware, their mixing vessels can easily be integrated into existing control systems such as Delta V or Wonderware through the available Ethernet IP connection. This allows users to read and write to certain control parameters.
I then asked which options are available for customization on the single-use mixers. Dennis said that they can customize all the inlet and outlet ports with regard to port size, tubing length, connector type, etc. As for the mixing units themselves, they can be jacketed or not, have load cells or not, and carry probe analytics such as pH, conductivity, temperature, dissolved oxygen (DO), and optical density, so all those different analytical devices can be incorporated as well.
Phil added that if an organization has specific standards for which control systems need to be installed, Agilitech is flexible with Rockwell, Delta V, Siemens, and all the major platforms that customers might need.
I followed up by asking about how these customization options affect the cost of these mixer systems. Phil said that frequently when customers hear the word customization or even tailored, they think that is going to drive the price up or maybe even drive the delivery time out. When Agilitech tailors a solution, it speeds up the delivery time and reduces the cost because everyone is on the same page and has alignment on what will be delivered. Agilitech works through all the issues up front with clients, so customers get exactly what they’ve asked for.
Single-use Mixer Scalability
We then discussed scalability and how low-volume mixers are difficult to find. I asked how Agilitech has been able to build scalability into their product line. Dennis described their current sizes, which range from 10 L to 650 L. He shared that their 10 L size has a minimum working volume of 1.25 L, which allows a process application such as a highly concentrated TFF step to be run in that mixing vessel. Additionally, because Agilitech offers a total of seven different sizes, customers can scale up their process for additional capacity using identically designed equipment. Levitronix is working on motors and impeller sizes that would permit Agilitech to offer up to a 3,000 L size in the future.
Easy Process Implementation
Next, I asked Dennis if he could talk about process implementation. He explained that each vessel comes with its own easy-to-use touchscreen interface that allows users to locally start and stop the motor, change the agitator speed setpoint, and read local measurements for weight, temperature, and other connected process analytics. Also, by using standard industrial hardware and standardizing on up-to-date, future-proof communication protocols such as Ethernet IP, these systems can very easily be integrated into existing control systems.
Lastly, I closed the interview by asking Dennis and Phil if they had anything else that they wanted to share with listeners. Dennis said that the industry has gotten too used to there being too few options in the market and perhaps have started to shape their processes to align with what is currently available. At Agilitech, the aim is to shift that perception and put the power back with the end user to define their equipment requirements that meet their process instead of vice versa.
For example, changing the inlet and outlet size of your single-use mixing bag to better fit your process shouldn't increase the costs and time to receive that bag significantly. End users should remain rigid in their requirements and process. Vendors should be the ones that remain flexible to meet the needs of the user.
Phil added that the solution should be plug and play to some extent. It shouldn't be a monumental effort to implement a new solution in an existing process. A plug and play approach allows companies to get to market faster and to source components quicker.
To learn more, please visit Agilitech Single-Use Mixers.
In this podcast, we spoke with Emanuel Krobath, Biopurification Specialist and Chiara Pacini, Bioprocess Specialist both with Pall Corporation about gene therapy process development including challenges and resources that are available for support.
I began the discussion by asking Emanuel and Chiara to tell listeners a little bit more about their jobs and how they support gene therapy developers on the bench. Emanuel started by saying that as a bioprocess product specialist, he performs customer bench case studies at the customer site, specifically for the downstream process including vaccines, recombinant proteins, monoclonal antibodies and gene therapy products. He shared that the customers he works with are usually in preclinical or Phase I studies and he supports them from clarification to the final sterilizing grade filtration. This scale up, optimization, and technical support is offered free of charge to help customers succeed in their process development. He said that he also finds new technologies and ideas for the Pall R&D team during these visits.
Chiara shared that she supports customers from bench scale studies through the manufacturing process on downstream starting from clarification to sterile filtration. She spends most of her time traveling to her customers’ laboratories or manufacturing sites to provide general support, conduct optimization studies and technical support training to find the best practice or membrane selection for their process.
I then asked if they could share the most common questions they get from their customers. Emanuel said that one of the most common is what size filter is needed for a specific product and which material is best to use. Chiara said that for her it is how to intensify a process or make it more robust across clarification, TFF, chromatography, and membrane filtration.
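Filter sizing questions like Emanuel's are commonly answered with the Vmax method, a standard industry approach (not necessarily the one Pall applies in every case): a short constant-pressure trial is fitted to t/V = t/Vmax + 1/Qi, and the fitted slope gives the maximum throughput per unit area. The trial data below is made up for illustration.

```python
# Sketch of the Vmax filter-sizing method: under constant pressure,
# gradual pore plugging follows t/V = (1/Vmax) * t + 1/Qi, where
# V is cumulative volume per filter area. Fitting t/V against t gives
# slope = 1/Vmax and intercept = 1/Qi. Trial data is illustrative.

def vmax_fit(times_min, volumes_L_per_m2):
    """Least-squares fit of t/V vs t; returns (Vmax, Qi)."""
    n = len(times_min)
    y = [t / v for t, v in zip(times_min, volumes_L_per_m2)]
    mean_t = sum(times_min) / n
    mean_y = sum(y) / n
    slope = (sum((t - mean_t) * (yi - mean_y)
                 for t, yi in zip(times_min, y))
             / sum((t - mean_t) ** 2 for t in times_min))
    intercept = mean_y - slope * mean_t
    return 1.0 / slope, 1.0 / intercept  # Vmax [L/m^2], Qi [L/m^2/min]

def area_needed(batch_volume_L, vmax, safety_factor=1.5):
    """Minimum filter area for a batch, with a typical safety factor."""
    return safety_factor * batch_volume_L / vmax

# Synthetic trial generated from Vmax = 100 L/m^2, Qi = 10 L/m^2/min.
times = [1.0, 2.0, 5.0, 10.0]
vols = [t / (t / 100.0 + 0.1) for t in times]
vmax, qi = vmax_fit(times, vols)
area = area_needed(500.0, vmax)  # 500 L batch -> 7.5 m^2
print(round(vmax), round(qi), area)
```

In practice the trial is run on a small disc of the candidate membrane, which is also where the "best material" question gets answered: the same fit on different membranes ranks their plugging behavior with the actual feed stream.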
We also talked about a series of videos on Pall's website and how these were created to help translational academics who work in gene therapy. Emanuel explained that they wanted to support academia specifically in scale-up and small-scale process development, because in academia the user will often take the first filter available at their site. It is important that they understand and have the support to select the correct filter for their product, so that the process is optimized at manufacturing scale. Chiara agreed that the videos were designed to show that Pall can support the development process not just at manufacturing scale, but also for initial bench-scale studies. This, together with the initial optimization study that Pall performs with the customer, ensures scalability to large-scale processes and identifies the critical process parameters needed to reach high yield and productivity.
Next, we discussed what they like most about the work that they do. Chiara described how being a bioprocess specialist gives her the opportunity to meet the people in both large and small companies who are working on these therapeutics. She enjoys supporting the development of different molecules and gene therapies and is always updated on the latest techniques used for gene and cell therapy. Emanuel said that he enjoys traveling, which is important because visiting customers in person is a big part of his job. He added that it never gets boring since he is supporting customers as they deal with very diverse processes and challenging problems. His favorite part of the job is that basically they are doing scientific work at the frontline, and he saw this to an even larger extent during the COVID pandemic as they were involved in nearly every vaccine process development.
I followed up by asking which projects they were most proud of. Emanuel said that despite the exponential growth in demand for plasmid DNA, which is used either as a template for mRNA vaccines or as the active molecule in DNA vaccines, the upstream and downstream processes have not yet been optimized. Now, a couple of very eager Pall scientists, including Emanuel, are optimizing the plasmid DNA process map, particularly the clarification step, as this seems to be the most challenging.
Since Pall and Cytiva are two sister companies under the Danaher umbrella, the goal for this project is to provide a complete process map for the upstream and downstream solely using Pall and Cytiva products.
Chiara agreed that they are focused on the plasmid platform due to the COVID pandemic. Last year she worked with an Italian customer to develop a COVID vaccine with plasmid DNA to mRNA in a liposome carrier in the clinical stage. She also worked with Pall colleagues and other companies under the Danaher portfolio, Precision Nanosystems and Cytiva, on a global project.
I then asked what they saw as the most difficult aspects of process development, purification, and downstream processing of viral vectors. Chiara said that she feels the most challenging part is achieving a high virus titer and active particles. For example, the AAV viral vector process they work on is very robust and established compared to the lentiviral process, which is still very challenging because of stability issues; pressure, temperature, and shear stress all have to be managed. Over the last 2-3 years, the trend has been moving toward pDNA and mRNA.
I closed the podcast by asking if they had anything else they would like to add for listeners. Emanuel said that filtration is not a trivial step and should be carefully thought through, because the product may turn out to be a game changer down the line. Studies done during the academic stage could save a lot of money and time during process development.
Chiara added that she joined Pall to contribute to the development of gene therapies, and she has worked on COVID vaccines, cancer therapies, and personalized medicines. She sees personalized gene therapies for cancer treatment as a huge target for academia and the emerging cell therapy field.
Learn from experts and discover how using scalable manufacturing tools can accelerate your gene therapy developments
In this podcast, we spoke with Cory Hinz, Engineering Manager at Asahi Kasei Bioprocess about the different methods that are available for liquid chromatography mobile phase solutions and the benefits of inline blending. Cory also describes how to implement binary blending feeding of a liquid chromatography process using inline blending.
Liquid Chromatography Mobile Phase Solutions
I began the discussion by asking Cory if he could tell listeners about the different methods that are available for liquid chromatography mobile phase solutions. He explained that for chromatography, it's important to remember that the process and the chemistry should drive the method used. Some chromatography processes use pre-prepared mobile phase solutions that don't require inline mixing, while others blend two or more solutions together to formulate the mobile phase, changing the blend's composition over the course of the elution. Each of these methods is driven by the needs of the process.
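To make the second approach concrete, a gradient elution can be described as a time-varying fraction of the second solution in the blend. The sketch below assumes a simple linear gradient; the function name, parameters, and the linear profile are illustrative assumptions, not a specific vendor implementation.

```python
def gradient_fraction_b(t, t_start, t_end, b_start, b_end):
    """Fraction of solution B in a two-solution mobile phase at time t,
    assuming a linear gradient ramp (clamped outside the ramp window).
    All names and the linear profile are illustrative assumptions."""
    if t <= t_start:
        return b_start
    if t >= t_end:
        return b_end
    return b_start + (b_end - b_start) * (t - t_start) / (t_end - t_start)

# Midway through a 0-10 minute ramp from 5% B to 50% B:
print(gradient_fraction_b(5, 0, 10, 0.05, 0.5))
```

In a real gradient method the profile may be stepped or curved rather than linear; the point is only that the blend composition is a scheduled function of elution time.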
Inline Blending
Next, I asked Cory if he could tell us about the benefits of inline blending. He said that inline blending allows solutions to be prepared at their point of use, not just for chromatography processes, but any blending process. This increases the consistency of the blended solution, reduces dependency on the accuracy of raw materials, allows for real time quality assurance, and eliminates the risks and extra resources and space required for traditional tank approaches. Inline blending also adds an element of flexibility, allowing functions such as buffer preparation to become more of a utility than an additional process.
Binary Blending Feeding of a Liquid Chromatography Process using Inline Blending
Cory then provided details about how to implement a specific solution for binary blending feeding of a liquid chromatography process using inline blending. He explained that binary blending is the most common configuration they see for their chromatography equipment customers, because medium and high-pressure liquid chromatography require a dedicated pump to supply the pressure dictated by the process. It is important to design the binary blending at the suction side of the pump. This is done by employing two modulating control valves, one for each of the two components of the mobile phase, and ensuring sufficient supply pressure to each one.
He then told us about the role that each of the valves play in creating the ideal blend. He described how the control valves do most of the heavy lifting for binary blending. The first valve controls the diluent, which will be the purified water or buffer that comprises the majority of the mobile phase blend. The second valve controls the component that is getting diluted. These valves each react to a different process parameter to achieve high accuracy.
The second valve is the more intuitive of the two: the proportion of the component being diluted increases or decreases with the movement of the control valve. For example, if the concentrate is below target, the valve opens to allow more concentrate through. Control can be based on flow, conductivity, or any critical process parameter that can be measured inline.
The first valve is less intuitive. It is controlled by the pressure in the system after the two streams have combined. If the blending pressure is too low, for example, the valve will open to increase that pressure.
The result of this configuration is that the two valves react to one another via the process but are not linked by a system control algorithm. This provides flexibility and accuracy, and also offers a way to monitor and mitigate pump cavitation.
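The two-valve behavior described above can be sketched as two independent proportional loops that are coupled only through the process itself, never through a shared algorithm. This is a minimal sketch under stated assumptions; the function names, proportional-only control, and gain values are illustrative, not AKBA's actual control scheme.

```python
def step_concentrate_valve(position, measured_conc, target_conc, gain=0.05):
    """Second valve: opens when the measured concentrate level is below
    target and closes when above. Position is clamped to 0..1.
    Illustrative proportional-only loop, not a vendor implementation."""
    position += gain * (target_conc - measured_conc)
    return min(1.0, max(0.0, position))

def step_diluent_valve(position, blend_pressure, target_pressure, gain=0.02):
    """First valve: reacts to post-blend pressure, not concentration.
    Opens when the blending pressure is too low, closing the gap."""
    position += gain * (target_pressure - blend_pressure)
    return min(1.0, max(0.0, position))
```

Note that neither function reads the other valve's position: each loop sees only its own process variable, and any interaction between them plays out through the blended stream, which matches the "reacting via the process" behavior described above.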
Next I asked Cory about controlling the incoming process pressures at each of the valves. He said that in order for the binary blending scheme to work optimally, the incoming supply pressures of each stream should be controlled to prevent fluctuations that can disrupt the automatic blending control. To accomplish this, they employ a booster module on each of the two inlet streams. They have found that this additional feature successfully isolates the system from fluctuations in supply due to facility or other environmental effects. He went on to say that it is also convenient to match the two supply pressures to one another, because that creates an identical pressure drop over both valves and a more intuitive blending scheme.
He then explained how the booster modules work together to maintain pressure. The booster modules are identical and are each composed of a recirculation loop driven by a load pump. Once primed, the load pump runs at a constant speed and directs flow both through the recirculation circuit and to the control valve. By placing an air-augmented back pressure regulator on the recirculation line, the system can automatically control the effective pump discharge pressure, which is the supply pressure that each valve sees. The air-augmented design of the back pressure regulator means that the system can target the desired pressure set point by modulating the supply air to the regulator.
I closed our session by asking Cory about other formulation challenges that he gets asked about. He said that not all applications require binary blending; some require combinations of three or more components. This makes system control more complex, but it is a familiar application for Asahi Kasei Bioprocess. They have tackled it several times in liquid chromatography applications, but the most common implementation of multi-component blending is buffer formulation. Their line of MOTIV™ systems is specifically designed to achieve highly accurate multi-component blends using their proprietary Pro-Yield™ technology. Both their chromatography and MOTIV formulation systems run on their state-of-the-art, Industry 4.0 process control software known as OCELOT™ System Control.
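As a rough illustration of what multi-component blending must solve, the flow fraction of each concentrate stream follows from a simple mass balance against the target buffer recipe. This sketch assumes ideal mixing and no volume change on blending, and all names are illustrative; it is not Pro-Yield's actual algorithm.

```python
def blend_fractions(targets, stocks):
    """Flow fraction for each concentrate stream so the combined stream
    hits the target concentrations; the remainder is diluent (e.g. WFI).
    Assumes ideal mixing and no volume change. Illustrative sketch only."""
    fractions = {name: targets[name] / stocks[name] for name in targets}
    diluent = 1.0 - sum(fractions.values())
    if diluent < 0:
        raise ValueError("stock concentrations too dilute to reach targets")
    fractions["diluent"] = diluent
    return fractions

# e.g., a 0.15 M NaCl / 0.02 M Tris buffer from 5 M and 1 M stocks:
f = blend_fractions({"NaCl": 0.15, "Tris": 0.02}, {"NaCl": 5.0, "Tris": 1.0})
```

Each added component adds one more stream to meter, which is why control complexity grows with component count even though the underlying arithmetic stays simple.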
LEARN MORE ABOUT AKBA’S BIOPROCESS CHROM SYSTEMS
In this podcast, we interviewed Katie Keller, Director of Quality and Safety at Asahi Kasei Bioprocess America, about the importance of quality management and how to achieve the best possible results. Topics included the most critical elements of quality management, how to ensure the purchase of high-quality equipment, and future trends.
I started the conversation by asking Katie what she thought were the most critical elements of quality management. Katie replied that a holistic approach to quality is best for any organization. It used to be that the quality unit was considered responsible for product quality, making all the decisions and driving all the improvements, but that's not really the case today. She feels the most successful approach is that, since quality is so important, everyone should be responsible for it. When all employees understand how they contribute to product and service quality, and therefore to customer satisfaction, there is more buy-in throughout the organization. People are empowered to take responsibility for improving the processes they manage. This total quality management is achieved by clearly defining how each process interacts with the others, ensuring employees understand that, and then setting the expectation that quality is achieved at every level of the organization, with everyone playing a part.
I then asked Katie what bioprocess equipment customers should be looking for to ensure they are purchasing high quality equipment. She told me that across industries, it's common for customers to search for suppliers with robust quality management systems. As a supplier, Asahi Kasei Bioprocess America (AKBA) can minimally prove this by achieving and advertising certification to ISO 9001. This shows that Asahi Kasei meets the minimum expectations for a manufacturing company to provide quality products and services, but it really doesn't stop there. If they can show their customers that they have well designed, thorough processes that are continually improving, this naturally leads to better quality products, and customers gain confidence in their ability to meet ongoing needs.
I continued the discussion by asking if she could talk a bit about ISO certification and why it's an important part of their quality management system. Katie explained that ISO 9001 really is the minimum. Their customers in the pharmaceutical industry might stop and look when they see the ISO certification, but what really brings them confidence and satisfaction are the ways Asahi Kasei goes above and beyond this. For AKBA, ISO certification is not just words on a page, there is a reason why every requirement in that standard exists. Katie shared that she believes it is her job to interpret this in a way that means something to her organization, so they can not only live it but improve upon it and take the next step. She elaborated on her point by saying that it is how you build upon those minimum criteria that truly shows a customer who you are and what is important to you as an organization. This is how a company can start to build that quality culture where the employees believe in the message that customer satisfaction, both internal and external, comes first.
I asked her about how these quality management systems affect the design and build of their equipment and how they have an impact beyond the quality management systems. Katie said that having ISO as a guideline is helpful for this, especially if they need to create or revamp a process. Asahi Kasei Bioprocess starts by asking what ISO requires to get a baseline and then looks at what their customers’ expectations for safety, quality, and productivity are. She explained that by keeping both these things in mind, they can create robust processes with controls or checkpoints to ensure they are satisfying all the requirements.
However, that example is at the front end of creating a new process; a robust quality system also ensures you have a mechanism to continue to learn. AKBA uses the data collected from previous equipment builds or customer facing activities to apply lessons learned to future projects. These lessons can come in the form of data compiled from nonconforming product reports, customer feedback, and reflection meetings, which are all incredibly important pieces of their quality system. Lessons provide inputs for future process and product improvements. In this way they are always learning, growing, and therefore continually improving their equipment design as well as the customers' experience with it.
Next, I asked her what trends she sees in quality management going forward in the industry and what it might look like in five to ten years. Katie responded that a digital transformation is underway; even less than ten years ago, many companies were still transitioning from paper-based quality management systems to electronic systems, and now everything is in the cloud. Moving forward, she feels we can expect full digitization across the quality systems of all kinds of organizations, both large and small, and the new norm could be interacting digitally through cloud-based portals instead of emails. Another example could be documentation that is digitally scanned and accepted rather than paper-based packing lists attached to shipments.
Additionally, she thinks the implementation of AI is growing at a rapid pace in manufacturing and this will result in the automation of more quality judgments. She went on to say that there has been lots of talk in the quality sphere about quality 4.0 and whether quality professionals will be out of a job soon, but she strongly believes there will still be a need for quality professionals to advise on ways to grow a business using quality tools and concepts. She said that we must make sure we can evolve with the times, but skills like problem solving and process improvement are still innately human skills that will always be needed. By continuing to keep people connected to each other and engaged in the quality system through the total quality management approach, we can continue to build a culture where everyone is responsible and accountable and motivated to keep improving. To sum up her answer she said, “I believe that when you have everyone in an organization living and breathing a unified message for quality, you can really do some great things, and I can't wait to see how far we'll have come in 10 years”.
I closed the interview by asking her if she had anything else to add for listeners. She added that with the rapid pace that everything is changing right now, especially in these certain industries like manufacturing and pharmaceuticals, it's an exciting time. She thinks the more that we can embrace the change, the greater things we can do. At Asahi Kasei Bioprocess, they are always innovating and trying to meet customers' needs for tomorrow. She thinks really getting behind that idea with an open mind and supporting employees internally so they have the empowerment and the mechanisms to be successful will be critical moving forward.
To learn more about Asahi Kasei Bioprocess America’s products and services, please visit: https://fluidmgmt.ak-bio.com/
In this podcast we spoke with Klaus Kienle, Global Product Manager for the Mixing portfolio at Pall Corporation about the latest mixing technologies including single-use solutions, the need for increased flexibility, and a more standard vendor agnostic approach.
The Role of Mixing in Biomanufacturing
I started the conversation by asking Klaus if he could talk about the role that mixing plays in biomanufacturing and current challenges in this area. Klaus explained that mixing is an omnipresent process: it starts with upstream buffer and media preparation and ends in fill and finish. It is an important part of manufacturing across several modalities, including monoclonal antibodies, mRNA based vaccines, gene therapies, and various other processes. Across these applications, flexibility is key, and it is also the primary challenge for the future. He continued by saying that Pall customers have expressed that they want increased flexibility, better lead times, and less supplier dependency in the future.
Advancements in Mixing
Next, I asked about the latest technological advancements in mixing. Klaus stated that the latest advancements are moving towards tackling the flexibility challenge, which means supplying solutions that are available with shorter lead times and are more vendor agnostic, so they fit with other vendors’ manifolds and full sets. This is consistent with the recent launch of the Allegro™ Ready Standard Solutions from Pall, which is not only limited to mixers, but also includes storage transfer sets and other segments. Pall has launched this new standard set ranging from a 30 liter mixer up to a 3000 liter mixer.
I continued the discussion by asking if he could talk a little bit more about some of the additional advantages of this set of new standards. He described how these standards are ready to go, so if a customer asks for a manifold, there is no time spent generating a drawing or waiting for pricing to come back, resulting in a short lead time. Pall is working towards having these standards available off the shelf, reducing lead time further with availability in the range of single digit weeks, depending on manufacturing and where the customer is located. He shared that Pall has invested $1.5 billion to increase capacity and reduce lead times.
Single Use Mixing
I then asked him about some of the remaining challenges that exist with single use technologies. Klaus explained that one of the main challenges that remains in single use is sustainability, especially since it consists predominantly of plastic components. However, there was a recent publication in New Biotechnology, authored by biopharmaceutical companies, “Streamlined life cycle assessment of single-use technologies in biopharmaceutical manufacture.” It makes the case that single use technology provides better sustainability in the biopharmaceutical process because it allows customers to run more intensified processes, thereby increasing the efficiency per consumable. Single use technologies also support a closed process and reduced clean room requirements, resulting in lower energy requirements.
This is in line with the new standard designs from Pall, where the filtered product line is fully closed and processing ready. For instance, the powder port on these designs now ensures a closed and controlled environment. This allows bioprocessing companies to lower the cleanroom environment requirements, which translates to significant energy savings.
Modular Mixing Approach
I continued our discussion by asking Klaus about a recent white paper published by Pall and Lonza that highlighted a modular mixing approach. He described how a modular approach can give customers the flexibility they are looking for to adapt to new requests, especially in the contract manufacturing organization (CMO) environment. Global CMOs produce product for developers and, as a result, frequently change the product they are manufacturing. Their facilities need to be set up with the greatest degree of flexibility to adapt to changing needs as well as short notice production demands. For example, the need to respond on short notice was clearly seen in the push to produce the COVID-19 vaccine. If a CMO gets such a request and has built its facility with a modular approach, systems can be interchanged easily. This would allow them to adapt quickly from a monoclonal antibody process to an mRNA process. A modular approach can also save companies on CAPEX, as some consumables they will already have in stock. The paper illustrates the ease of use of Pall mixer systems and their flexibility, which allowed a fast changeover without buying new equipment.
What to Consider when Selecting a Mixing System
Next Klaus provided advice for listeners about what end users should consider when selecting a mixing system. He replied that some suppliers currently promote one-size-fits-all solutions as the best option, but the reality is that most processes are so unique and so variable that one size fits all doesn't work. Pall takes a different approach by offering four different mixers, providing a variety of solutions depending on the process. For example, if a customer needs a very powerful mixer, Pall offers the magnetic mixer, but if they need a shear sensitive one, they recommend the LevMixer® system. Because they utilize the same mixing tank, a simple change of the drive unit makes the equipment ready for either the high power demand of the magnetic mixer or the shear sensitive mixing of the LevMixer system. This flexibility allows customers to quickly adapt their entire facility towards a specific process without exchanging the big capital systems.
Before we wrapped up the interview, I asked Klaus if he had anything else that he'd like to add for listeners. He told me that the Danaher philosophy is listening to the voice of the customer. Pall embraces this and takes it very seriously. This means that Pall reaches out to customers to ask what their specific needs are and how they can improve. One key slogan within Danaher is continuous improvement and Pall has applied that to their product portfolio by continuing to work on providing flexibility and interchangeability, not just within Pall, but future interchangeability between several technologies in the market. He added that his personal goal is to deliver the next level mixing experience, which in some cases is not the next level technology. It might be the next level of flexibility instead of the technology, but in the end, the bottom line is that the customer is experiencing happy mixing.
For more information, please see the white paper Trends in Single-Use Mixing Technologies for Biomanufacturing