
In this article, Simon Gane looks forward to what the future holds, on the presumption he survives.

 

Setting aside questions of whether the UK will even exist, whether the NHS will still be working, or whether we will be commuting to our jobs in submarines as climate storms tear the surface world apart, rhinology will still be much the same in 15 years and radically changed in 50.

The near future: 15 years from now

Right now, rhinology is on the edge of a revolution in our understanding of chronic inflammatory disease. Research into disease mechanisms in the shared airway diseases, chronic rhinosinusitis (CRS) and asthma, has improved our understanding of the underlying disease process in CRS and brought the dawning realisation that we are talking about multiple diseases with a common presenting syndrome (call these ‘endotypes of the phenotype’ for maximum academic style points, and make sure to use the term ‘precision medicine’ as if no one had thought of using biomarkers or other tests to predict treatment response before). This revolution is important because of the new breed of highly targeted therapies enabled by monoclonal antibody technology.

As our understanding of the underlying mechanisms of inflammation and disease progression advances, so too does our ability to target these pathways, blocking or activating key steps in the disease biology.

“The computational capacity at the fingers of even the lowliest ENT surgeon will enable marvels we can only imagine.”

This pathway analysis doesn’t only apply to inflammatory disease. We are not far away from being able to analyse the RNA transcripts expressed in each cell of a tumour cell population (the ‘transcriptome’) for the mechanisms of therapeutic resistance in each cancer. This will allow us to tailor the chemotherapeutic regimen to target the pathway’s weak points. Rather than a macroscopic identification of the tumour type correlating with the likely oncogenic mutation, we will know precisely what mutations the tumour has ‘up its sleeve’. Imagine culturing the cells from a biopsy and testing multiple therapeutic strategies against them, in much the same way as is currently done in microbiology, while alongside these cell lines exposing organoid cultures of the patient’s own cells to simultaneously check the effect of those treatments on the host.

The explosion of this ‘omic’ (genomic, transcriptomic and proteomic) information has necessitated the development of tools to deal with massive amounts of ‘big data’. These statistical and computational techniques go by names such as ‘machine learning’ and ‘artificial intelligence’, so as to make statistics seem sexier than it really is. By collecting data on a whole population, rather than a sample, the need for statistical inference testing can be removed and we may achieve new insights into epidemiology, although the vaunted ‘hypothesis-free science’ excitement of the early Google Flu Trends work has not been borne out in later years [1].

Big data will not exist just at the population level; we will have access to amazingly fine-grained information about our patients and their behaviours and symptoms. If current trends continue, hoards of personal information could be analysed not just by the NSA but by the NHS, looking for symptoms, signs and behaviours. What if every treatment cohort were automatically analysed to perform prospective age-matched control studies, or every natural experiment arising from the normal variation in treatment between healthcare professionals were monitored and learned from?
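As a purely illustrative sketch of what such automatic cohort analysis might look like, the snippet below greedily pairs each treated patient with the closest-aged untreated control from routinely collected records and compares an outcome score between the matched groups. All field names and data here are hypothetical, and a real analysis would need far more careful matching, confounder adjustment and information governance.

```python
# Minimal, purely hypothetical sketch: age-matched treated-vs-control comparison
# built from routinely collected records, rather than a specially designed study.
import random
from statistics import mean

random.seed(0)

# Hypothetical routine records: age, whether the treatment was given, and an
# outcome score (e.g. a symptom questionnaire result). Real records would come
# from the electronic patient record rather than a simulation.
records = [
    {"age": random.randint(18, 85),
     "treated": random.random() < 0.4,
     "outcome": random.gauss(50, 10)}
    for _ in range(1000)
]

treated = [r for r in records if r["treated"]]
controls = [r for r in records if not r["treated"]]

# Greedy 1:1 nearest-neighbour matching on age alone (no confounder adjustment).
pairs = []
for patient in sorted(treated, key=lambda r: r["age"]):
    if not controls:
        break
    match = min(controls, key=lambda c: abs(c["age"] - patient["age"]))
    controls.remove(match)
    pairs.append((patient, match))

mean_difference = mean(p["outcome"] - c["outcome"] for p, c in pairs)
print(f"{len(pairs)} matched pairs; mean outcome difference {mean_difference:.2f}")
```

Greedy age matching is only one of many possible strategies; the point is simply that, once such data are captured routinely, comparisons like this could run continuously in the background rather than being set up as one-off studies.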

 

 

Some of the first of these experiments may track the introduction of slimline single-port surgical ‘robots’. As mentioned in Neil Sharma’s article, ‘The future of head and neck cancer surgery’, surgical robotics is already transforming the management of certain head and neck diseases. At the moment these devices are too bulky to have much application in rhinology, but before long the limitations of access beyond the neurovascular bundles in skull-base surgery will be overcome with single-port ‘wristed’ instruments, opening up the undersurface of the brain in new ways.

In 50 years

These new ways will continue over the subsequent 35 years to places that will seem obvious to our descendants but amazing to us. Assuming we are not back to practising rhinology with the aid of mirrors and the sun, using a sharp stick in a smoking crater, the next 50 years will be full of miracles. Richard Feynman, in his canonical lecture, ‘There’s Plenty of Room at the Bottom’, asked his audience to imagine a microscopic surgeon operating at the molecular level [2]. Perhaps in half a century this surgeon will have replaced our clumsy procedures. Certainly, if Moore’s law continues to hold, or even if it slows, the computational capacity at the fingers of even the lowliest ENT surgeon will enable marvels we can only imagine.

We can predict nanometre-sized sensors giving a high-definition, cellular-level map of a patient, feeding a real-time computer simulation of the target physiology and allowing multiple in silico trials of therapeutic responses. Imagine a real-time simulation of the forces involved in a healing nose after a septorhinoplasty, predicting the outcome of each precise suture placement and tension, every incision’s placement and angulation. The promise of quantum computing is that all of the variables could be calculated at the same time, allowing real-time guidance of the surgical technique. That technique will almost certainly be delivered by an actuator on a device rather than a human hand. These devices may not be fully autonomous, but will combine the predictive power of intelligent systems with the knowledge and skill of a human operator. Already we are seeing ‘centaur’ chess, in which teams combining human and machine strategy play against each other, become the most powerful form of the game.

It’s unlikely that the human part of the centaur will be sitting in front of a computer screen. The power of these approaches is very likely to be channelled through augmented and virtual reality display technologies. Augmented reality (AR), the overlay of a ‘heads-up display’ of visual information on the real world, can provide additional ‘senses’, detecting ultrasound and light wavelengths outside the human visual spectrum. Instead of looking at an image guidance screen during complex skull-base surgery, the operator could look at the operative field and see the imaging overlaid, with important anatomical areas identified and coloured differently.

“The ultimate role of the physician is to be a human contact for the patient in a world outside of their usual understanding.”

The human half of the centaur may not even need to be in the same geographical location as the patient. One of the most exciting possibilities opened up by this technology is telesurgery, the ability to perform operations from thousands of miles away. Most of this article is about rhinology in the first world, but cheap telecommunications and a huge developing-world population moving into the middle class mean enormous demand for the skills of any highly trained professional throughout the whole world.

The human part of the centaur approach will always have a place in medicine, as long as we have human patients. The ultimate role of the physician is to be a human contact for the patient in a world outside of their usual understanding. Our job of understanding evidence, gathering experience, and synthesising and translating this to improve our patients’ health will arguably become more important in the increasingly specialised and technological future. I look forward to seeing it.

 

References

1. Butler D. When Google got flu wrong. Nature 2013;494:155.
2. Feynman RP. There’s Plenty of Room at the Bottom. Engineering and Science 1960;23(5):22-36.

 

Declaration of Competing Interests: SG is PI for an industry-sponsored trial of a monoclonal antibody in CRSwNP.

CONTRIBUTOR
Simon Gane

FRCS(ORL-HNS), Royal National Throat, Nose and Ear Hospital, London, UK.
