
Evolution of Medicine

Life expectancy

Since 1850, life expectancy has more than doubled, mostly due to infection control. Improvements in sewage management, clean water, access to medical help, vaccination, and anti-bacterial drugs have transformed public health.

The COVID-19 pandemic was a reminder of the danger of infectious disease. It pushed hospitals to the limit and resulted in about 7 million deaths worldwide, in spite of lockdowns and the quickest vaccine development in history. COVID-19 significantly reduced life expectancy.

Women live about 6 years longer than men: 80 years vs. 74 years.

Men of height 5'7" (170.2 cm) or less lived 7.46 years longer than those of at least 6'1" (182.9 cm).

pubmed.ncbi.nlm.nih.gov/1600586

Cause of death 

The top 10 causes of death in 2019, before COVID, were:

US                        World

Cardiovascular            Cardiovascular
Cancer                    Cancer
Respiratory               Respiratory
Alzheimer's               Digestive
Digestive                 Lower respiratory
Kidney                    Neonatal
Lower respiratory         Enteric infections
Diabetes                  Alzheimer's
Liver                     Diabetes
Illegal drugs             Diarrheal

The World column shows that infections and neonatal care are still major issues in poorer countries. The biggest killers are the same rich or poor: cardiovascular disease, cancer and respiratory disease. The US has notably higher rates of cancer, Alzheimer's and kidney disease.

Barriers to progress

Between 1999 and 2016, there was significant progress in preventing:

        heart disease, through early detection, cholesterol management and routine bypass surgery

        stroke, through blood pressure and arrhythmia control

        lung infections, through reduced smoking

 

Becoming more prevalent are:

        Alzheimer's

        accidents from risk-taking encouraged by social media

        drug overdoses from the opioid explosion

        lower respiratory disease

        kidney disease

        suicide

        diabetes, driven by rising obesity

 

Total deaths from cancer have not changed.

Cancers

The top cancer killers are: lung, colon, pancreatic, breast, prostate, leukemia, non-Hodgkin lymphoma, liver, and bladder.

The cancers with improved survival are: colon, lung, breast, prostate, stomach, and cervical. Most of these have routine early screening programs.

The cancers with poorer survival are: liver, pancreatic, esophageal, bladder, kidney, brain, and gallbladder. Early detection is a problem in these cancers. The most common of these, and the one getting worse, is pancreatic cancer. By the time pancreatic cancer produces symptoms, it usually has already metastasized.

 

 

Role of poverty 

Research by Alice Chen, Emily Oster, and Heidi Williams (2016) found that “the observed higher US postneonatal mortality relative to Europe is due entirely, or almost entirely, to higher mortality among disadvantaged groups.” Not only is infant mortality higher in the US; maternal mortality is also much higher. And while maternal mortality is falling in almost all countries in the world, the death of mothers is becoming more common in the US.

 

Inequality in life expectancy is large in the US: the gap between the poorest 1% and the richest 1% is 14.6 years, and it is getting worse.

Changes in the mortality of the young have a much larger impact on average life expectancy than changes in the mortality of the old.

The US has twice the health expenditure of the UK, yet three years' worse life expectancy. Combined with data showing that the lowest 25% of incomes have seven years' worse life expectancy than the highest 25%, the ineffectiveness of the US health care system is clear.

The factors behind the UK's better life expectancy than the US, measured in deaths per 100,000, are:

obesity: Diff = 46

opioid addiction: Diff = 11.3

road accidents: Diff = 7

suicides: Diff = 6

homicides: Diff = 5

https://ourworldindata.org/us-life-expectancy-low

The takeaways for US residents are the importance of following all the screening protocols, getting regular exercise, avoiding smoking and opioids, and controlling weight. The biggest hidden dangers appear to be pancreatic cancer and Alzheimer's.

[Figures: life expectancy; share of deaths by cause; causes of death; total cancer deaths by type; the life-expectancy gap in the United States; life expectancy vs. health expenditure]

History of Medicine      

Traditional medicine in China and India focused on herbal medicines as natural drugs. Hippocrates formalized early medical ideas, such as his famous oath. Over time, understanding grew of anatomy and of how the body's functions interact.

Islamic medicine understood the value of clean practices long before Western medicine. Al-Tasrif, written by the surgeon Abu Al-Qasim Al-Zahrawi, was translated into Latin; it became one of the most important medical texts in European universities during the Middle Ages and contained useful information on surgical techniques and the spread of infection.

In the Middle Ages, religious concepts dominated over organized medicine.

The Renaissance brought an intense focus on scholarship to Christian Europe. A major effort to translate the Arabic and Greek scientific works into Latin emerged. Europeans gradually became experts not only in the ancient writings of the Romans and Greeks, but in the contemporary writings of Islamic scientists. During the later centuries of the Renaissance came an increase in experimental investigation, particularly in the field of dissection and body examination, thus advancing our knowledge of human anatomy.

At the University of Bologna, Julius Caesar Aranzi (Arantius) (1530–1589) became Professor of Anatomy and Surgery in 1556 and established anatomy as a major branch of medicine for the first time. Aranzi combined anatomy with a description of pathological processes, based largely on his own research, Galen, and the work of his contemporary Italians. His books (in Latin) covered surgical techniques for many conditions, from hydrocephalus, nasal polyp, goitre and tumours to phimosis, ascites, haemorrhoids, anal abscess and fistulae.[134]

 

Sanitary impact

In 1847 in Vienna, Ignaz Semmelweis (1818–1865) dramatically reduced the death rate of new mothers (due to childbed fever) by requiring physicians to clean their hands before attending childbirth, yet his principles were marginalized and attacked by professional peers.[161] At that time most people still believed that infections were caused by foul odors called miasmas.

A major breakthrough in epidemiology came with the introduction of statistical maps and graphs. They allowed careful analysis of seasonality in disease incidence, and the maps allowed public health officials to identify critical loci for the dissemination of disease. John Snow in London developed the methods. In 1849, he observed that the symptoms of cholera, which had already claimed around 500 lives within a month, were vomiting and diarrhoea. He concluded that the source of contamination must be through ingestion, rather than inhalation as was previously thought. It was this insight that resulted in the removal of the handle of the Broad Street pump, after which deaths from cholera plummeted.
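
As a toy illustration of the kind of spatial tallying Snow's map made possible, the Python sketch below counts cholera cases by nearest pump and flags the hotspot; the pump positions and case records are invented for the example, with only the Broad Street pump taken from the story above.

from collections import Counter
from math import dist

# Hypothetical coordinates for three neighbourhood pumps (illustrative only).
pumps = {"Broad Street": (0.0, 0.0), "Rupert Street": (3.0, 1.0), "Warwick Street": (-2.5, 2.0)}
# Hypothetical case locations plotted from house-by-house records.
cases = [(0.2, -0.1), (0.5, 0.3), (-0.3, 0.2), (2.8, 1.1), (0.1, 0.4), (-0.2, -0.3)]

def nearest_pump(case):
    # Each household is assumed to draw water from its closest pump.
    return min(pumps, key=lambda name: dist(pumps[name], case))

counts = Counter(nearest_pump(c) for c in cases)
print(counts.most_common(1))   # -> [('Broad Street', 5)]: the pump to investigate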

The breakthrough to professionalization based on knowledge of advanced medicine was led by Florence Nightingale in England, and her Nightingale Training School opened in 1860 and became a model. 

During the U.S. Civil War the Sanitary Commission collected enormous amounts of statistical data, and opened up the problems of storing information for fast access and mechanically searching for data patterns. The pioneer was John Shaw Billings (1838–1913). A senior surgeon in the war, Billings built the Library of the Surgeon General's Office (now the National Library of Medicine), the centerpiece of modern medical information systems.[172] Billings figured out how to mechanically analyze medical and demographic data by turning facts into numbers and punching the numbers onto cardboard cards that could be sorted and counted by machine. The applications were developed by his assistant Herman Hollerith; Hollerith invented the punch card and counter-sorter system that dominated statistical data manipulation until the 1970s. Hollerith's company became International Business Machines (IBM) in 1911.

 

Drug testing protocols have been refined into three phases: safety, efficacy, and large-scale therapeutic trials, overseen in the US by the FDA. The therapeutic phase is double blind, meaning that neither the patient nor the therapist knows who has the drug and who has the placebo.
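
A minimal sketch of the double-blind idea in Python: participants are randomized 1:1 to drug or placebo, but patients and clinicians see only anonymous kit codes. The participant IDs, kit-code format and seed are illustrative assumptions, not any real trial protocol.

import random

def blind_assign(participant_ids, seed=2024):
    """Randomize participants 1:1 to drug or placebo behind anonymous kit codes.

    Returns:
      blinded        - participant ID -> kit code (all that patients and clinicians see)
      unblinding_key - kit code -> actual arm (held back until the trial ends)
    """
    rng = random.Random(seed)
    n = len(participant_ids)
    arms = (["drug", "placebo"] * ((n + 1) // 2))[:n]   # balanced arm list
    rng.shuffle(arms)                                   # random but 1:1 allocation
    blinded, unblinding_key = {}, {}
    for i, (pid, arm) in enumerate(zip(participant_ids, arms)):
        kit_code = f"KIT-{i:04d}"        # label on the pill bottle; reveals nothing about the arm
        blinded[pid] = kit_code
        unblinding_key[kit_code] = arm
    return blinded, unblinding_key

if __name__ == "__main__":
    blinded, key = blind_assign(["P001", "P002", "P003", "P004"])
    print(blinded)   # no arm information visible to patient or clinician
    # The unblinding key is opened only after all outcomes have been recorded.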

 

Medical Microbiology

In 1860, the Frenchman Casimir Davaine identified the pathogen of the deadly disease anthrax. British surgeon Joseph Lister took these findings seriously and introduced antisepsis to wound treatment in 1865. In 1881, when a cholera epidemic broke out in Alexandria, Egypt, both Pasteur and Koch sent teams to investigate. Koch's group returned in 1883, having successfully identified the cholera pathogen.

In the 20th century, ongoing research concentrated on the nature of infectious diseases and their means of transmission. Increasing numbers of pathogenic organisms were discovered and classified. Some, such as the rickettsias, which cause diseases like typhus, are smaller than bacteria; some are larger, such as the protozoans that engender malaria and other tropical diseases. The smallest to be identified were the viruses, producers of many diseases, among them mumps, measles, German measles, and polio. In 1910 Peyton Rous showed that a virus could also cause a malignant tumour, a sarcoma in chickens.

Anti-bacterials

In 1910, Paul Ehrlich and his colleague Sahachiro Hata conducted tests on arsphenamine, once sold under the commercial name Salvarsan. Their success inaugurated the chemotherapeutic era, which was to revolutionize the treatment and control of infectious diseases. Salvarsan, a synthetic preparation containing arsenic, is lethal to the microorganism responsible for syphilis. Until the introduction of the antibiotic penicillin, Salvarsan or one of its modifications remained the standard treatment of syphilis and went far toward bringing this social and medical scourge under control.

Sulfonamide drugs. In 1936 the English physician Leonard Colebrook and colleagues provided overwhelming evidence of the efficacy of both Prontosil and sulfanilamide in streptococcal septicemia (bloodstream infection), thereby ushering in the sulfonamide era.

In 1928, Alexander Fleming noticed the inhibitory action of a stray mold on a plate culture of staphylococcus bacteria in his laboratory at St. Mary’s Hospital, London. The mold was a strain of Penicillium, P. notatum. Ten years later Howard Florey, Ernst Chain, and their colleagues at Oxford University isolated penicillin in a form that was fairly pure (by standards then current) and demonstrated its potency and relative lack of toxicity. By then World War II had begun, and techniques to facilitate commercial production were developed in the United States. By 1944 adequate amounts were available to meet the extraordinary needs of wartime.

Penicillin  was not active against Mycobacterium tuberculosis. However, in 1944 Selman Waksman, Albert Schatz, and Elizabeth Bugie announced the discovery of streptomycin from cultures of a soil organism. Streptomycin, however, suffers from the great disadvantage that the tubercle bacillus tends to become resistant to it. Fortunately, other drugs became available to supplement it, the two most important being para-aminosalicylic acid (PAS) and isoniazid.  

Antibiotic resistance is seen as a growing problem. 

Anti-virals

Dramatic though they undoubtedly were, the advances in chemotherapy still left one important area vulnerable, that of the viruses. It was in bringing viruses under control that advances in immunology (the study of immunity) played such a striking part. One of the paradoxes of medicine is that the first large-scale immunization against a viral disease was instituted and established long before viruses were discovered. The smallpox vaccine was the first vaccine to be developed against a contagious disease. In the 1500s the Chinese reported inoculation using scabs of infected individuals. In 1796, British physician Edward Jenner demonstrated that an infection with the relatively mild cowpox virus conferred immunity against the deadly smallpox virus. Cowpox served as a natural vaccine until the modern smallpox vaccine emerged in the 20th century. When Jenner introduced vaccination against the virus that causes smallpox, the identification of viruses was still 100 years in the future. It took almost another half century to discover an effective method of producing antiviral vaccines that were both safe and effective.

After 1883 Pasteur switched research direction, and introduced his third vaccine—rabies vaccine—the first vaccine for humans since Jenner's for smallpox.

In 1897 English bacteriologist Almroth Wright introduced a vaccine prepared from killed typhoid bacilli as a preventive of typhoid. Preliminary trials in the Indian army produced excellent results, and typhoid vaccination was adopted for the use of British troops serving in the South African War. Unfortunately, due to incomplete test protocols  it was often difficult to decide to what extent the decline in typhoid was attributable to improved sanitary conditions and to what extent it was due to greater use of the vaccine.

In the meantime, however, the process by which the body reacts against infectious organisms to generate immunity became better understood. In Paris, Élie Metchnikoff had already detected the role of white blood cells in the immune reaction, and Jules Bordet had identified antibodies in the blood serum. The mechanisms of antibody activity were used to devise diagnostic tests for a number of diseases. In 1906 August von Wassermann gave his name to the blood test for syphilis, and in 1908 Charles Mantoux developed a skin test for tuberculosis. At the same time, methods of producing effective substances for inoculation were improved, and immunization against bacterial diseases made rapid progress.

The first candidate polio vaccine, based on one serotype of a live but attenuated (weakened) virus, was developed by the virologist Hilary Koprowski. The prototype vaccine was given to an eight-year-old boy on 27 February 1950, leading to large-scale trials in the then Belgian Congo and the vaccination of seven million children in Poland between 1958 and 1960.[56]

The second polio vaccine was developed in 1952 by Jonas Salk at the University of Pittsburgh. The Salk vaccine, or inactivated poliovirus vaccine (IPV), is based on poliovirus grown in a type of monkey kidney tissue culture (the Vero cell line), which is chemically inactivated with formalin.[23] After two doses of inactivated poliovirus vaccine (given by injection), 90 percent or more of individuals develop protective antibody to all three serotypes of poliovirus, and at least 99 percent are immune to poliovirus following three doses.[1]

Subsequently, Albert Sabin developed another live, oral polio vaccine (OPV). It was produced by the repeated passage of the virus through nonhuman cells at sub-physiological temperatures. The attenuated poliovirus in the Sabin vaccine replicates very efficiently in the gut, the primary site of wild poliovirus infection and replication, but the vaccine strain is unable to replicate efficiently within nervous system tissue. Three doses of live-attenuated oral vaccine produce protective antibody to all three poliovirus types in more than 95 percent of recipients. In 1958, it was selected, in competition, by the US National Institutes of Health.[56] Licensed in 1962, it rapidly became the only polio vaccine used worldwide.

In 1981, the first cases of HIV/AIDS were detected; spread mainly through sexual contact, the virus had a disastrous impact.

HIV infects vital cells in the human immune system, such as helper T cells, macrophages, and dendritic cells.[11] Upon entry into the target cell, the viral RNA genome is converted (reverse transcribed) into double-stranded DNA by a virally encoded enzyme, reverse transcriptase, that is transported along with the viral genome in the virus particle. When T cell numbers decline below a critical level, cell-mediated immunity is lost, and the body becomes progressively more susceptible to opportunistic infections, leading to the development of AIDS.

By 2008, antiviral cocktails, not vaccines, had made HIV a chronic rather than a deadly disease. Most of the antiviral drugs now available are designed to help deal with HIV, herpes viruses, the hepatitis B and C viruses, and influenza A and B viruses.[6] One way of doing this is to develop nucleoside analogues that look like the building blocks of RNA or DNA, but deactivate the enzymes that synthesize the RNA or DNA once the analogue is incorporated. This approach is more commonly associated with the inhibition of reverse transcriptase (RNA to DNA) than with "normal" transcriptase (DNA to RNA), and so works against HIV.

The first antiviral drug to be approved for treating HIV, zidovudine (AZT), is  a nucleoside analogue.

Over time understanding of the molecular biology of viruses improved. The virus particle has spikes on its surface that are responsible for breaking into healthy cells. Our immune system is triggered by these spike proteins. The idea is that a vaccine that produced the spike protein would activate an immune reaction without using the dangerous virus.  

COVID-19 emerged in late 2019 as a dangerous coronavirus. It quickly became clear that the virus was highly infectious (each person infects roughly 6 others, compared with roughly 1.5 for flu). COVID-19 infection is symptom free for many, yet particularly dangerous for the elderly and those with pre-existing lung problems. Isolation has been the historical management strategy: for polio, close the pools; for AIDS, use condoms; for smallpox, avoid close contact once the rash appears; for water-borne disease, use clean bottled water. Airborne diseases are much harder to contain because transmission needs no physical contact, and many COVID-19 carriers are asymptomatic but infectious. Even with lockdowns and mandatory masking, hospitals came to the edge of being overwhelmed several times. A vaccine using an mRNA string that produces the "spike" protein on the surface of the viral particle was developed quickly, tested at over 90% effectiveness and deployed inside a year. It took two years for the world to start back to normal. The most disappointing side effect was that the vaccine became a political football, with parts of the right fighting vaccination even though this cost the lives of their own followers. The US death count exceeded 1 million by the end.
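
A back-of-the-envelope illustration of why that difference in infectiousness matters: compounding the figures quoted above (about 6 new infections per case for COVID-19 versus about 1.5 for flu) over a few generations of spread, ignoring immunity, overlap and behaviour change, so purely a toy comparison.

def cases_after(generations, r0):
    """New cases in the nth generation of spread, starting from a single case."""
    return r0 ** generations

for g in range(1, 6):
    print(f"generation {g}: covid ~{cases_after(g, 6):>6.0f} new cases, "
          f"flu ~{cases_after(g, 1.5):>5.1f}")
# After 5 generations: 6**5 = 7776 vs 1.5**5 of roughly 7.6, which is why an
# airborne, highly infectious virus can overwhelm hospitals so much faster.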

The US had 40% higher excess mortality than the major European countries, suggesting that it was much less effective in managing the infection.

Cancer

The first early-detection test for a cancer was the Pap smear test for cervical cancer, independently invented in the 1920s by the Greek physician Georgios Papanikolaou and named after him. A simplified version of the test was introduced by the Canadian obstetrician Anna Marion Hilliard in 1957.

Breast cancer

Our modern approach to breast cancer treatment and research started forming in the 19th century. Consider these milestones:

  • 1882: William Halsted performed the first radical mastectomy. This surgery remained the standard operation to treat breast cancer well into the 20th century.

  • 1895: The first X-ray is taken. Eventually, low-dose X-rays called mammograms will be used to detect breast cancer.

  • 1898: Marie and Pierre Curie discover the radioactive elements radium and polonium. Shortly after, radium is used in cancer treatment.

  • 1932: A new approach to mastectomy is developed. The surgical procedure is not as disfiguring, and becomes the new standard.

  • 1937: Radiation therapy is used in addition to surgery to spare the breast. After removing the tumor, needles with radium are placed in the breast and near lymph nodes.

  • 1978: Tamoxifen (Nolvadex, Soltamox) is approved by the Food and Drug Administration (FDA) for use in breast cancer treatment. This antiestrogen drug is the first in a new class of drugs called selective estrogen receptor modulators (SERMs).

  • 1984: Researchers discover a new gene in rats. The human version, HER2, is found to be linked with more aggressive breast cancer when overexpressed. Called HER2-positive breast cancer, it isn’t as responsive to treatments.

  • 1985: Researchers discover that women with early-stage breast cancer who were treated with a lumpectomy and radiation have similar survival rates to women treated with only a mastectomy.

  • 1986: Scientists figure out how to clone the HER2 gene.

  • 1995: Scientists can clone the tumor suppressor genes BRCA1 and BRCA2. Inherited mutations in these genes can predict an increased risk of breast cancer.

  • 1996: FDA approves anastrozole (Arimidex) as a treatment for breast cancer. This drug blocks the production of estrogen.

  • 1998: Tamoxifen is found to decrease the risk of developing breast cancer in at-risk women by 50 percent. It’s now approved by the FDA for use as a preventive therapy.

  • 1998: Trastuzumab (Herceptin), a drug targeting cancer cells that are over-producing HER2, is also approved by the FDA.

  • 2006: The SERM drug raloxifene (Evista) is found to reduce breast cancer risk for postmenopausal women who have higher risk. It has a lower chance of serious side effects than tamoxifen.

  • 2011: A large meta-analysis finds that radiation therapy significantly reduces the risk of breast cancer recurrence and mortality.

  • 2013: The four major subtypes of breast cancer are defined as HR+/HER2- (“luminal A”), HR-/HER2- (“triple negative”), HR+/HER2+ (“luminal B”), and HR-/HER2+ (“HER2-enriched”).

  • 2017: The first biosimilar drug, Ogivri (trastuzumab-dkst), is approved by the FDA for breast cancer treatment. Unlike generics, biosimilars are copies of biologic drugs and cost less than branded drugs.

  • 2018: A clinical trial suggests that chemotherapy after surgery doesn’t benefit 70 percent of women with early-stage breast cancer.

  • 2019: Enhertu is approved by the FDA, and this drug proves to be very effective in treating HER2-positive breast cancer that’s metastasized or can’t be removed with surgery.

  • 2020: The drug Trodelvy is approved by the FDA for treating metastatic triple-negative breast cancer for people who haven’t responded to at least two other treatments.

Imaging

Röntgen discovered X-rays in 1895, and they could be used to see through bodies. The X-ray became a universal diagnostic tool for bones and cartilage. The early experimenters learned the hard way about the dangers of X-ray exposure.

 

English-born physicist John Wild (1914–2009) first used ultrasound to assess the thickness of bowel tissue as early as 1949; he has been described as the "father of medical ultrasound".[123] Subsequent advances took place concurrently in several countries, but it was not until 1961 that David Robinson and George Kossoff's work at the Australian Department of Health resulted in the first commercially practical water-bath ultrasonic scanner.[124] In 1963 Meyerdirk & Wright launched production of the first commercial, hand-held, articulated-arm, compound-contact B-mode scanner, which made ultrasound generally available for medical use.

 

CT scanners use a rotating X-ray tube and a row of detectors placed in a gantry to measure X-ray attenuations by different tissues inside the body. The multiple X-ray measurements taken from different angles are then processed on a computer using tomographic reconstruction algorithms to produce tomographic (cross-sectional) images (virtual "slices") of a body. In October 1963, William H. Oldendorf received a U.S. patent for a "radiant energy apparatus for investigating selected areas of interior objects obscured by dense material".[191] The first commercially viable CT scanner was invented by Godfrey Hounsfield in 1972. CT can deliver significant X-ray exposure, so it should be used with care.
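
As a rough sketch of the reconstruction idea, not of any scanner's production algorithm, the Python snippet below simulates parallel-beam projections of a toy phantom and recovers a blurry slice by unfiltered back projection; real scanners use filtered back projection or iterative methods, and the phantom, angle grid and use of scipy's rotate are assumptions made for the example.

import numpy as np
from scipy.ndimage import rotate

def simulate_projections(phantom, angles_deg):
    """X-ray attenuation summed along parallel beams at each gantry angle."""
    return [rotate(phantom, a, reshape=False, order=1).sum(axis=0) for a in angles_deg]

def back_project(projections, angles_deg, size):
    """Smear each 1-D projection back across the image plane and average."""
    recon = np.zeros((size, size))
    for proj, a in zip(projections, angles_deg):
        smear = np.tile(proj, (size, 1))              # spread the profile over the slice
        recon += rotate(smear, -a, reshape=False, order=1)
    return recon / len(angles_deg)

if __name__ == "__main__":
    size = 64
    phantom = np.zeros((size, size))
    phantom[20:40, 25:45] = 1.0                       # a dense block inside the "body"
    angles = np.linspace(0.0, 180.0, 60, endpoint=False)
    sinogram = simulate_projections(phantom, angles)
    recon = back_project(sinogram, angles, size)
    print("brightest region recovered near:", np.unravel_index(recon.argmax(), recon.shape))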

Contrast agents allow contrast CT to visualize the arteries and veins throughout the body.[37] This ranges from arteries serving the brain to those bringing blood to the lungs, kidneys, arms and legs. An example of this type of exam is CT pulmonary angiogram (CTPA) used to diagnose pulmonary embolism (PE). It employs computed tomography and an iodine-based contrast agent to obtain an image of the pulmonary arteries. Contrast agents can trigger allergic (anaphylactic) reactions.

In MRI, certain atomic nuclei are able to absorb radio-frequency energy when placed in an external magnetic field; the resultant evolving spin polarization can induce an RF signal in a radio-frequency coil and thereby be detected. In clinical and research MRI, hydrogen atoms are most often used. Most MRI scans essentially map the location of water and fat in the body. Pulses of radio waves excite the nuclear spin energy transition, and magnetic field gradients localize the polarization in space. By varying the parameters of the pulse sequence, different contrasts may be generated between tissues based on the relaxation properties of the hydrogen atoms therein. Specific features such as blood flow can be visualized using a paramagnetic contrast agent (gadolinium). MRI is much safer than CT, but cannot be used if magnetic metals are in the body.

In 1973 at Stony Brook University, Paul Lauterbur published the first MRI images in the journal Nature,[141] followed by the picture of a living animal, a clam, and in 1974 by the image of the thoracic cavity of a mouse. Advances in semiconductor technology were crucial to the development of practical MRI, which requires a large amount of computational power; this was made possible by the rapidly increasing number of transistors on a single integrated circuit chip.[144] In 1980, John Mallard's team at the University of Aberdeen built a full-body MRI scanner and produced the first clinically useful image of a patient's internal tissues, which identified a primary tumour in the patient's chest, an abnormal liver, and secondary cancer in his bones. Mansfield and Lauterbur were awarded the 2003 Nobel Prize in Physiology or Medicine for their "discoveries concerning magnetic resonance imaging".

DNA 

Crick and Watson discovered the molecular structure of DNA and identified the replication process. The foundation for sequencing proteins was first laid by the work of Frederick Sanger, who by 1955 had completed the sequence of all the amino acids in insulin, a small protein secreted by the pancreas. This provided the first conclusive evidence that proteins were chemical entities with a specific molecular pattern. Crick began developing a theory which argued that the arrangement of nucleotides in DNA determined the sequence of amino acids in proteins, which in turn helped determine the function of a protein. He published this theory in 1958.

The Sanger process: take a DNA strand that starts with a "primer" and make copies using a mixture of normal bases plus labeled sequence stoppers (chain-terminating bases). The stopper randomly terminates a new copy at every point where a specific base occurs, producing a pool of fragments of different lengths. Measuring the lengths of the fragments and reading the label on each terminating base reconstructs the DNA sequence; the fragment length gives the position of that base in the sequence. The method works for up to about 900 base pairs.
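
A toy Python sketch of that read-out step, under the simplifying assumption that every fragment's length and terminating label can be measured exactly; the template sequence, number of copies and random seed are invented for the example.

import random

def sanger_fragments(template, rng=random.Random(0)):
    """Simulate chain-terminated copies: each copy stops at a random position,
    and the terminating (labeled) base identifies the base at that position."""
    fragments = []
    for _ in range(10 * len(template)):                 # many copies of the template
        stop = rng.randrange(1, len(template) + 1)
        fragments.append((stop, template[stop - 1]))    # (fragment length, terminal label)
    return fragments

def read_sequence(fragments):
    """Electrophoresis analogue: order fragments by length, read off the labels."""
    by_length = {}
    for length, label in fragments:
        by_length[length] = label
    return "".join(by_length[i] for i in sorted(by_length))

if __name__ == "__main__":
    template = "GATTACAGGCT"
    frags = sanger_fragments(template)
    print(read_sequence(frags))   # with enough copies, reconstructs "GATTACAGGCT"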

The "Next Generation Sequencing" uses a library of tagged sequences as the probe rather than a single base. Multiple tagged sequences can be detected in a single cycle. The overlapping probes are then assembled into a complete genome. 

Metabolic therapy

There are numerous biochemical processes, managed by different organs, that regulate our bodies. Mis-regulation is a root cause of many serious conditions; for example, high blood pressure is a trigger for strokes, and mis-regulation of sugar in diabetes causes major long-term damage. Drug treatments have been developed to manage many mis-regulation problems.
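
As a toy illustration of regulation as negative feedback (the numbers, the linear response and the single "meal" disturbance are all invented for the example, not a physiological model): a controller such as insulin release pushes a level such as blood sugar back toward a set point, and a weakened response leaves the level elevated for longer.

def simulate(level, set_point=100.0, gain=0.3, intake=20.0, steps=12):
    """One disturbance raises the level; feedback with the given gain pulls it back."""
    history = [level]
    level += intake                                 # disturbance: sugar from a meal
    for _ in range(steps):
        correction = gain * (level - set_point)     # stronger response when further off
        level -= correction
        history.append(round(level, 1))
    return history

if __name__ == "__main__":
    print("healthy regulation:", simulate(100.0, gain=0.3))
    print("weak regulation:   ", simulate(100.0, gain=0.05))   # level stays elevated longer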

Cancer treatment 

As infections largely disappeared as a cause of death, people lived long enough for cancers to become the second leading cause of death. Fighting cancer depends on early detection, when the tumour is small and before it spreads, which allows a combination of surgery, radiation and chemotherapy to be effective. Recently, DNA-specific chemotherapy has further improved success rates. Radiation and chemotherapy are very challenging for the patient because cancer cells are very similar to healthy cells, just with uncontrolled growth; the treatment is typically designed to poison the cancer without killing the patient. There has been notable progress on lung, colon, prostate and breast cancers. As of 2020, liver and pancreatic cancers were still becoming more common.

In 2016, scientists at the Washington University School of Medicine in St. Louis, MO, found that jumping genes are widespread in cancer and promote tumor growth by forcing cancer genes to remain switched on. They analyzed 7,769 tumor samples from 15 different types of cancer and found 129 jumping genes that can drive tumor growth through their influence on 106 different cancer genes. The jumping genes were functioning as “stealthy on-switches” in 3,864 of the tumors that the team analyzed. These tumors came from breast, colon, lung, skin, prostate, brain, and other types of cancer.

“If you,” says Ting Wang, who is a professor of medicine in the Department of Genetics, “perform typical genome sequencing, looking for genetic mutations driving cancer, you’re not going to find jumping genes.”

Surgery 

The oldest known surgical amputation was carried out in Borneo about 31,000 years ago.[10] The operation involved the removal of the distal third of the left lower leg. The person survived the operation and lived for another 6 to 9 years.  The next oldest known amputation was carried out about 7000 years ago on a farmer in France whose left forearm had been surgically removed.

In the 13th century in Europe skilled town craftsmen called barber-surgeons performed amputations and set broken bones while suffering lower status than university educated doctors. 

The discipline of surgery was put on a sound, scientific footing during the Age of Enlightenment in Europe (1715–89). An important figure in this regard was the Scottish surgical scientist (working in London) John Hunter (1728–1793), generally regarded as the father of modern scientific surgery. Hunter developed new methods for repairing damage to the Achilles tendon and a more effective method for applying ligature of the arteries in the case of an aneurysm.[51] He was also one of the first to understand the importance of pathology, the danger of the spread of infection, and how the problem of inflammation of the wound, bone lesions and even tuberculosis often undid any benefit gained from the intervention. He consequently adopted the position that all surgical procedures should be used only as a last resort.[52] Hunter's student Benjamin Bell (1749–1806) became the first scientific surgeon in Scotland, advocating the routine use of opium in post-operative recovery and counseling surgeons to "save skin" to speed healing; his great-grandson Joseph Bell (1837–1911) became the inspiration for Arthur Conan Doyle's literary hero Sherlock Holmes.

The European way of modern pain control through anesthesia was discovered in the mid-19th century; however, such practices were already common in the 16th-century Bunyoro-Kitara kingdom of modern-day Uganda, where Bunyoro physicians used banana wine as an anesthetic agent. Before the advent of anesthesia in Europe, surgery was a traumatically painful procedure and surgeons were encouraged to be as swift as possible to minimize patient suffering. Wartime amputation was famous for its speed.

Beginning in the 1840s, European surgery began to change dramatically in character with the discovery of effective and practical anesthetic chemicals such as ether, first used by the American surgeon Crawford Long (1815–1878), and chloroform, discovered by James Young Simpson (1811–1870) and later pioneered in England by John Snow (1813–1858), physician to Queen Victoria, who in 1853 administered chloroform to her during childbirth. In addition, the discovery of muscle relaxants such as curare allowed for safer applications.

Progress in anesthesia and technique has led to many complex procedures becoming routine. Some milestones:

1952: The first successful open heart surgery using hypothermia.

1967: The first successful heart transplant, by Christiaan Barnard.

1967: The first successful coronary artery bypass surgery.

By 2020, heart bypass, heart valve repair, joint replacement and ligament repair, to name a few, were routine. Minimally invasive arthroscopic surgery has radically reduced recovery times.

Psychiatry

Founded in the 13th century, Bethlem Royal Hospital in London was one of the oldest lunatic asylums.[12]

 

In the late 17th century, privately run asylums for the insane began to proliferate and expand in size. Already in 1632 it was recorded that Bethlem Royal Hospital, London, had "21 rooms wherein the poor distracted people lie, and above the stairs eight rooms more for servants and the poor to lie in".[14] Inmates who were deemed dangerous or disturbing were chained, but Bethlem was an otherwise open building for its inhabitants to roam around its confines and possibly throughout the general neighborhood in which the hospital was situated.[15] In 1676, Bethlem expanded into newly built premises at Moorfields with a capacity for 100 inmates.

In 1621, Oxford University mathematician, astrologer, and scholar Robert Burton published one of the earliest treatises on mental illness, The Anatomy of Melancholy, What it is: With all the Kinds, Causes, Symptomes, Prognostickes, and Several Cures of it. Burton thought that there was "no greater cause of melancholy than idleness, no better cure than business." Unlike English philosopher of science Francis Bacon, Burton argued that knowledge of the mind, not natural science, is humankind's greatest need.

In 1656, Louis XIV of France created a public system of hospitals for those with mental disorders, but as in England, no real treatment was applied.[19]

 

In 1713 the Bethel Hospital Norwich was opened, the first purpose-built asylum in England, founded by Mary Chapman.

In the 1890s, Sigmund Freud proposed psychoanalysis as a set of theories and therapeutic techniques that deal in part with the unconscious mind, and which together form a method of treatment for mental disorders. These theories have since been largely discarded. Carl Jung started as a follower of Freud but parted ways, founding analytical psychology based on individuation: the lifelong psychological process of differentiation of the self out of each individual's conscious and unconscious elements. Jung considered it to be the main task of human development. He created some of the best known psychological concepts, including synchronicity, archetypal phenomena, the collective unconscious, the psychological complex, and extraversion and introversion.

Psychoanalysis continues to be practiced by psychiatrists, social workers, and other mental health professionals; however, its practice has declined. It was largely replaced in the mid-20th century by the similar but broader psychodynamic psychotherapy.[82] Psychoanalytic approaches continue to be listed by the UK National Health Service as possibly helpful for depression.

Psychodynamic psychotherapy or psychoanalytic psychotherapy[1][2] is a form of psychological therapy. Its primary focus is to reveal the unconscious content of a client's psyche in an effort to alleviate psychic tension, which is inner conflict within the mind that was created in a situation of extreme stress or emotional hardship, often in the state of distress.

Abortion 

  • Fetal viability typically occurs between 24 and 28 weeks of gestation, although it can happen as early as 22 weeks in some cases.

  • Less than 1% of abortions happen after 24 weeks; the majority occur before 13 weeks.

Prior to the Dobbs decision, viability was the delineating factor in the abortion debate established by Roe v Wade and subsequent Supreme Court decisions.

By a 7 to 2 vote in 1973, the Supreme Court established a constitutional right to abortion, striking down laws in many states that had barred the procedure. The court said states could not ban abortions before fetal viability, the point at which the fetus can survive outside the womb. That was around 28 weeks at the time and, because of improvements in medical technology, is around 23 weeks now. However, viability has never been properly defined by the courts and in reality, depends on the individual pregnancy and on various factors, including gestational age, fetal weight and sex, and medical interventions available. While viability does not refer to a specific gestational age, it is often presumed at 24 weeks gestation, with “periviability” referring to the time around viability (20 to 26 weeks gestation). 

Fetal Anomalies: Individuals also seek abortions later in pregnancy due to medical reasons. With medical advances, many genetic fetal anomalies can be detected early in pregnancy; for example, chorionic villus sampling can diagnose Down Syndrome or cystic fibrosis as early as 10 weeks gestation. Structural fetal anomalies, however, are often detected much later in pregnancy. As part of routine care, a fetal anatomy scan is performed around 20 weeks, which entails ultrasound imaging of all the developing organs. Many structural anomalies are discovered at this time that would not have been apparent previously. A proportion of these are lethal fetal anomalies, meaning that the fetus will almost certainly die before or shortly after birth, so the fetus may be nonviable (consensus does not always exist as to which anomalies are fatal, and thus nonviable). In these cases, many individuals wish to terminate their pregnancies, rather than risk carrying the pregnancy until the fetus or newborn passes away. Very often these pregnancies are desired, making this decision exceedingly difficult for parents. Inadequate data exist to know how many abortions later in pregnancy occur due to fetal anomalies, but a study by Washington University Hospital showed almost all women whose fetuses had lethal fetal anomalies chose to terminate their pregnancies.

Health Risk to the Pregnant Person: Life threatening conditions may also develop later in pregnancy. These include conditions like early severe preeclampsia, newly diagnosed cancer requiring prompt treatment, and intrauterine infection (chorioamnionitis) often in conjunction with premature rupture of the amniotic sac (PPROM). If these conditions occur in a state where abortion is legal, the pregnant individual may pursue termination of pregnancy to preserve their own health. All states that ban abortion, have gestational bans, or limit abortion at or near viability, have exceptions allowing for abortions to occur when the life of the pregnant person is in danger, and 37 states have exceptions for when the health of the pregnant person is at risk (Figure 5). Former President Donald Trump reportedly supports a 16-week national abortion ban with exceptions for when the life of the pregnant person is in danger and in cases for rape or incest, but the 16-week national ban would not have an exception for when the health of the pregnant person is at risk. It is likely that a ban such as this would be structured so that it would limit abortions in states that currently permit abortions later in pregnancy, but allow states with abortion bans or gestational restrictions to keep their laws in effect.

The legal standards states use to determine when a pregnant person qualifies for a life or health exception can be ambiguous, with some standards leaving physicians in a legally vulnerable position that allows a prosecutor to bring an expert witness to contradict the physician’s medical judgment. In a recent case out of Texas, Kate Cox, a pregnant woman seeking an abortion, sought a court order that would have allowed her to have an abortion under the exceptions to the Texas abortion ban. Fearing prosecution for providing abortion care that she believed fit under the abortion ban’s exception based upon her good faith medical judgement, Ms. Cox’s physician asked a Texas District Court to determine that providing the abortion was not a violation of the state’s ban. While the District Court agreed with the plaintiffs that the case qualified for an exception, the Texas state Attorney General wrote a letter to the hospital stating that his office would still enforce the state abortion ban if abortion care was provided. Consequently, the Texas Supreme Court overruled the lower district court, finding that the physician’s “good faith belief” was insufficient to qualify for the exception, and only abortions that are certified to be necessary under the “reasonable medical judgement” standard are allowable under Texas law.

https://www.kff.org/womens-health-policy/press-release/what-the-data-show-abortions-later-in-pregnancy/

 

[Figure: cancer mortality rates, 1930–2003]

 Anxiety disorder

Anxiety or fear that interferes with normal functioning may be classified as an anxiety disorder.[37] Commonly recognized categories include specific phobias, generalized anxiety disorder, social anxiety disorder, panic disorder, agoraphobia, obsessive–compulsive disorder and post-traumatic stress disorder.

https://annals-general-psychiatry.biomedcentral.com/articles/10.1186/1475-2832-2-2#Abs1

Mood disorder

Other affective (emotion/mood) processes can also become disordered. Mood disorder involving unusually intense and sustained sadness, melancholia, or despair is known as major depression (also known as unipolar or clinical depression).  Bipolar disorder (also known as manic depression) involves abnormally "high" or pressured mood states, known as mania or hypomania, alternating with normal or depressed moods.

  

Psychotic disorder

Patterns of belief, language use and perception of reality can become dysregulated (e.g., delusions, thought disorder, hallucinations). Psychotic disorders in this domain include schizophrenia.

Personality

Personality refers to the fundamental characteristics of a person that influence thoughts and behaviors across situations and time; it may be considered disordered if judged to be abnormally rigid and maladaptive. A number of different personality disorders are listed, including those sometimes classed as eccentric, such as paranoid, schizoid and schizotypal personality disorders; types that have been described as dramatic or emotional, such as antisocial, borderline, histrionic or narcissistic personality disorders; and those sometimes classed as fear-related, such as anxious-avoidant, dependent, or obsessive–compulsive personality disorders. If an inability to sufficiently adjust to life circumstances begins within three months of a particular event or situation, and ends within six months after the stressor stops or is eliminated, it may instead be classed as an adjustment disorder.

Eating disorder

Eating disorders involve disproportionate concern in matters of food and weight.[37] Categories of disorder in this area include anorexia nervosa, bulimia nervosa, exercise bulimia and binge eating disorder.[44][45]

Sleep disorder

Sleep disorders are associated with disruption to normal sleep patterns. A common sleep disorder is insomnia.

Sexual disorders

These include dyspareunia and various kinds of paraphilia (sexual arousal to objects, situations, or individuals that are considered abnormal or harmful to the person or others).

Impulse control disorder:

People who are abnormally unable to resist certain urges or impulses that could be harmful to themselves or others may be classified as having an impulse control disorder; such disorders include kleptomania (stealing) and pyromania (fire-setting). Various behavioral addictions, such as gambling addiction, may be classed as a disorder. Obsessive–compulsive disorder can sometimes involve an inability to resist certain acts but is classed separately as being primarily an anxiety disorder.

Substance use disorder:

This disorder refers to the use of drugs (legal or illegal, including alcohol) that persists despite significant problems or harm related to its use. Substance dependence and substance abuse fall under this umbrella category in the DSM. Substance use disorder may be due to a pattern of compulsive and repetitive use of a drug that results in tolerance to its effects and withdrawal symptoms when use is reduced or stopped.

Dissociative disorder:

People with severe disturbances of their self-identity, memory, and general awareness of themselves and their surroundings may be classified as having a dissociative disorder, such as dissociative identity disorder (which was previously referred to as multiple personality disorder or "split personality").

Cognitive disorder:

These affect cognitive abilities, including learning and memory. This category includes delirium and mild and major neurocognitive disorder (previously termed dementia).

Developmental disorder:

These disorders initially occur in childhood. Some examples include autism spectrum disorder, oppositional defiant disorder and conduct disorder, and attention deficit hyperactivity disorder (ADHD), which may continue into adulthood. Conduct disorder, if continuing into adulthood, may be diagnosed as antisocial personality disorder (dissocial personality disorder in the ICD).

Dissocial personality disorder

People with dissocial personality disorder exhibit traits of impulsivity, high negative emotionality, low conscientiousness and associated behaviors, including irresponsible and exploitative behavior, recklessness and deceitfulness. Popular labels such as psychopath (or sociopath) do not appear in the DSM or ICD but are linked by some to these diagnoses. Psychopathy, sometimes considered synonymous with sociopathy, is characterized by persistent antisocial behavior, impaired empathy and remorse, and bold, disinhibited, and egotistical traits.[1][2][3] Different conceptions of psychopathy have been used throughout history that are only partly overlapping and may sometimes be contradictory. Although no psychiatric or psychological organization has sanctioned a diagnosis titled "psychopathy", assessments of psychopathic characteristics are widely used in criminal justice settings in some nations and may have important consequences for individuals. The study of psychopathy is an active field of research. The term is also used by the general public, popular press, and in fictional portrayals.[11][12] While the term is often employed in common usage along with "crazy", "insane", and "mentally ill", there is a categorical difference between psychosis and psychopathy.

Factitious disorders 

are diagnosed where symptoms are thought to be reported for personal gain. Symptoms are often deliberately produced or feigned, and may relate to either symptoms in the individual or in someone close to them, particularly people they care for.

Relational disorder,

where the diagnosis is of a relationship rather than on any one individual in that relationship. The relationship may be between children and their parents, between couples, or others. There already exists, under the category of psychosis, a diagnosis of shared psychotic disorder where two or more individuals share a particular delusion because of their close relationship with each other.

 

There are a number of uncommon psychiatric syndromes, which are often named after the person who first described them, such as Capgras syndrome, De Clerambault syndrome, Othello syndrome, Ganser syndrome, Cotard delusion, and Ekbom syndrome, and additional disorders such as Couvade syndrome and Geschwind syndrome.

Treatment 

There is also a wide range of psychotherapists (including family therapists), counselors, and public health professionals. In addition, there are peer support roles where personal experience of similar issues is the primary source of expertise.

A major option for many mental disorders is psychotherapy. There are several main types. Cognitive behavioral therapy (CBT) is widely used and is based on modifying the patterns of thought and behavior associated with a particular disorder. Other psychotherapies include dialectic behavioral therapy (DBT) and interpersonal psychotherapy (IPT). Psychoanalysis, addressing underlying psychic conflicts and defenses, has been a dominant school of psychotherapy and is still in use. Systemic therapy or family therapy is sometimes used, addressing a network of significant others as well as an individual.

Some psychotherapies are based on a humanistic approach. There are many specific therapies used for particular disorders, which may be offshoots or hybrids of the above types. Mental health professionals often employ an eclectic or integrative approach. Much may depend on the therapeutic relationship, and there may be problems with trust, confidentiality and engagement.

A major option for many mental disorders is psychiatric medication and there are several main groups. Antidepressants are used for the treatment of clinical depression, as well as often for anxiety and a range of other disorders. Anxiolytics (including sedatives) are used for anxiety disorders and related problems such as insomnia. Mood stabilizers are used primarily in bipolar disorder. Antipsychotics are used for psychotic disorders, notably for positive symptoms in schizophrenia, and also increasingly for a range of other disorders. Stimulants are commonly used, notably for ADHD.[121]

Despite the different conventional names of the drug groups, there may be considerable overlap in the disorders for which they are actually indicated, and there may also be off-label use of medications. These medications in combination with non-pharmacological methods, such as cognitive-behavioral therapy (CBT) are seen to be most effective in treating mental disorders.

 
