Injecting Poisons: A History of the Syringe

For centuries, humans have been at war with microbes, despite an incomplete understanding of the human body and of how microbes fit into the ecological web of human health.

Amidst our fascination with avoiding microbes, one medical discovery magnified our fear of viruses and bacteria in particular while simultaneously gaining uncontested permission to inject those very pathogens directly into our bodies: the syringe.

At some base human level, doesn’t it feel wrong to insert a long, sharp piece of metal into our body and inject a foreign substance into it?  

Now, imagine that the syringe and needle had been used on hundreds of people before you, and contained animal and human blood products, bacteria, viruses, and aluminum. Sounds crazy, right?

While the syringe may be a medical achievement in some contexts, there’s one big problem we can’t pretend didn’t happen: disposable syringes were not invented until 1954, not mass-produced until 1961, and not in widespread use until the mid-1960s.

So…that means from its invention in 1844 to the mid-1960s:

Every single syringe and needle was REUSED OVER AND OVER AGAIN.

Today, we know we aren’t supposed to SHARE needles and syringes at all, ever. But it once was completely standard protocol, and sadly, in many parts of the world it still is.

This largely unrecognized, century-long medical mistake changes so many things: it shatters our sugar-coated understanding of history, it calls into question our perceived victimhood in some very notorious outbreaks, and it shines a bright light on the very 20th-century mortality statistics we use as the basis to mandate our current vaccine schedule.

We must reckon with the fact that both the sick and the healthy shared the same invasive medical devices, with substandard sterilization and without gloves, easily transmitting pathogens by THE VERY MEANS that was meant as protection.

We must admit one of the deepest, darkest secrets of history: for many people, WE caused disease. Because of our Frankensteinian obsession, WE created this problem.

Syringe Reuse Is Still Happening

A 2000 report by the World Health Organization estimated that 40% of the 16 billion injections given worldwide each year are given with reused equipment, and that in developing nations 70% of injections are given with reused syringes and needles.

Globally, unsafe injections account for 30% of new hepatitis B infections, 41% of hepatitis C infections, and 9% of new HIV/AIDS infections. Unsafe injections cause 1.3 million early deaths every year. These figures are not from illicit drug use; they are all injections given in medical settings by a doctor or nurse.

In the United States, a 2017 survey published in the American Journal of Infection Control found that 12% of physicians and 2% of nurses indicated that reuse of syringes for more than one patient occurs in their workplace. According to the survey, 6% of US health care providers “sometimes or always” use a single-dose/single-use vial for more than one patient.

Since 2001, more than 150,000 patients nationwide have been victims of unsafe injection practices, and two-thirds of those risky shots were administered in just the past four years, according to data from the U.S. Centers for Disease Control and Prevention.

In 2008, a Las Vegas health clinic exposed 40,000 patients to hepatitis B, hepatitis C, and HIV by reusing syringes and vials of medication over a four-year period.

In 2012, 8,000 letters were sent to patients of a former Colorado oral surgeon who was found to have reused syringes and needles on patients receiving intravenous medication over a 12-year period.

In 2013, thousands of patients were notified and at least 60 patients tested positive for hepatitis B or HIV after an Oklahoma dentist was found to be reusing needles and using rusty equipment on his patients.

In 2019, 900 children in Pakistan tested positive for HIV in an outbreak linked to a pediatrician who reused syringes and IV drips.

A Historical Look at the Syringe


The syringe, from the Greek word syrinx, meaning “tube”, dates back to the 1st century AD. These early instruments required the skin to be cut first, and their primary function was to drain fluid, pus or foreign bodies from cavities and abscesses.

The idea of the hollow needle and syringe (which didn’t come until 1844) was inspired by the stinger of a bee.

In nature, the stingers and fangs of venomous insects and animals are the most natural biological prototype for a syringe and needle.

Biologically, delivering venom by injection into prey or a larger enemy gives the poison quick access to the central nervous system, rendering the victim paralyzed and, eventually, dead. It’s a very smart adaptation for an otherwise unassuming, small creature.

Observing insects and animals with fangs or stingers is probably what compelled early humans to make a ‘poison needle’ as a weapon, in the form of the blowpipe and poison-dipped arrows used by practically every indigenous and ancient culture for thousands of years.

Arrows or darts were dipped in plant or animal poisons (often nerve agents) and blown through a pipe to pierce the skin of enemies. The poison quickly spreads to the central nervous system, causing the aforementioned paralysis and death.

Modern-day Botox injections (botulinum neurotoxin type A) have also been shown to reach the central nervous system.

Anything can be a poison. The old saying is that “the dose makes the poison.” I will add that each recipient is unique: what is not poisonous for one person, peanuts or latex for example, may be poisonous for another.

A poison is defined as:

“a substance that is capable of causing the illness or death of a living organism when introduced or absorbed.”

Coincidentally, the word “virus” comes from the Latin vīrus, referring to “poison and other noxious liquids.” According to the Oxford Dictionary:

“Late Middle English (denoting the venom of a snake): from Latin, literally ‘slimy liquid, poison’. The earlier medical sense, superseded by the current use as a result of improved scientific understanding, was ‘a substance produced in the body as the result of disease, especially one capable of infecting others’.”

The First Modern Syringe

In 1844, Irish physician and surgeon Francis Rynd created the first hollow needle for the subcutaneous injection of morphine, to treat a female patient who had suffered for years from severe facial pain due to neuralgia, that is, damaged nerves.

A Rynd-like hypodermic syringe, circa 1860-1880. Is it just me or does the case look like a coffin?

When drinking morphine to kill the pain didn’t work, because stomach acids broke it down, Rynd tried the more direct route of injecting the morphine under her skin, closer to the damaged facial nerves.

Pain relief was achieved and this new technique was quickly and widely accepted. It was even credited as:

“The greatest boon to medicine since the discovery of chloroform.”

A few years later, in 1853, Scottish physician Alexander Wood and French surgeon Charles Gabriel Pravaz added to the design to make the first true hypodermic syringe.

Taking as his model the “sting of the bee“, Wood constructed a small glass syringe to which was attached a fine perforated needle point for the injection of morphine and preparations of opium.

At the turn of the 20th century, syringes were still artisan-made, hand-crafted items fashioned from metal and glass, which allowed the doctor to measure the liquid more accurately; the piston might be made from waxed linen tape or asbestos wound on a reel to obtain a watertight seal.

Early syringes were expensive and meant to be durable, lasting a lifetime. A doctor might have only one syringe for all of his or her patients. Alongside other precision medical equipment, including thermometers and stethoscopes, the syringe helped convince patients that the doctor knew what he was doing.

In 1900, syringes cost $50 per unit, and by 1920, still only about 100,000 syringes were being made each year in the entire world. (For reference, 4.7 million US men and women served in WWI, and each one of them received multiple vaccine injections, as well as shots of painkillers, etc.)

In 1952 and 1953, Jonas Salk tested his polio vaccine on children at the D.T. Watson Home for Crippled Children and on residents of the Polk State Home, and at this time he still used reusable glass syringes.

In 1954, Jonas Salk was planning a large-scale polio vaccine trial (using the double-blind method) on 623,972 schoolchildren in different parts of the United States, and he needed a large number of syringes and needles all at once. This appears to have been the impetus for what was considered the first disposable syringe, although this syringe was still made of glass and still reused between children. Becton, Dickinson delivered:

“thousands of 5 cc syringes and several million 1-inch needles to 215 test sites across the nation.”

For 600,000 children, the syringe manufacturer delivered “thousands” to various test sites. Below, Miss Gladyce Toscano places 6,000 hypodermic needles and syringes in an autoclave to prepare for the polio vaccine trial in Los Angeles.

Miss Gladyce Toscano, who is responsible for the daily allotment of 6000 hypodermic needles and syringes in Los Angeles County General Hospital, is shown with a machine used in the process of cleaning the instruments. Thus, the hospital will be ready to use the Salk vaccine against polio.

The trial was deemed successful, so the following year, in April 1955, 200,000 children in five Western and Midwestern states received a polio vaccine containing live virus (virus that was not properly inactivated). This resulted in 40,000 cases of polio, left 200 children with varying degrees of paralysis, and killed 10. It is known as the “Cutter Incident.”

Plastic disposable syringes finally began to be mass-produced in 1961. It would be several more years before truly disposable syringes were in widespread use; an insulin patient describes the switch to disposable syringes in the 1960s as a very slow process.

Problems Associated with Syringe Reuse

  • The jagged, rough edge of a reused needle easily traps bacteria and viruses
  • Needles became blunt with repeated use and required frequent re-sharpening
  • It was impossible to clean and sterilize all parts adequately
  • Reuse caused infections leading to cellulitis, abscesses, erysipelas, and sepsis
  • Reuse transmitted infectious diseases such as hepatitis B, hepatitis C, smallpox, polio, measles, influenza, syphilis, tetanus, pneumococcus, MRSA, and HIV/AIDS

Sterilization? What’s That?

The oldest method of sterilization consisted of holding a flame to surgical instruments. Sterilization by boiling water was first introduced in 1881; they didn’t know it at the time, but boiling does not kill 100% of microorganisms. Steam sterilization and autoclaves came out in the late 1880s, but as with everything, there was a lag before the inventions became widely available and affordable.

Vaccine campaigns like the one in the video below show that even as late as the 1950s, syringes were reused between patients with very minimal sterilization, often just an open flame passed over the needle between children:

Disposable gloves didn’t become standard protocol until the 1990s. The first rubber glove was invented in 1889 by Dr. William Halsted specifically for his scrub nurse (whom he would soon marry), whose hands broke out in dermatitis from the strong chemical disinfectants used during surgery, namely mercuric chloride and carbolic acid.

After that, more surgeons and assistants began to wear them, but mostly to protect their own hands from the harsh chemicals used during surgery.

The first disposable latex gloves were manufactured in 1964. Medical-grade latex gloves, FDA-regulated to prevent infectious microorganisms from spreading through the glove, did not appear until the 1990s.

Better Sanitation, Nutrition & Cleaner Air

I think many people have a certain image of human health throughout history, namely that for thousands of years humans had ill health and short life spans, which improved only with the advent of modern medicine, after which they experienced better health and longer lives. In this view, modern medicine is credited as the humble hero.

However, humans have been living into their 70s and 80s for thousands of years. And in light of the facts that medical error is the third leading cause of death nationwide, that 70,000 Americans die annually from drug overdoses (63% of them doctor-prescribed), and that the other two leading causes of death, cancer and heart disease, are largely environmentally caused, I’m going to have to disagree.

But it’s interesting to note two things: human civilization created the conditions for disease, and doctors were mistrusted, even back then.

Infectious diseases did not become a “plague” to humans until humans settled down into large civilizations and began farming. The intersection of larger civilizations (as opposed to smaller nomadic hunter-gatherer groups), the new habit of eating single crops (as opposed to a variety of foraged, nutrient-dense foods), and the piling up of human waste and refuse created the perfect storm for diseases to breed and take hold.

Back when doctors were pushing concoctions of mercurous chloride and opium, bloodletting and purging, and practicing what was called “heroic medicine,” which often ended with the patient dying anyway, Joseph Addison wrote in The Spectator in 1711:

“If we look into the profession of physic, we shall find the most formidable body of men. The sight of them is enough to make a man serious, for we may lay it down as a maxim, that when a nation abounds in physicians it grows thin of people.”

Woman emptying her chamber pot on passers-by.

Life was dire in the late 18th and 19th centuries. But it wasn’t because of a lack of vaccines; it was from a lack of toilets, sewage systems, and clean running water. People’s waste was thrown out the window into the streets and ended up in their only source of drinking water. Air quality was actually worse in the 19th century than it is today in many big cities. Our mortality rate was much higher for many reasons, including malnutrition, extreme overcrowding, and lack of sanitation.

Thus, the desire for and promise of a ‘quick cure’ became a very attractive proposition. Together with a mechanism to simply inject that cure, it fulfilled a fantasy of quick health, and in turn created a market for a long list of injectable serums, antitoxins, immunoglobulins, and vaccines.

With no regulatory oversight, no labeling requirements, no safety testing, and no ingredients that were off-limits, this was a recipe for disaster.

The Quick Cure

Common medicines in the 19th century contained what we would today consider completely toxic poisons: mercury, lead, arsenic, strychnine, antimony, cocaine, opium, heroin, formalin, and formaldehyde.

In an era likened to the Wild West, pharmacists did not need any specific formal training to own a pharmacy or to make their own medications and serums.

The corner pharmacist assumed the role of health care provider for the majority of people who could not afford to see doctors.

Pharmacist Henry K. Mulford (who graduated from the Philadelphia College of Pharmacy in 1887) launched his own line of pharmaceuticals, introducing a diphtheria antitoxin in 1895, followed by a tetanus antitoxin and a smallpox vaccine.

In 1896, a well-known Berlin pathologist’s 18-month-old son, Ernst Langerhans, died shortly after being injected with a prophylactic dose of anti-diphtheria serum, and his father, Robert Langerhans, published an obituary notice stating that his son had been poisoned by Behring’s anti-diphtheria serum.

The nature of Ernst Langerhans’s death combined with the fact that he came from a prominent family of physicians made the event a public scandal. The “official” cause of death, following the investigations into the case, was proclaimed to be an accident.

In 1901, nine children died from tetanus after being vaccinated at school with the Mulford smallpox vaccine, which may have been contaminated with tetanus bacteria.

And then again, in another part of the nation in 1901, 13 children died after being injected with diphtheria antitoxin that was contaminated with tetanus.

“As was his routine, he injected diphtheria antitoxin into the child and, as a preventive, her 2 younger siblings and concluded that “she would soon be entirely well.” But 4 days later he was called back to the Bakers’ home to a terrifying discovery: “There I found that the little girl was suffering from tetanus (lockjaw). I could do nothing for her. The poison was injected so thoroughly into her system that she was beyond medical aid.”

Bessie died of tetanus the following day, as did her 2 siblings within the week. So began one of the worst safety disasters in the history of American public health, in which, by the time it was over, some 13 children had died of tetanus from contaminated antisera.”

In 1917, a perfectly healthy seven-year-old boy died 20 minutes after being injected with diphtheria antitoxin.

By 1918, there was a serum or antitoxin for just about everything. These products were made by immunizing horses against virulent viruses and bacteria and then extracting the “serum,” or liquid part of their blood, which contains antibodies. This would then be combined with preservatives like phenol or cresol (derived from toluene, a possible human carcinogen) and filled into ampules, syringes, or cylinders.

In New and Nonofficial Remedies, 1918, the listed serums and antitoxins on the market (though not affordable or used by all, since there was no government subsidy) included diphtheria antitoxin, tetanus antitoxin, anti-anthrax serum, antidysenteric serum, antigonococcus serum, antimeningococcus serum, antipneumococcus serum, and antistreptococcus serum, along with antigen-containing products, which we know as vaccines: vaccinum (smallpox), antirabic vaccine, tuberculins, and acne bacillus, cholera, colon bacillus, diphtheria bacillus, Friedlaender bacillus, gonococcus, meningococcus, pertussis bacillus, plague bacillus, pneumococcus, pyocyaneus bacillus, staphylococcus, streptococcus, and typhoid vaccines, made by all the major pharmaceutical players of the time, including Lederle Laboratories; Parke, Davis & Co.; E.R. Squibb & Sons; Abbott Laboratories; and Greeley Laboratories.

That all these products were being made and marketed does not mean everyone had equal access to these medical treatments. Some people, for example persons in the military or institutionalized children, had greater access to ‘preventative medical care’.

Vaccinating Infants and Children

There might not be anything more unsettling than the idea of giving a child a therapeutic that may cause them more harm than good. I understand the urge to protect a child from a dreadful disease, but it must be weighed cautiously against the very real possibility of causing disease, causing harm, via the very means of prevention.

Throughout the last century, infants have been given a battery of injections beginning just minutes after birth. It’s important to remember that disposable syringes were not in widespread use until the 1960s, so every injection, I mean EVERY INJECTION, given before then was given with a reusable syringe and needle that had been used countless times.

It would take decades, even a century, before we would understand how dangerous this was.

In the early to mid-1800s, widespread smallpox vaccination became mandatory in many parts of the world. Most countries passed “compulsory vaccination laws” requiring infants to be vaccinated as young as three months of age, with fines for parents who refused.

Early smallpox vaccines were not injections; scrapings of pus-filled scabs were placed on an opening in the arm made with a lancet. Once smallpox vaccination used actual syringes, the vaccines were derived from calf lymph or horse lymph. If calf lymph was not available, human lymph was still used via arm-to-arm transmission. Later, vaccinia virus came to be used in smallpox vaccines.

It was an inherently dangerous procedure: vaccine recipients often contracted syphilis, tetanus, tuberculosis, measles, or erysipelas, or came down with cowpox, horsepox, or even smallpox.

“In one episode at Rivalta, Italy, for example sixty-three children were vaccinated with material taken from the vaccinal pustule of an apparently healthy infant who had an inapparent syphilis infection.

Forty-four of the vaccinated infants developed overt syphilis, several died of it, and some infected their mothers and nurses.” (Hopkins, 2002)

Foundlings

Still from a difficult-to-watch video on YouTube: “Emotional Deprivation in Infancy :: Study by Rene A. Spitz 1952”

Infants in foundling hospitals and orphanages were vaccinated at birth, given routine immunoglobulins and serums, used as subjects in experimental vaccine trials, and vaccinated with lymph from each other’s arms every single year.

Children in foundling hospitals, despite having access to what was considered expert medical care, had a higher mortality rate than children outside the orphanage, and a higher incidence of neurological and developmental disorders.

In the 1940s, Austrian psychoanalyst Rene Spitz observed infants in a Latin American foundling home over many months and was surprised that, despite “superb” medical attention, including hygiene and impeccable ‘precautions against contagion’:

“The children showed, from the third month on, extreme susceptibility to infection and illness of any kind. There was hardly a child in whose case history we did not find reference to otitis media, or morbilli, or varicella, or eczema, or intestinal disease of one kind or another.

No figures could be elicited on general mortality; but during my stay an epidemic of measles swept the institution, with staggeringly high mortality figures, notwithstanding liberal administration of convalescent serum and globulins, as well as excellent hygienic conditions. Of a total of 88 children up to the age of 2½, 23 died.

…In the younger group, 6 died, i.e., approximately 13%. In the older group, 17 died, i.e., close to 40%. The significance of these figures becomes apparent when we realize that the mortality from measles during the first year of life in the community in question, outside the institution, was less than ½%.”

The infants in the foundling home who survived to two and three years old had “severe developmental retardation”; they were sickly and prone to infections, non-verbal, unable to walk, incontinent, given to screaming, non-engaging, and suffered from “bizarre stereotyped motor patterns distinctly reminiscent of the stereotypy in catatonic motility.”

It sounds like Rene Spitz is describing what we would today call “autism”.

Leo Kanner (also an Austrian-born psychiatrist) had just described ‘autism’ in 1943, and proposed its cause to be “refrigerator mothers.” While parental affection and stability clearly play a significant role in infant development, could any of these symptoms also be related to the ‘liberal administration of convalescent serum and globulins’ with reused needles and syringes during these early periods of development?

Today, we know that infant immune activation is a trigger for autism. Newborn immune-system activation has long-term negative impacts on brain function, including symptoms commonly associated with autism spectrum disorder and other developmental conditions.

Brian S. Hooker’s recently published study, Analysis of health outcomes in vaccinated and unvaccinated children: Developmental delays, asthma, ear infections and gastrointestinal disorders, found that fully vaccinated children had twice the rate of developmental delays of fully unvaccinated children.

Crib Death

Towards the end of the 19th century, sudden death was occasionally observed in association with anesthesia; with the introduction of so many injections in early life in the early 20th century, the sudden death of infants would soon be observed with greater frequency, and was for many years termed “cot death” or “crib death.”

It was a seemingly new condition, and it attracted the attention and curiosity of many mid-century pathologists, as they were the first to encounter the new problem.

An early sudden-death account from 1929 concerns a female infant who had begun daily insulin injections a week or so before her death.

A 1946 case study describes twins dying of sudden anaphylactic shock a day after a second injection of diphtheria and pertussis vaccine.

As noted above, a seven-year-old boy died 20 minutes after a subcutaneous injection.

In 1961, a report on the death of an infant from hyperthermia after vaccination described a 3-month-old baby girl whose temperature rose to 108 degrees F the morning after she was given a smallpox vaccine on the upper arm.

In 1958, Jed and Louise Roe’s 6-month-old son Mark Addison Roe was found dead in his crib just two weeks after his doctor gave him “a routine injection” for diphtheria, tetanus and whooping cough, as well as his first polio shot.

The autopsy attributed the death to acute bronchial pneumonia, even though Mark had shown no signs of illness. His parents soon formed the Mark Addison Roe Foundation, which was later renamed the SIDS Foundation.

Cot death became “Sudden Infant Death Syndrome” in 1969, and the definition was updated in 1991 to include a thorough death-scene investigation and reenactment, after which we see a decline in SIDS diagnoses and an increase in deaths classified as “Accidental Suffocation and Strangulation in Bed,” “Unknown,” or “Undetermined.”

SIDS may turn out to be one of the biggest cover-ups of all time, especially considering that infants and children die shortly after vaccination all around the world.

Epidemics Caused By Needle Reuse

The Spanish Flu pandemic of 1918 was preceded by an experimental bacterial meningitis vaccine, cultured in horses by the Rockefeller Institute for Medical Research in New York, that was injected into soldiers at Fort Riley, where the first cases emerged.

The 1916 polio outbreak started in Brooklyn, NY, just a few subway stops from the Rockefeller Institute, where Simon Flexner had been passaging spinal cord tissue containing poliovirus from one Rhesus monkey to another.

When cases of paralytic polio began increasing in the 1940s and 1950s, studies in the US and Australia found that children who experienced paralytic polio were more likely than controls to have had an injection of DPT vaccine or penicillin within the previous 30 days. The phenomenon even has a name: provocation polio. Vaccinations were subsequently halted for a period of time in certain areas.

During a large outbreak in Oman, children with paralytic poliomyelitis were twice as likely as controls to have received a DPT injection within the previous 30 days.

Researchers have traced the origins of the hepatitis C epidemic in North America to the 1940s through 1965, with a peak in transmission rates in 1950, when the oldest baby boomers were only 5 years old. Researchers believe the transmission and rapid expansion of hepatitis C virus infections were caused by the reuse of glass and metal syringes.

In Southern Italy, a hepatitis C outbreak was found to have been caused by the multiple use of unsafe glass syringes during Salk vaccination between 1956 and 1965.

“Persons born between the 1940s and early 1960s have a nearly 3-fold increased risk of HCV seropositivity than the younger age group. The findings are consistent with a cohort effect of exposure to the Salk parenteral vaccination.”

In Egypt, a campaign against the parasitic illness schistosomiasis, conducted from the 1950s to the 1980s, resulted in about 15 to 20 percent of Egypt’s 63.3 million people having antibodies to hepatitis C, meaning that they had been infected by the virus, transmitted by reused syringes.

Many believe that the AIDS epidemic was caused entirely by syringes reused in polio vaccine experiments in Africa in the 1940s and 1950s.

The world’s first known case of AIDS has been traced to a sample of blood plasma from a man who died in the Democratic Republic of Congo in 1959.

An early polio vaccine, the Koprowski vaccine, was tested on 1 million people in Belgian territories in Africa, in what are now the Democratic Republic of the Congo, Rwanda, and Burundi, including 76,000 children under the age of 5 in the Belgian Congo from 1958 to 1960.

Watch Stanley Plotkin discussing the Congo vaccine trials here.

SOURCES:

The Relation Between Recent Injections And Paralytic Poliomyelitis in Children. https://ajph.aphapublications.org/doi/pdfplus/10.2105/AJPH.42.2.142

Polio provocation: solving a mystery with the help of history https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(14)61251-4/fulltext

Attributable risk of DTP (diphtheria and tetanus toxoids and pertussis vaccine) injection in provoking paralytic poliomyelitis during a large outbreak in Oman. https://www.ncbi.nlm.nih.gov/pubmed/1538150

Insulin Delivery Device Technology 2012: Where Are We after 90 Years? https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3440168/

The U.S. Military and the Influenza Pandemic of 1918–1919 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2862337/

One needle, one syringe, only one time? A survey of physician and nurse knowledge, attitudes, and practices around injection safety https://www.sciencedirect.com/science/article/abs/pii/S0196655317306806

Vaccination with the CHAT Strain of Type 1 Attenuated Poliomyelitis Virus in Leopoldville, Congo. https://apps.who.int/iris/bitstream/handle/10665/267381/PMC2555526.pdf?sequence=1&isAllowed=y

