
Pharmaceutical industry

2014-11-25 | publisher: amanda

The modern pharmaceutical industry traces its roots to two sources. The first of these were local apothecaries that expanded from their traditional role of distributing botanical drugs such as morphine and quinine to wholesale manufacture in the mid-1800s. Multinational corporations including Merck, Hoffmann-La Roche, Burroughs-Wellcome (now part of GlaxoSmithKline), Abbott Laboratories, Eli Lilly, and Upjohn (now part of Pfizer) began their histories as local apothecary shops during this period. By the late 1880s, German dye manufacturers had perfected the purification of individual organic compounds from coal tar and other mineral sources and had also established rudimentary methods of organic chemical synthesis.[2] The development of synthetic chemical methods allowed scientists to systematically vary the structure of chemical substances, and growth in the emerging science of pharmacology expanded their ability to evaluate the biological effects of these structural changes.
Epinephrine, norepinephrine, and amphetamine
By the 1890s the profound effect of adrenal extracts on many different tissue types had been discovered, setting off both a search for the mechanism of chemical signalling and efforts to exploit these observations for the development of new drugs. The blood-pressure-raising and vasoconstrictive effects of adrenal extracts were of particular interest to surgeons as hemostatic agents and as treatment for shock, and a number of companies developed products based on adrenal extracts containing varying purities of the active substance. In 1897 John Abel of Johns Hopkins University identified the active principle as epinephrine, which he isolated in an impure state as the sulfate salt. Industrial chemist Jokichi Takamine later developed a method for obtaining epinephrine in a pure state, and licensed the technology to Parke Davis, which marketed epinephrine under the trade name Adrenalin. Injected epinephrine proved to be especially efficacious for the acute treatment of asthma attacks, and an inhaled version was sold in the United States until 2011 (Primatene Mist).[3][4] By 1929 epinephrine had been formulated into an inhaler for use in the treatment of nasal congestion.
While highly effective, the requirement for injection limited the use of epinephrine, and orally active derivatives were sought. A structurally similar compound, ephedrine, was identified by Japanese chemists in the Ma Huang plant and marketed by Eli Lilly as an oral treatment for asthma. Following the work of Henry Dale and George Barger at Burroughs-Wellcome, academic chemist Gordon Alles synthesized amphetamine and tested it in asthma patients in 1929. The drug proved to have only modest anti-asthma effects, but produced sensations of exhilaration and palpitations. Amphetamine was developed by Smith, Kline and French as a nasal decongestant under the trade name Benzedrine Inhaler. Amphetamine was eventually developed for the treatment of narcolepsy, post-encephalitic parkinsonism, and mood elevation in depression and other psychiatric indications. It received approval as a New and Nonofficial Remedy from the American Medical Association for these uses in 1937 and remained in common use for depression until the development of tricyclic antidepressants in the 1960s.[4]
Discovery and development of the barbiturates

Diethylbarbituric acid was the first marketed barbiturate. It was sold by Bayer under the trade name Veronal.
In 1903 Hermann Emil Fischer and Joseph von Mering disclosed their discovery that diethylbarbituric acid, formed from the reaction of diethylmalonic acid, phosphorus oxychloride and urea, induces sleep in dogs. The discovery was patented and licensed to Bayer pharmaceuticals, which marketed the compound under the trade name Veronal as a sleep aid beginning in 1904. Systematic investigations of the effect of structural changes on potency and duration of action led to the discovery of phenobarbital at Bayer in 1911 and the discovery of its potent anti-epileptic activity in 1912. Phenobarbital was among the most widely used drugs for the treatment of epilepsy through the 1970s, and as of 2014 remains on the World Health Organization's list of essential medications.[5][6] The 1950s and 1960s brought increased awareness of the addictive properties and abuse potential of barbiturates and amphetamines, leading to increasing restrictions on their use and growing government oversight of prescribers. Today the major use of these drugs is restricted to amphetamine for the treatment of attention deficit disorder and phenobarbital for epilepsy.[7][8]
Insulin
A series of experiments performed from the late 1800s to the early 1900s revealed that diabetes is caused by the absence of a substance normally produced by the pancreas. In 1889, Oscar Minkowski and Joseph von Mering found that diabetes could be induced in dogs by surgical removal of the pancreas. In 1921, Canadian professor Frederick Banting and his student Charles Best repeated this study, and found that injections of pancreatic extract reversed the symptoms produced by pancreas removal. The extract was demonstrated to work in people soon thereafter, but development of insulin therapy as a routine medical procedure was delayed by difficulties in producing the material in sufficient quantity and with reproducible purity. The researchers sought assistance from industrial collaborators at Eli Lilly and Co. based on the company's experience with large-scale purification of biological materials. Chemist George Walden of Eli Lilly and Company found that careful adjustment of the pH of the extract allowed a relatively pure grade of insulin to be produced. Under pressure from the University of Toronto and a potential patent challenge by academic scientists who had independently developed a similar purification method, an agreement was reached for non-exclusive production of insulin by multiple companies. Prior to the discovery and widespread availability of insulin therapy the life expectancy of diabetics was only a few months.[9]
Early anti-infective research - Salvarsan, Prontosil, Penicillin and Vaccines
The development of drugs for the treatment of infectious diseases was a major focus of early research and development efforts; in 1900 pneumonia, tuberculosis, and diarrhea were the three leading causes of death in the United States and mortality in the first year of life exceeded 10%.[10][11]
In 1911 arsphenamine, the first synthetic anti-infective drug, was developed by Paul Ehrlich and chemist Alfred Bertheim of the Institute of Experimental Therapy in Berlin. The drug was given the commercial name Salvarsan.[12] Ehrlich, noting both the general toxicity of arsenic and the selective absorption of certain dyes by bacteria, hypothesized that an arsenic-containing dye with similar selective absorption properties could be used to treat bacterial infections. Arsphenamine was prepared as part of a campaign to synthesize a series of such compounds, and found to exhibit partially selective toxicity. Arsphenamine proved to be the first effective treatment for syphilis, a disease which prior to that time was incurable and led inexorably to severe skin ulceration, neurological damage, and death.
Ehrlich's approach of systematically varying the chemical structure of synthetic compounds and measuring the effects of these changes on biological activity was pursued broadly by industrial scientists, including Bayer scientists Josef Klarer, Fritz Mietzsch, and Gerhard Domagk. This work, also based on the testing of compounds available from the German dye industry, led to the discovery of Prontosil, the first representative of the sulfonamide class of antibiotics. Compared to arsphenamine, the sulfonamides had a broader spectrum of activity and were far less toxic, rendering them useful for infections caused by pathogens such as streptococci.[13] In 1939, Domagk received the Nobel Prize in Medicine for this discovery.[14][15] Nonetheless, the dramatic decrease in deaths from infectious diseases that occurred prior to World War II was primarily the result of improved public health measures such as clean water and less crowded housing; the impact of anti-infective drugs and vaccines became significant mainly after World War II.[16][17]
In 1928, Alexander Fleming discovered the antibacterial effects of penicillin, but its exploitation for the treatment of human disease awaited the development of methods for its large scale production and purification. These were developed by a U.S. and British government-led consortium of pharmaceutical companies during the Second World War.[18][19]
Early progress toward the development of vaccines occurred throughout this period, primarily in the form of academic and government-funded basic research directed toward the identification of the pathogens responsible for common communicable diseases. In 1885 Louis Pasteur and Pierre Paul Émile Roux created the first rabies vaccine. The first diphtheria vaccines were produced in 1914 from a mixture of diphtheria toxin and antitoxin (produced from the serum of an inoculated animal), but the safety of the inoculation was marginal and it was not widely used. The United States recorded 206,000 cases of diphtheria in 1921 resulting in 15,520 deaths. In 1923 parallel efforts by Gaston Ramon at the Pasteur Institute and Alexander Glenny at the Wellcome Research Laboratories (later part of GlaxoSmithKline) led to the discovery that a safer vaccine could be produced by treating diphtheria toxin with formaldehyde.[20] In 1944, Maurice Hilleman of Squibb Pharmaceuticals developed the first vaccine against Japanese encephalitis.[21] Hilleman would later move to Merck where he would play a key role in the development of vaccines against measles, mumps, chickenpox, rubella, hepatitis A, hepatitis B, and meningitis.
Unsafe drugs and early industry regulation

In 1937 over 100 people died after ingesting a solution of the antibacterial sulfanilamide formulated in the toxic solvent diethylene glycol
Prior to the beginning of the 20th century drugs were generally produced by small-scale manufacturers with little regulatory control over manufacturing or claims of safety and efficacy. To the extent that such laws did exist, enforcement was lax. In the United States, increased regulation of vaccines and other biological drugs was spurred by tetanus outbreaks and deaths caused by the distribution of contaminated smallpox vaccine and diphtheria antitoxin.[22] The Biologics Control Act of 1902 required that the federal government grant premarket approval for every biological drug and for the process and facility producing such drugs. This was followed in 1906 by the Pure Food and Drugs Act, which forbade the interstate distribution of adulterated or misbranded foods and drugs. A drug was considered misbranded if it contained alcohol, morphine, opium, cocaine, or any of several other potentially dangerous or addictive drugs, and its label failed to indicate the quantity or proportion of such drugs. The government's attempts to use the law to prosecute manufacturers for making unsupported claims of efficacy were undercut by a Supreme Court ruling restricting the federal government's enforcement powers to cases of incorrect specification of the drug's ingredients.[23]
In 1937 over 100 people died after ingesting Elixir Sulfanilamide manufactured by S.E. Massengill Company of Tennessee. The product was formulated in diethylene glycol, a highly toxic solvent that is now widely used as antifreeze.[24] Under the laws extant at that time, prosecution of the manufacturer was possible only under the technicality that the product had been called an "elixir", which literally implied a solution in ethanol. In response to this episode, the U.S. Congress passed the Federal Food, Drug, and Cosmetic Act of 1938, which for the first time required pre-market demonstration of safety before a drug could be sold, and explicitly prohibited false therapeutic claims.[25]
The Post-War Years, 1945-1970
Further advances in anti-infective research
The aftermath of the war saw an explosion in the discovery of new classes of antibacterial drugs[26] including the cephalosporins (developed by Eli Lilly based on the seminal work of Giuseppe Brotzu and Edward Abraham),[27][28] streptomycin (discovered during a Merck-funded research program in Selman Waksman's laboratory[29]), the tetracyclines[30] (discovered at Lederle Laboratories, now a part of Pfizer), erythromycin (discovered at Eli Lilly and Co.)[31] and their extension to an increasingly wide range of bacterial pathogens.
Measles cases reported in the United States before and after introduction of the vaccine: cases in 1944-1964 followed a highly variable epidemic pattern, with 150,000-850,000 cases per year. A sharp decline followed introduction of the vaccine in 1963, with fewer than 25,000 cases reported in 1968. Outbreaks around 1971 and 1977 gave 75,000 and 57,000 cases, respectively. Cases were stable at a few thousand per year until an outbreak of 28,000 in 1990, then declined from a few hundred per year in the early 1990s to a few dozen in the 2000s.
Life expectancy and percent surviving by age in the United States in 1900, 1950, and 1997.[10]
During the years 1940-1955, the rate of decline in the U.S. death rate accelerated from 2% per year to 8% per year, then returned to the historical rate of 2% per year. The dramatic decline in the immediate post-war years has been attributed to the rapid development of new treatments and vaccines for infectious disease that occurred during these years.[16][17] Vaccine development continued to accelerate, with the most notable achievement of the period being Salk's 1954 development of the polio vaccine with funding from the non-profit National Foundation for Infantile Paralysis.[32] The vaccine process was never patented, but was instead given to pharmaceutical companies to manufacture as a low-cost generic. In 1960 Maurice Hilleman of Merck Sharp and Dohme identified the SV40 virus, which was later shown to cause tumors in many mammalian species. It was later determined that SV40 was present as a contaminant in polio vaccine lots that had been administered to 90% of the children in the United States.[33][34] The contamination appears to have originated both in the original cell stock and in monkey tissue used for production. In 2004 the United States National Cancer Institute announced that it had concluded that SV40 is not associated with cancer in people.[35]
Other notable new vaccines of the period include those for measles (1962, John Franklin Enders of Children's Medical Center Boston, later refined by Maurice Hilleman at Merck), mumps (1967, Hilleman, Merck) and rubella (1969, Hilleman, Merck).[36] The United States incidences of rubella, congenital rubella syndrome, measles, and mumps all fell by >95% in the immediate aftermath of widespread vaccination.[37] The first 20 years of licensed measles vaccination in the U.S. prevented an estimated 52 million cases of the disease, 17,400 cases of mental retardation, and 5,200 deaths.[38]
Development and marketing of antihypertensive drugs
Hypertension is a risk factor for atherosclerosis,[39] heart failure,[40] coronary artery disease,[41][42] stroke,[43] renal disease,[44][45] and peripheral arterial disease,[46][47] and is the most important risk factor for cardiovascular morbidity and mortality in industrialized countries.[48]
Early developments in the field of treating hypertension included quaternary ammonium ion sympathetic nervous system blocking agents, but these compounds were never widely used due to their severe side effects, because the long-term health consequences of high blood pressure had not yet been established, and because they had to be administered by injection. The first widely used antihypertensive drugs were the thiazides, developed by Beyer and Sprague at Merck Sharp and Dohme[49][50] and first marketed in 1957. As of 2014, thiazide diuretics remain recommended as first-line therapy in the treatment of hypertension. A 2009 Cochrane review concluded that thiazide antihypertensive drugs reduce the risk of death (RR 0.89), stroke (RR 0.63), coronary heart disease (RR 0.84), and cardiovascular events (RR 0.70) in people with high blood pressure.[51] In the ensuing years other classes of antihypertensive drug were developed and found wide acceptance in combination therapy, including loop diuretics (Lasix/furosemide, Hoechst Pharmaceuticals, 1963),[52] beta blockers (ICI Pharmaceuticals, 1964),[53] ACE inhibitors, and angiotensin receptor blockers. ACE inhibitors reduce the risk of new-onset kidney disease (RR 0.71) and death (RR 0.84) in diabetic patients, irrespective of whether they have hypertension.[54]
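The relative-risk (RR) figures above can be translated into absolute terms with a short calculation. The sketch below converts an RR into an absolute risk reduction and a number needed to treat; the 5% baseline stroke risk is an assumption chosen only to illustrate the arithmetic, not a figure from the cited Cochrane review:

```python
# Convert a relative risk (RR) into an absolute risk reduction (ARR) and a
# number needed to treat (NNT), given a baseline (untreated) event rate.
# The 5% baseline stroke risk below is a hypothetical assumption for
# illustration; it is not taken from the Cochrane review cited above.

def absolute_effect(baseline_rate: float, relative_risk: float):
    treated_rate = baseline_rate * relative_risk  # event rate under treatment
    arr = baseline_rate - treated_rate            # absolute risk reduction
    nnt = 1 / arr                                 # patients treated per event avoided
    return treated_rate, arr, nnt

# Stroke RR 0.63 with an assumed 5% baseline risk:
treated, arr, nnt = absolute_effect(0.05, 0.63)
print(f"treated rate {treated:.4f}, ARR {arr:.4f}, NNT {nnt:.0f}")
# -> treated rate 0.0315, ARR 0.0185, NNT 54
```

An RR of 0.63 is a 37% relative reduction regardless of population, but the absolute benefit, and hence the NNT, depends entirely on the assumed baseline risk.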
Numerous new drugs were developed during the 1950s and mass-produced and marketed through the 1960s. These included the first oral contraceptive, "The Pill", cortisone, blood-pressure drugs and other heart medications. MAO inhibitors, chlorpromazine (Thorazine), haloperidol (Haldol) and the tranquilizers ushered in the age of psychiatric medication. Diazepam (Valium), discovered in 1960, was marketed from 1963 and rapidly became the most prescribed drug in history, until controversy arose over dependency and habituation.
Thalidomide and the Kefauver-Harris Amendments

Baby born to a mother who had taken thalidomide while pregnant.
In the U.S., a push for revisions of the FD&C Act emerged from Congressional hearings led by Senator Estes Kefauver of Tennessee in 1959. The hearings covered a wide range of policy issues, including advertising abuses, questionable efficacy of drugs, and the need for greater regulation of the industry. While momentum for new legislation temporarily flagged under extended debate, a new tragedy became apparent that underscored the need for more comprehensive regulation and provided the driving force for the passage of new laws.
On September 12, 1960, an American licensee, the William S. Merrell Company of Cincinnati, submitted to the FDA a new drug application for Kevadon, the brand name of thalidomide, a sedative that had been marketed in Europe since 1956. The FDA medical officer in charge of the review, Frances Kelsey, believed the data were insufficient to support the safety of the drug.
The firm continued to pressure Kelsey and the agency to approve the application until November 1961, when the drug was pulled off the German market because of its association with grave congenital abnormalities. Several thousand newborns in Europe and elsewhere suffered the teratogenic effects of thalidomide. Though the drug was never approved in the United States, the firm distributed Kevadon to over 1,000 physicians under the guise of investigational use. Over 20,000 Americans received thalidomide in this "study," including 624 pregnant patients, and about 17 known newborns suffered the effects of the drug.
The thalidomide tragedy resurrected Kefauver's bill to enhance drug regulation that had stalled in Congress, and the Kefauver-Harris Amendment became law on October 10, 1962. Manufacturers henceforth had to prove to the FDA that their drugs were effective as well as safe before they could go on the market. The FDA received authority to regulate advertising of prescription drugs and to establish good manufacturing practices. Finally, the law required that all drugs introduced between 1938 and 1962 be effective. An FDA-National Academy of Sciences collaborative study showed that nearly 40 percent of these products were not effective. A similarly comprehensive study of over-the-counter products began ten years later.[55]
1970-1980
Cancer drugs were a feature of the 1970s. From 1978, India took over as the primary center of pharmaceutical production without patent protection.[56]
The industry remained relatively small scale until the 1970s when it began to expand at a greater rate.[citation needed] Legislation allowing for strong patents, to cover both the process of manufacture and the specific products, came into force in most countries. By the mid-1980s, small biotechnology firms were struggling for survival, which led to the formation of mutually beneficial partnerships with large pharmaceutical companies and a host of corporate buyouts of the smaller firms. Pharmaceutical manufacturing became concentrated, with a few large companies holding a dominant position throughout the world and with a few companies producing medicines within each country.
The pharmaceutical industry entered the 1980s pressured by economics and a host of new regulations, both safety and environmental, but also transformed by new DNA chemistries and new technologies for analysis and computation.[citation needed] Drugs for heart disease and for AIDS were a feature of the 1980s, involving challenges to regulatory bodies and a faster approval process.
1980-Today
Controversies emerged around adverse effects, notably regarding Vioxx in the US, and marketing tactics. Pharmaceutical companies became increasingly accused of disease mongering or over-medicalizing personal or social problems.[57]
Since 2008, pharmaceutical companies have been increasing the cost of name-brand prescriptions to offset declining revenues as out-of-patent drugs become available as generics.[58] Simultaneously, pharmaceutical manufacturers are taking increasing advantage of tax havens to avoid taxation.[59]
Research and development
Main articles: Drug discovery and Drug development
Drug discovery is the process by which potential drugs are discovered or designed. In the past most drugs have been discovered either by isolating the active ingredient from traditional remedies or by serendipitous discovery. Modern biotechnology often focuses on understanding the metabolic pathways related to a disease state or pathogen, and manipulating these pathways using molecular biology or biochemistry. A great deal of early-stage drug discovery has traditionally been carried out by universities and research institutions.
Drug development refers to activities undertaken after a compound is identified as a potential drug in order to establish its suitability as a medication. Objectives of drug development are to determine appropriate formulation and dosing, as well as to establish safety. Research in these areas generally includes a combination of in vitro studies, in vivo studies, and clinical trials. The amount of capital required for late stage development has made it a historical strength of the larger pharmaceutical companies.[60]
Often, large multinational corporations exhibit vertical integration, participating in a broad range of drug discovery and development, manufacturing and quality control, marketing, sales, and distribution. Smaller organizations, on the other hand, often focus on a specific aspect such as discovering drug candidates or developing formulations. Often, collaborative agreements between research organizations and large pharmaceutical companies are formed to explore the potential of new drug substances. More recently, multi-nationals are increasingly relying on contract research organizations to manage drug development.[61]
The cost of innovation
Drug discovery and development is very expensive; of all compounds investigated for use in humans, only a small fraction are eventually approved in most nations by government-appointed medical institutions or boards, which must approve new drugs before they can be marketed in those countries. In 2010 the FDA approved 18 new molecular entities (NMEs) and three biologics, 21 in total, down from 26 in 2009 and 24 in 2008. On the other hand, there were only 18 approvals in total in 2007 and 22 in 2006. Since 2001, the Center for Drug Evaluation and Research has averaged 22.9 approvals a year.[62] Approval comes only after heavy investment in pre-clinical development and clinical trials, as well as a commitment to ongoing safety monitoring. Drugs which fail part-way through this process often incur large costs while generating no revenue in return. If the cost of these failed drugs is taken into account, the cost of developing a successful new drug (new chemical entity, or NCE) has been estimated at about 1.3 billion USD[63] (not including marketing expenses). Professors Light and Lexchin reported in 2012, however, that the rate of approval for new drugs has been a relatively stable average of 15 to 25 per year for decades.[64]
Industry-wide research and investment reached a record $65.3 billion in 2009.[65] While the cost of research in the U.S. rose by about $34.2 billion between 1995 and 2010, revenues rose faster, by $200.4 billion in that time.[64]
A study by the consulting firm Bain & Company reported that the cost for discovering, developing and launching (which factored in marketing and other business expenses) a new drug (along with the prospective drugs that fail) rose over a five-year period to nearly $1.7 billion in 2003.[66] According to Forbes, development costs between $4 billion and $11 billion per drug.[67]
Some of these estimates also take into account the opportunity cost of investing capital many years before revenues are realized (see time value of money). Because of the very long time needed for discovery, development, and approval of pharmaceuticals, these costs can accumulate to nearly half the total expense. Some approved drugs, such as those based on re-formulation of an existing active ingredient (also referred to as line extensions), are much less expensive to develop.
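The contribution of opportunity cost can be made concrete with a small compounding sketch. It capitalizes a stream of development outlays forward to launch at an assumed cost of capital; the $100M-per-year schedule and the 10% rate are illustrative assumptions, not estimates from the studies cited above:

```python
# Capitalize a stream of development outlays forward to the launch date at an
# assumed cost of capital, to show how the time value of money can approach
# half the total cost. The $100M-per-year schedule and the 10% rate are
# illustrative assumptions, not estimates from the studies cited above.

def capitalized_cost(outlays, rate):
    """Compound each year's outlay forward to the launch year."""
    n = len(outlays)
    return sum(spend * (1 + rate) ** (n - t)
               for t, spend in enumerate(outlays, start=1))

outlays = [100] * 12               # $100M per year over a 12-year program
cash_cost = sum(outlays)           # out-of-pocket total: 1200 ($1.2B)
cap_cost = capitalized_cost(outlays, 0.10)
print(cash_cost, round(cap_cost))  # -> 1200 2138
```

Under these assumptions the capitalized total is about $2.14 billion against $1.2 billion out of pocket, so the time-value component is roughly 44% of the total, consistent with the observation that such costs can approach half the overall expense.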
Controversies
Due to repeated accusations and findings that some clinical trials conducted or funded by pharmaceutical companies may report only positive results for the preferred medication, the industry has been looked at much more closely by independent groups and government agencies.[68][69]
In response to specific cases in which unfavorable data from pharmaceutical company-sponsored research were not published, the Pharmaceutical Research and Manufacturers of America published new guidelines urging companies to report all findings and limit researchers' financial involvement in drug companies.[70] The US Congress passed a law requiring phase II and phase III clinical trials to be registered by the sponsor on the clinicaltrials.gov website run by the NIH.[71]
Drug researchers not directly employed by pharmaceutical companies often look to companies for grants, and companies often look to researchers for studies that will make their products look favorable. Sponsored researchers are rewarded by drug companies, for example with support for their conference/symposium costs. Lecture scripts and even journal articles presented by academic researchers may actually be "ghost-written" by pharmaceutical companies.[72]
An investigation by ProPublica found that at least 21 doctors have been paid more than $500,000 for speeches and consulting by drugs manufacturers since 2009, with half of the top earners working in psychiatry, and about $2 billion in total paid to doctors for such services. AstraZeneca, Johnson & Johnson and Eli Lilly have paid billions of dollars in federal settlements over allegations that they paid doctors to promote drugs for unapproved uses. Some prominent medical schools have since tightened rules on faculty acceptance of such payments by drug companies.[73]
Product approval
Main article: Food and Drug Administration
In the United States, new pharmaceutical products must be approved by the Food and Drug Administration (FDA) as being both safe and effective. This process generally involves submission of an Investigational New Drug (IND) filing with sufficient pre-clinical data to support proceeding with human trials. Following IND approval, three phases of progressively larger human clinical trials may be conducted. Phase I generally studies toxicity using healthy volunteers. Phase II can include pharmacokinetics and dosing in patients, and Phase III is a very large study of efficacy in the intended patient population. Following the successful completion of Phase III testing, a New Drug Application is submitted to the FDA. The FDA reviews the data, and if the product is seen as having a positive benefit-risk assessment, approval to market the product in the US is granted.[74]
A fourth phase of post-approval surveillance is also often required, because even the largest clinical trials cannot effectively predict the prevalence of rare side effects. Postmarketing surveillance ensures that, after marketing, the safety of a drug is monitored closely. In certain instances, its indication may need to be limited to particular patient groups; in others, the substance is withdrawn from the market completely.
The FDA provides information about approved drugs at the Orange Book site.[75]
In many non-US Western countries a 'fourth hurdle' of cost-effectiveness analysis has developed before new technologies can be provided. This focuses on the efficiency (in terms of the cost per QALY) of the technologies in question rather than their efficacy. In England, NICE approval requires that technologies be made available by the NHS, whilst similar arrangements exist with the Scottish Medicines Consortium in Scotland and the Pharmaceutical Benefits Advisory Committee in Australia. A product must pass the threshold for cost-effectiveness if it is to be approved. Treatments must represent 'value for money' and a net benefit to society. There is much speculation[76] that a NICE-style framework may be implemented in the USA in an attempt to decrease Medicare and Medicaid spending by balancing benefits to patients against profits for the medical industry.
In the UK, the British National Formulary is the core guide for pharmacists and clinicians.
Orphan drugs
Main article: Orphan drug
There are special rules for certain rare diseases ("orphan diseases") involving fewer than 200,000 patients in the United States, or larger populations in certain circumstances.[77] Because medical research and development of drugs to treat such diseases is financially disadvantageous, companies that do so are rewarded with tax reductions, fee waivers, and market exclusivity on that drug for a limited time (seven years), regardless of whether the drug is protected by patents.
Legal issues
Where pharmaceuticals have been shown to cause side effects, civil action has occurred, especially in countries where tort payouts are likely to be large. The top 20 pharmaceutical cases account for over $16 billion in recoveries. Due to high-profile cases leading to large compensations, most pharmaceutical companies endorse tort reform. Recent controversies have involved Vioxx and SSRI antidepressants.
Industry revenues
For the first time ever, in 2011, global spending on prescription drugs topped $954 billion, even as growth slowed somewhat in Europe and North America.[78] The United States accounts for more than a third of the global pharmaceutical market, with $340 billion in annual sales, followed by the EU and Japan. Emerging markets such as China, Russia, South Korea and Mexico outpaced that market, growing a huge 81 percent.[79] According to IMS, the global pharmaceutical industry could reach US$1.1 trillion by 2014.[80]
The top ten best-selling drugs of 2013 totaled $75.6 billion in sales, with the anti-inflammatory drug Humira being the best-selling drug worldwide at $10.7 billion in sales. The second and third best selling were Enbrel and Remicade, respectively.[81] The top three best-selling drugs in the United States in 2013 were Abilify ($6.3 billion), Nexium ($6 billion) and Humira ($5.4 billion).[82] The best-selling drug ever, Lipitor, averaged $13 billion annually and netted $141 billion total over its lifetime before Pfizer's patent expired in November 2011.
IMS Health published an analysis of trends expected in the pharmaceutical industry in 2007, including increasing profits in most sectors despite the loss of some patents, and new 'blockbuster' drugs on the horizon.[83]
Teradata Magazine predicted that by 2007, $40 billion in U.S. sales could be lost at the top 10 pharmaceutical companies as a result of a slowdown in R&D innovation and the expiry of patents on major products, with 19 blockbuster drugs losing patent protection.[84] As the number of expiring patents accumulates faster than the number of newly marketed drugs, this amount is expected to increase further in the near future.[85][86]
Patents and generics
Depending on a number of considerations, a company may apply for and be granted a patent for the drug, or the process of producing the drug, granting exclusivity rights typically for about 20 years.[87] However, only after rigorous study and testing, which takes 10 to 15 years on average, will governmental authorities grant permission for the company to market and sell the drug.[88] Patent protection enables the owner of the patent to recover the costs of research and development through high profit margins for the branded drug. When the patent protection for the drug expires, a generic drug is usually developed and sold by a competing company. The development and approval of generics is less expensive, allowing them to be sold at a lower price. Often the owner of the branded drug will introduce a generic version before the patent expires in order to get a head start in the generic market.[89] Restructuring has therefore become routine, driven by the patent expiration of products launched during the industry's 'golden era' in the 1990s and companies' failure to develop sufficient new blockbuster products to replace lost revenues.[90]
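The economics described above can be made concrete with a small arithmetic sketch. This is not from the source; it simply combines the two figures the paragraph quotes (a roughly 20-year patent term and a 10-to-15-year development period), under the simplifying assumption that the patent is filed at the start of development:

```python
# Sketch: effective market exclusivity implied by the figures above.
# Assumption (not in the source): the patent clock starts when
# development begins, so development time eats into the patent term.
PATENT_TERM_YEARS = 20
DEV_YEARS_MIN, DEV_YEARS_MAX = 10, 15  # average development/approval time

exclusivity_max = PATENT_TERM_YEARS - DEV_YEARS_MIN  # fastest approval
exclusivity_min = PATENT_TERM_YEARS - DEV_YEARS_MAX  # slowest approval

print(f"Effective exclusivity: {exclusivity_min}-{exclusivity_max} years")
# → Effective exclusivity: 5-10 years
```

Under these assumptions, the branded drug has only about 5 to 10 years of protected sales in which to recover development costs, which is why patent expiries weigh so heavily on revenues.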
Prescriptions
In the U.S., prescriptions have increased over the past decade to 3.4 billion annually, a 61 percent increase. Retail sales of prescription drugs jumped 250 percent from $72 billion to $250 billion, while the average price of prescriptions has more than doubled from $30 to $68.[91]
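As a quick sanity check (not part of the source), the growth figures quoted above can be recomputed from the start and end values:

```python
# Sketch: recomputing the decade-over-decade figures quoted above.
sales_start, sales_end = 72e9, 250e9  # retail prescription sales, USD
price_start, price_end = 30.0, 68.0   # average prescription price, USD

sales_growth_pct = (sales_end - sales_start) / sales_start * 100
price_ratio = price_end / price_start

print(f"Sales growth: {sales_growth_pct:.0f}%")  # ~247%, i.e. roughly 250%
print(f"Price ratio: {price_ratio:.2f}x")        # more than doubled
```

The $72 billion to $250 billion jump works out to about a 247 percent increase, consistent with the roughly 250 percent the text cites, and $30 to $68 is a 2.27-fold rise.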
Marketing
Advertising is common in healthcare journals as well as through more mainstream media routes. In some countries, notably the US, pharmaceutical companies are allowed to advertise directly to the general public. Pharmaceutical companies generally employ sales people (often called 'drug reps' or, an older term, 'detail men') to market directly and personally to physicians and other healthcare providers. In some countries, notably the US, pharmaceutical companies also employ lobbyists to influence politicians. Marketing of prescription drugs in the US is regulated by the federal Prescription Drug Marketing Act of 1987.
To healthcare professionals
Currently, there are approximately 81,000 pharmaceutical sales representatives in the United States[92] pursuing some 830,000 pharmaceutical prescribers. A pharmaceutical representative will often try to see a given physician every few weeks. Representatives often have a call list of about 200–300 physicians, with 120–180 targets to be visited in a one-, two-, or three-week cycle. The number of pharmaceutical sales reps shrank between 2008 and 2010; an estimated 30% industry-wide reduction occurred, and current estimates suggest there may be only 60,000 pharmaceutical sales reps in the United States.[92]
In 2008, Senator Charles Grassley began an investigation into unreported payments to physicians by pharmaceutical companies. Grassley led a congressional investigation which found that well-known university psychiatrists, who had promoted psychoactive drugs, had violated federal and university regulations by secretly receiving large sums of money from the pharmaceutical companies which made the drugs.[93] The New York Times reported that Dr. Joseph Biederman of Harvard University had failed to report over a million dollars of income that he had received from pharmaceutical companies.[94] Weeks later, Business Week reported that Grassley alleged that Alan Schatzberg, chair of psychiatry at Stanford University, had underreported his investments in Corcept Therapeutics, a company he founded.[95] Dr. Schatzberg had reported only $100,000 in investments in Corcept, but Grassley stated that his investments actually totalled over $6 million. Dr. Schatzberg later stepped down from the grant he held, which was funded by the National Institutes of Health (NIH).[96] Similarly, Dr. Charles Nemeroff resigned as chair of the psychiatry department at Emory University after failing to report a third of the $2.8 million in consulting fees he received from GlaxoSmithKline. At the time he received these fees, Dr. Nemeroff had been principal investigator of a $3.9 million NIH grant evaluating five medications for depression manufactured by GlaxoSmithKline.[97]
The book Bad Pharma also discusses the influence of drug representatives, how ghostwriters are employed by the drug companies to write papers for academics to publish, how independent the academic journals really are, how the drug companies finance doctors' continuing education, and how patients' groups are often funded by industry.[98]

The pharmaceutical industry develops, produces, and markets drugs or pharmaceuticals licensed for use as medications.[1] Pharmaceutical companies may deal in generic or brand medications and medical devices. They are subject to a variety of laws and regulations regarding the patenting, testing, safety, efficacy and marketing of drugs.
