

Industry dominates Trump’s new council of science advisers


PCAST’s members investigate the country’s pressing scientific questions at the president’s direction. Typically, academic members of a PCAST outnumber its industry scientists. This was true for the councils under Presidents George H.W. Bush, Bill Clinton and Barack Obama.

Industry veterans dominate Trump’s inaugural PCAST, as they did under George W. Bush. Only one of Trump’s seven new PCAST members works in academia. Two members do not have doctoral degrees.

The newly announced members are: Catherine Bessant, the chief technology officer at Bank of America; H. Fisk Johnson, chief executive at S.C. Johnson & Son; IBM Research director Dario Gil; Cyclo Therapeutics Vice President Sharon Hrynkow; A.N. Sreeram, Dow Chemical’s chief technology officer; HP Labs’ chief technology officer Shane Wall; and K. Birgitta Whaley, an expert in quantum information at the University of California at Berkeley.

Trump’s PCAST will eventually expand to 16 members, according to the White House Office of Science and Technology Policy, including additional academic scholars.

George H.W. Bush chartered the first PCAST in 1990, following a tradition dating back to World War II of soliciting scientists and engineers to advise the White House. Trump’s PCAST comes unusually late in his administration. George W. Bush and Clinton announced their first waves of PCAST appointees within their first year as president. Obama named several scientists to his PCAST in December 2008, while president-elect.

During the Obama administration, PCAST issued 40 reports on topics such as drinking water safety, forensic science and data privacy.

A landmark 2012 PCAST report recommended freeing parts of the radio frequency spectrum that belonged to the government. The Federal Communications Commission adopted spectrum sharing in 2015, and commercial trials are underway; Google tested new wireless technology on the 3.5 GHz band, once restricted to Navy systems, at four NASCAR races in 2017.

After PCAST recommended in 2015 that some hearing aids should be available over-the-counter, the Food and Drug Administration began a regulatory process to allow their sale. Sen. Elizabeth Warren (D-Mass.) cited the PCAST report when she introduced legislation in 2017 to support nonprescription hearing aids.


Source: Industry dominates Trump’s new council of science advisers

Trump announced his tech and science council 3 years into his term, with research execs from IBM and HP and the CEO of the maker of Drano


  • In a Tuesday press release, the White House announced it would relaunch the council of science and technology advisors to the president.
  • The council has not existed since Trump took office in 2017.
  • Like President Obama's selections, this council has a few tech executives, but it does not have any tech industry CEOs.
  • The Trump administration announced on Tuesday that it is relaunching the President's Council of Advisors on Science and Technology (PCAST) by Executive Order.

    The purpose of PCAST is to "provide policy recommendations on strengthening American leadership in science and technology, building the workforce of the future, and supporting foundational research and development across the country," according to a White House press release. 

    The council has been inactive since President Trump took office 3 years ago, Nextgov reported. The Trump administration has a history of dissolving councils. In 2017, following the white nationalist marches and attack in Charlottesville, Virginia, several CEOs resigned from 2 of Trump's economic advisory councils. In response, he eliminated both.

    Trump's newly announced council has 2 tech executives, and a mix of people from the private sector and academia, which isn't unusual by itself. President Obama's council was also made up mostly of academics, though the Obama council also included some high profile tech figures, including Google's then-CEO Eric Schmidt and Microsoft Chief Research and Strategy Officer Craig Mundie. The sole CEO on Trump's science and tech council is H. Fisk Johnson, the head of SC Johnson and Son Inc, the privately held company that makes Raid and Drano.

    President Obama also named his council much sooner after his election, in April of 2009. It's not clear why Trump's council is being announced three years into his term.

    The White House did not respond to a request for comment.

    Here are the 7 members appointed by Trump:

  • Catherine Bessant, Chief Technology Officer, Bank of America
  • Dr. H. Fisk Johnson, Chairman and Chief Executive Officer, S.C. Johnson & Son, Inc.
  • Dr. Dario Gil, Director of Research, IBM Research
  • Dr. Sharon Hrynkow, Senior Vice President for Medical Affairs, Cyclo Therapeutics
  • Dr. A.N. Sreeram, Vice President and Chief Technology Officer, Dow Chemical
  • Shane Wall, Chief Technology Officer and Global Head of HP Labs, HP Inc.
  • Dr. K. Birgitta Whaley, Director of Quantum Information and Computation Center, University of California, Berkeley

  • Source: Trump announced his tech and science council 3 years into his term, with research execs from IBM and HP and the CEO of the maker of Drano

    Can marketplace science be trusted?


    Four years after the first issue of Nature was published, the US National Academy of Sciences (NAS) faced an existential crisis. In October 1873, one of its original members demanded the expulsion of another member for swindling. Josiah Whitney, California’s state geologist, accused Benjamin Silliman Jr, professor of applied chemistry at Yale University in New Haven, Connecticut, of accepting large sums from California oil companies in return for favourable, possibly fraudulent, science. Silliman responded forcefully that company funding for science was evidence of responsibility, not misconduct: companies needed objective “technical opinions”. Without science, swindling would be more common, he argued.

    NAS president Joseph Henry, secretary of the Smithsonian Institution and a former consultant to Samuel F. B. Morse, inventor of the telegraph, had to agree. If the NAS expelled every member who had ever consulted for a private company, it would not survive. Henry rejected the efforts to remove Silliman. More importantly, he resolved to expand the NAS membership; new members were to be judged on the basis of their research, not on the source of their income [1]. By the 1870s, it was already clear that industry relied on science.

    The Silliman–Whitney controversy marked a watershed in the relationship between science and industry. For US scientists, as well as many in Britain and Europe, private companies had become valuable patrons, supplying both funds for research and problems to be researched, and were gainful employers who provided short-term commissions. Likewise, companies regarded scientists and their findings as profitable to the development of their respective industries.

    Over the next 150 years, relations between science and industry continued to evolve — in four significant stages. Scientists moved from part-time consultants to full-time corporate researchers, and then to academic entrepreneurs. Industry grew from a scattering of local businesses to a concentration of large companies, and on to multinational corporations with global reach. Although these transformations might seem symbiotic, and even inevitable, the very fact that US scientists and industries emerged as leaders and exemplars (in terms of employment, funding, publishing, patenting and innovating) serves as a cautionary reminder of the contingent nature of such developments.

    Consultancy (1820–80)

    At the heart of the NAS crisis was an essential tension in the relations between science and industry: can the pursuit of knowledge be corrupted by the pursuit of profit? To Whitney and his allies, the answer was obviously yes. Their ‘pure’ science needed to be practised in places protected from the profit motive, such as government agencies or well-endowed universities. Silliman and supporters of ‘applied’ science, by contrast, believed the interactions between science and industry to be mutually advantageous. Indeed, the emergence of a distinct kind of endeavour called applied science characterized a new era in which research would address more and more industrial concerns, and private enterprise would, ideally, become a steady supporter of that work [2].

    The profession of scientific consulting goes back to the early nineteenth century, when individuals or groups of capitalists occasionally commissioned scientists to examine prospects in farming, mining, transportation (canals and railroads) and manufacturing. These fee-for-expertise engagements were short term and advisory. By the 1870s, changes in US commercial law (similar to those in British and European law) allowed the formation of limited-liability, joint-stock companies. These businesses, with their large pools of funds and numerous shareholders looking for investment assurances, regularly consulted scientists. As the engagements became both more routine (continuous testing and analysing of existing products and processes) and more investigative, scientists began to receive lucrative contracts and retainers [1].

    A United States Geological Survey team in the Wasatch mountains in Utah in 1869. US geologists were among the most active consultants during the Gilded Age, a period of rapid economic growth from the 1870s to the 1890s. Credit: Timothy H. O'Sullivan/George Eastman Museum/Getty

    In the United States, geologists were among the most active consultants during the Gilded Age, a period of rapid economic growth from the 1870s to the 1890s, especially in precious-metal mining in the area west of the Mississippi River. In Britain and Germany, the most prolific consultants were chemists, because of their essential expertise in new products such as acids, soaps, paints and especially synthetic dyes, including mauve and alizarin. Consulting chemists also found themselves in prominent public roles as expert witnesses in sensational patent cases. Witness-box quarrelling among chemists made good newspaper copy, and it highlighted profound developments in the chemical industries. Changes in patent law in the United States, Britain and Germany allowed inventors to claim those new chemical products and processes as their intellectual property (IP) instead of judging them to be scientific discoveries, which were, by definition, unpatentable.

    Industry (1880–1940)

    At the turn of the twentieth century, the independent consulting scientist was replaced by the salaried researcher in new industrial laboratories. These labs represented the incorporation of applied science; that is, the creation of a separate place within the organization for ‘research and development’ — a phrase that entered the lexicon at this time.

    In Germany, the largest dye companies, such as Bayer, Hoechst and BASF, were the first to establish dedicated labs for chemical research. These were connected to production departments, also staffed by university-trained chemists, and to specialized legal departments, from which the new products and processes were submitted for patenting. This type of industrialized invention, with close connections between German academic chemistry and company labs, was firmly established before the First World War [3].


    In the United States, the prototype for the industrial research lab appeared in the electrical industry, when inventor Thomas Edison set up an ‘invention factory’ in Menlo Park, New Jersey, in 1876. Edison wanted to replace what had been an unpredictable act of creative genius with a regular and reliable system. He recruited machinists, mechanics, chemists, physicists and mathematicians to work on technical problems connected to telegraphy and electric lighting. Although their efforts were collaborative, only the ‘Wizard of Menlo Park’ (the singular inventor) was listed on more than 1,000 US patents, including those for the phonograph (1878) and electric light bulb (1880) [4].

    The looming expiration of that original light-bulb patent and the threat from other lighting companies impelled General Electric (GE), the corporation that took over Edison’s Electric Light Company and all his patents, to establish the aptly named Research Laboratory in 1900 in Schenectady, New York. This proved profitable within a decade — commercially, with the invention of a new light bulb that restored GE to its dominant market position, and professionally, with the recruitment of more than 250 engineers and scientists.

    A few other large US corporations followed suit and pioneered their own formal research and development (R&D) labs — DuPont (1903), Westinghouse Electric (1904), American Telephone and Telegraph (AT&T, 1909) and Eastman Kodak (1912).

    The First World War and the embargo on all German products, especially chemicals, catalysed the golden age of ‘industrial research’, a neologism of the 1920s. Between 1919 and 1936, US corporations established more than 1,100 labs in nearly all industries — petroleum, pharmaceuticals, cars, steel — thereby dominating the world’s industrial research. In 1921, these labs employed roughly 3,000 engineers and scientists; by 1940, there were more than 27,000 researchers. At the end of the Second World War, the figure was nearly 46,000 [5].

    This remarkable proliferation reflected the massive scale of vertically integrated corporations that controlled nearly all areas of their respective industries, from natural resources through R&D to mass production and mass marketing. Industrial research was also fuelled by radical changes in US patent law that allowed these behemoths to claim the IP of their employees. The inventor was now the corporation.

    During the Great Depression, critics singled out modern big business for its ruinous consequences to society — unemployment, overproduction and bankruptcy. Having research in thrall to industry raised the alarm, again, that capitalism corrupted science. So corporate captains and R&D directors marshalled the cornucopia of wondrous consumer products (‘technology’ in the new parlance) created by their science-based industries. In this story, science in industry was good; it guaranteed efficacy, efficiency and safety. In words that nineteenth-century consulting scientists would have understood, consumers could trust these modern technologies (and their corporations) because of the R&D.

    At the World’s Fair in New York City in 1939, industry paraded the fruits of its science. The Radio Corporation of America (RCA) introduced consumers to the television. International Business Machines (IBM) showed off its electric typewriter. GE exhibited its new electrical refrigeration system, and DuPont, under its banner “Better Things for Better Living through Chemistry”, showcased a synthetic fibre called nylon [6].

    Westinghouse's display at the 1939 World’s Fair in New York City, where US firms paraded the fruits of their industrial research. Credit: Bettmann

    Fears of corporate corruption of science were put to rest by awards of the Nobel prize. In 1931, two Germans, Carl Bosch and Friedrich Bergius, became the first industrial researchers to win in chemistry. The next year, GE’s Irving Langmuir won the chemistry prize, and in 1937, Clinton J. Davisson of Bell Telephone Laboratories (Bell Labs) won a share of the Nobel Prize in Physics.

    The largest research facility in the United States was Bell Labs, established in 1925 in New York City to consolidate the R&D operations of AT&T and Western Electric, its telephone-manufacturing arm. The labs had around 3,600 staff members and a budget in excess of US$12 million. (GE allocated less than $2 million to its Research Laboratory.) The first president of Bell Labs was the physicist Frank Jewett. In 1939, he became the first industrial scientist to be president of the NAS [7].

    In short, national standing and international acclaim seemed to confirm that science done under the auspices of industry was equal to science in universities or governments. Still, industrial labs of the 1920s and 1930s were not simply universities without students. As institutions of applied science, they always needed to show corporate headquarters their value in terms of profitable products and processes.

    Military (1940–80)

    By the time the New York World’s Fair closed in October 1940, Europe was already at war. The United States entered in December 1941, and the Second World War transformed the relationship between science and industry, along with the very terms — and even the history — of those relations.

    The prime mover in all those changes was the US military and the unprecedented amounts of money it allocated — through new forms of contracting and subcontracting — to scientific research. During the war, the Office of Scientific Research and Development, under its director Vannevar Bush, signed more than 2,300 research contracts, worth roughly $350 million, with more than 140 academic institutions and 320 companies. About two-thirds of that funding went to universities; the Massachusetts Institute of Technology (MIT) in Cambridge, for example, received more than $200 million for its Radiation Laboratory for research on radar. Corporate R&D also received unrivalled amounts: AT&T was allocated $16 million, GE $8 million and RCA, DuPont and Westinghouse between $5 million and $6 million each [8].

    But by far the most prodigious investments in R&D flowed from the War Department ($800 million) and the Navy Department ($400 million). The largest portion of that went to private industry ($800 million), much of it directed towards emergent industries with compelling national-security interests — for example, aerospace, electronics, computing and nuclear technology [8].

    The US military had not intended to become the commander-in-chief of US science, but by the end of the war it was apparent, at least to Bush, that the federal government needed a plan. In his 1945 report to US president Franklin D. Roosevelt, Science — The Endless Frontier, Bush presented a vision for US science policy that would guide and define both university science and corporate R&D throughout the cold war. The endless frontier was ‘basic’ research, the kind performed “without thought of practical ends”, a direct throwback to the nineteenth-century idea of pure science. The US military would fund this to boost industrial research because, the reasoning went, basic research was “the pacemaker of technological progress”.

    Here, then, was a new argument. As many commentators at the time and since have pointed out, it reflected neither the experience of the war years (during which multifunctional teams worked on military projects such as the atomic bomb and radar) nor that of the previous decades (in which multifunctional teams worked in R&D labs on corporate projects such as the light bulb). Science — The Endless Frontier thus propounded a different idea for developing new technologies, both military and commercial. In time, this became known as the linear model of innovation [9].

    The theory posits a conveyor belt, beginning with basic science and moving smoothly along to development, then to manufacturing and production, and culminating with technology or innovation. Increase the amount of basic science and the (alleged) result would be more technology, innovation and overall economic growth. Theoretically, basic research was to be centred in universities (and military funding did transform US universities and their science departments accordingly). But corporate R&D labs were also contracting with the military, as they had been during the war. With these military contracts, as well as enlarged funding from corporate headquarters (business leaders also bought into the linear model), industrial labs were redirected away from applied science and towards basic research [10].

    Such faith in endless scientific innovation combined with prodigious financial resources led to the creation of central corporate research labs. These functioned more or less independently, which nicely suited the new organizational structure of multinationals. In place of vertical integration, sprawling conglomerates adopted horizontal organizational structures comprising multiple divisions (the M-form organization), in which each division, including the central research lab, operated on its own.

    Leading research labs relocated to the countryside, far removed from headquarters and any connection to manufacturing. RCA Laboratories Division, for example, expanded its campus near Princeton, New Jersey, after 1945 and started work on colour TV and semiconductors. In 1956, Westinghouse built up its research labs in Churchill outside Pittsburgh, Pennsylvania, for nuclear research. IBM set up its Thomas J. Watson Research Center, designed by the modernist architect Eero Saarinen, in Yorktown Heights near New York City in 1961, to work on lasers, semiconductors and other computer-related physics. And Bell Labs moved its research headquarters to Murray Hill, New Jersey.

    At its height (before 2001), Bell Labs conducted world-class research in many fields (physics, mathematics, radio astronomy) at numerous sites. Its largest campus at Naperville near Chicago, Illinois, employed 11,000 people. The 191-hectare flagship campus at Holmdel, New Jersey, some 30 kilometres south of New York City, included a magnificent mirrored-glass building also designed by Saarinen in 1962.

    These ‘industrial Versailles’ did research without much development; they had indeed been converted into universities without students [11]. As industrial ivory towers, they hoovered up university faculty members and PhD scientists and engineers, promising them time and resources to pursue their own agendas, and offering them open publication policies that allowed their results to appear in the most prestigious journals. By the mid-1950s at RCA in Princeton, half of the staff were theoretical scientists and more than 75% of the contracts were with the military. DuPont, likewise, increased its scientific staff by 150% in the decade after the war, with the greatest growth in fundamental chemistry being at its Experimental Station near Wilmington, Delaware. By the early 1960s, the number of engineers and scientists employed in US industrial research topped 300,000 [12].

    John Bardeen, William Shockley and Walter Brattain invented the transistor at Bell Labs in 1947, and were awarded the 1956 Nobel Prize in Physics for their work. Credit: Science History Images/Alamy

    These leading corporate laboratories — Bell Labs, IBM, Westinghouse, DuPont, RCA (Princeton), Xerox Palo Alto Research Center (PARC, 1970) — became powerhouses of basic science. Between 1956 and 1987, 12 corporate scientists won Nobel prizes. Bell Labs alone has collected eight in physics and one in chemistry since the Second World War, including one for its most famous technology, the transistor, in 1956. In the early 1960s, corporate researchers authored 70% of papers appearing in Physics Abstracts. By 1980, Xerox PARC matched the world’s leading universities on citation impact [6,8].

    With its emphasis on basic science as the necessary prerequisite to any future technological progress, the linear model was a break with the past. It prompted a new interpretation of the historical relations of science and industry. In the 1950s and 1960s, economists, historians and other scholars began to re-examine the latter half of the nineteenth century, and claimed to have discovered a ‘Second Industrial Revolution’. Characterized by the chemical and electrical industries, this revolution involved replacing the old trial-and-error methods of invention used in the dirty industries of the ‘First Industrial Revolution’ (textile factories, coal mines and iron foundries) with science-based methods. In this revisionist history, glamorous synthetic dyes and bright electric bulbs sprang directly from the pure science of organic chemistry and electromagnetic physics. History thus seemed to provide definitive evidence for the necessity of continued funding of basic science, as well as a ready explanation for why US and Western European corporations had dominated the world’s economy for more than a century [13].

    It was not to last.

    Outsourcing (1980 onwards)

    Corporate investment in basic science had been sustained by dominant positions in international markets. AT&T, DuPont, IBM, Kodak and Xerox held more than 80% market shares in their respective core businesses. Then the oil shocks of the 1970s, combined with widespread stagflation (high inflation, slow growth), weakened the US and European economies. Global competition increased, especially from Japanese and South Korean firms. In the early 1980s, growing free trade squeezed profit margins even further.

    In response, US corporations began to restructure and downsize. Business leaders and shareholders decided that the multi-division conglomerate had become too unwieldy to compete. A new, leaner corporation was required. One way to restructure was outsourcing, replacing internal suppliers with external ones. Corporations began to relocate their manufacturing, once the backbone of the industrial economy, to plants in lower-cost and less-regulated countries. (The pace has only accelerated, especially after 2001, when China joined the World Trade Organization.)

    Bell Labs in the 1990s: a researcher testing data transmission through fibre-optic cable. Credit: Ovak Arslanian/The LIFE Images Collection via Getty

    Another way to downsize was divestiture, selling off subsidiaries unrelated to the core business. To shareholders seeking quick profits, long-term corporate research looked like a financial liability. The central laboratory became a prime target. In 1988, RCA sold off its Princeton lab as an independent business, Sarnoff Corporation. In 1993, IBM slashed $1 billion — roughly 20% — from its R&D budget. The German corporation Siemens bought Westinghouse’s Churchill laboratory in 1997, and in 2002, PARC, the former division of Xerox, became an independent company. In 1996, AT&T, following the break-up of its phone monopoly, spun off the vaunted Bell Labs as a separate company, Lucent Technologies (in 2016 this was taken over by Nokia, the Finnish telecommunications company). The Holmdel campus closed in 2007. Within a year, just four scientists remained at Murray Hill doing fundamental physics research. It was the end of an era [14].

    Accompanying globalized competitive markets, liberalized free trade and shareholder short-termism, the US military began to cut back funding for basic science at corporate labs. With the exception of a few years in the early 1980s (US president Ronald Reagan’s Strategic Defense Initiative, the ‘Star Wars’ programme), the US government steadily reallocated research funds to universities and other non-profit organizations, particularly towards medical schools and research hospitals through the National Institutes of Health (NIH). With continuous funding, new fields (molecular biology, biochemistry and biotechnology, for instance) surged past the diminished physical sciences. By 1988, only about 10% of basic research articles in physics were authored by industrial scientists; by 2005, the number had plummeted to less than 3% [15].

    The demise of the corporate research lab heralded the death of the linear-model idea. Many scholars concluded that it was too simplistic. The pathway from science to technology was neither straight nor singular, and perhaps not even one way (technological advances can also lead to scientific discoveries). For corporate executives, investment in basic science did not seem to pay off. DuPont discovered no new nylons; Kodak failed to produce a revolution in photography; RCA lost its edge in consumer electronics; IBM ignored the personal computer; and Xerox PARC let slip the graphical user interface.

    From the late 1960s onwards, small firms such as Intel, Microsoft, Apple, Sun Microsystems and Cisco Systems did commercialize the basic research being done at the larger corporations. Without establishing traditional research labs of their own, these players came to dominate the new information technology (IT) industry. In 1991, for example, when Microsoft created Microsoft Research — one of the largest industrial labs of its generation — its declared mission was not basic science, but innovation. In a more extreme case, Apple co-founder Steve Jobs shut down a fledgling research lab in 1998 in the belief that innovation would not require any investment in R&D.

    A device at the Google Quantum Computing lab in Goleta, California, in 2017. Researchers at Google AI are aiming to build quantum processors that speed up computational tasks in machine learning. Credit: Greg Kendall-Ball/Nature

    Until 2010 and the emergence of machine learning, artificial intelligence (AI) and the Internet of Things, most technology companies ignored basic research. In 2012, following Jobs’s death, Apple began investing in R&D again, particularly in AI. Likewise, Amazon, Google, Facebook and Uber began to recruit AI researchers from academia. This brain drain has become so serious that universities have begun to worry about their ability to train future AI researchers.

    Twenty-first-century corporations value science (patentable discoveries in particular) and still think that basic research can lead to invention and innovation. They would just prefer that someone else do it (and pay for it). In business terms, they optimize their ‘supply-chain management’, a phrase that gained currency in the 1990s, by replacing stable in-house labs (warehouses of scientists and engineers) with flexible contract research. Their ability to do so was greatly facilitated by the US government and the loosening of antitrust enforcement. The settlement of the monopoly case against Microsoft in 2001, for example, stands in stark contrast to the forced break-up of AT&T in 1984.

    Moreover, the US government now permitted innovative start-ups to acquire new technologies, patents and licences from other companies and independent non-profit organizations such as Sarnoff and PARC, and to engage in extensive collaborative research with institutes and universities. Microsoft Research, for instance, now has labs around the globe (New York City, Beijing, Bangalore) and on several university campuses (MIT, the University of California, Santa Barbara, and Cambridge, UK), which account for 20% of patents in AI worldwide. Google, by contrast, mostly underwrites academic research through grants, fellowships, internships and visiting positions.

    Universities have traditionally been the home of basic science. In the twenty-first century they have also become the source of innovation and entrepreneurship, in part because of sweeping changes in US patent law. In 1980, the US Supreme Court (in Diamond v. Chakrabarty) significantly expanded what could be patented to include new life forms. That same year, the US Congress passed the Bayh–Dole Act, permitting universities to patent the results of research funded by the NIH or other federal agencies and conducted on their campuses by faculty members, students and employees. Universities started filing for patents at an increasing rate — from 2,266 in 1996 to 5,990 in 2014. The university is now an inventor [16].

    The industry most prominently transformed by these legal and policy changes is biotechnology. In 1976, a university biochemist and a venture capitalist founded Genentech, the first biotech firm. Genentech focused, as did other biotech start-ups (Amgen in 1980 and Genzyme in 1981), on translating basic science done in universities and, subsequently, in-house into patents and other forms of profitable IP. They facilitated that linear movement from research to development. Further commercialization towards the manufacture and distribution of drugs and therapies was taken up by traditional big pharmaceutical corporations. Eli Lilly (founded in 1876), for example, guided Genentech’s first drug (synthetic human insulin) through clinical trials and brought it to market [17].

    The emergence of biotech represented both a new business plan (entrepreneurial scientists partnering with venture capitalists to sell their research) and a new model of innovation. Here, industry shifted from a single internal or closed source of research to multiple external or open sources [18]. In this model, academic entrepreneurs, commercialized universities, globalized contract-research institutes and numerous small research start-ups supply the science and the IP. Larger, more established firms then develop and commercialize these into new products and processes.

    According to some economists and business scholars, open innovation characterizes a ‘Third Industrial Revolution’ [19]. From their perspective, the university professor seeking to patent the results of federally funded research to form a start-up, with seed money from venture capitalists, is the direct descendant of the consulting chemist of the nineteenth century. In this ecosystem, a population of nimble researchers and small firms has displaced a pack of lumbering corporate labs [20]. To critics and less-sanguine academics, the twenty-first-century relations of science and industry illustrate the commodification of university research and the corruption of the pursuit of knowledge by the profit motive [21].

    Today, a complex innovation web has replaced the old conveyor belt. This is another new model — global commercialization. Supply-chain science is premised on the belief that research is a fungible commodity to be bought on demand and sold by the lowest-cost lab. In some ways, twenty-first-century contract research is reminiscent of nineteenth-century consulting science. In both cases, the question remains: is marketplace science trustworthy?


    Source: Can marketplace science be trusted?


