A Totalitarian Society Has Totalitarian Science

Source: NoMoreFakeNews.com | JonRappoport.wordpress.com
By: Jon Rappoport
August 23, 2017

Over the past 35 years, I’ve exposed at least as much fraudulent science as any reporter around. That’s just a fact.

I mention it, because one would expect I’ve learned a few lessons in the process.

And I have.

Government-backed science exists because it is a fine weapon for forcing an agenda of control on the population.

We aren’t talking about knowledge here. Knowledge is irrelevant. What counts is: ‘How can we fabricate something that looks like the truth?’

I keep pointing this out: we’re dealing with reality builders. In this case, they make their roads and fences and buildings out of data, and they massage and invent the data out of thin air to suit their purposes. After all, they also invent money out of thin air.

Since 1987, one of my goals as a reporter has been to educate the public about false science.

Between then and now, I have found that, with remarkably few exceptions, mainstream reporters are studiously indifferent to false science.

They shy away from it. They pretend “it couldn’t be.” They refuse to consider facts. They and their editors parrot “the experts.”

Official science has a stranglehold on major media. It has the force of a State religion. When you stop and think about it, official science is, in a significant sense, a holy church. Therefore, it is no surprise that the church’s spokespeople would wield power over major information outlets.

These prelates invent, guard, and dispense “what is known.” That was precisely the role of the Roman Church in times past. And those professionals within the modern Church of Science are severely punished when they leave the fold and accuse their former masters of lies and crimes. They are blackballed, discredited, and stripped of their licenses. At the very least.

Totalitarian science lets you know you’re living in a totalitarian society.

The government, the press, the mega-corporations, the prestigious foundations, the academic institutions, the “humanitarian” organizations say:

“This is the disease. This is its name. This is what causes it. This is the drug that treats it. This is the vaccine that prevents it.”

“This is how accurate diagnosis is done. These are the tests. These are the possible results and what they mean.”

“Here are the genes. This is what they do. This is how they can be changed and substituted and manipulated. These are the outcomes.”

“These are the data and the statistics. They are correct. There can be no argument about them.”

“This is life. These are the components of life. All change and improvement result from our management of the components.”

“This is the path. It is governed by truth which our science reveals. Walk the path. We will inform you when you stray. We will report new improvements.”

“This is the end. You can go no farther. You must give up the ghost. We will remember you.”

We are now witnessing the acceleration of Official Science. Of course, that term is an internal contradiction. But the State shrugs and moves forward.

The notion that the State can put its seal on favored science, enforce it, and punish its competitors, is anathema to a free society.

For example: declaring that psychiatrists can appear in court as expert witnesses, when none of the 300 so-called mental disorders listed in the psychiatric literature are diagnosed by laboratory tests.

For example: stating that vaccination is mandatory, in order to protect the vaccinated (who are supposed to be immune) from the unvaccinated. An absurdity on its face.

For example: announcing that the science of climate change is “settled,” when there are, in fact, huge numbers of researchers who disagree—and then drafting legislation and issuing executive orders based on the decidedly unsettled science.

For example: officially approving the release and sale of medical drugs (“safe and effective”) which go on to kill, at a conservative estimate, 100,000 Americans every year. And then refusing to investigate or punish the agents of these drug approvals (the FDA).

For example: permitting the widespread use of genetically modified food crops, based on no studies of their impact on human health. And then, arbitrarily announcing that the herbicide Roundup, for which many of these crops are specifically designed, is non-toxic.

For example: declaring and promoting the existence of various epidemics, when the viruses purportedly causing them are not proven to exist or not proven to cause human illness (SARS, West Nile, Swine Flu, etc.).

A few of you reading this have been with me since 1988, when I published my first book, AIDS INC., Scandal of the Century. Among other conclusions, I pointed out that HIV had never been shown to cause human illness; the front-line drug given to AIDS patients, AZT, was overwhelmingly toxic; and what was being called AIDS was actually a diverse number of immune-suppressing conditions.

Others of you have found my work more recently. I always return to the subject of false science, because it is the most powerful long-term instrument for repression, political control, and destruction of human life.

As I’ve stated on many occasions, medical science is ideal for mounting and launching covert ops aimed at populations—because it appears to be politically neutral, without any allegiance to State interests.

Unfortunately, medical science, on many fronts, has been hijacked and taken over. The profit motive is one objective, but beyond that, there is a more embracing goal:

Totalitarian control.

On the issue of vaccines, I’ve written much about their dangers and ineffectiveness. But also consider this: the push for mandatory vaccination goes a long way toward creating a herd effect—which is really a social construction.

In other words, parents are propagandized to think of themselves as a kind of artificial “community.”

“Here we are. We are the fathers and mothers. We must all protect our children against the outliers, the rebels, the defectors, the crazy ones who refuse to vaccinate their own children. We are all in this together. They are the threat. The enemy. We are good. We know the truth. They are evil.”

This “community of the willing” is dedicated to whatever the government tells it. Its members are crusaders imbued with group-think. They run around promoting “safety and protection.” This group consciousness is entirely an artifact, propelled by official science.

The crusaders are, in effect, agents of the State.

They are created by the State.

Androids.

They live in an absurd Twilight Zone where fear of germs (the tiny invisible terrorists) demands coercive action against the individuals who see through the whole illusion.

This is what official science can achieve. This is how it can enlist obedient foot soldiers and spies who don’t have the faintest idea about how they’re being used.

This is a variant on Orwell’s 1984. The citizens are owned by the all-embracing State, but they aren’t even aware of it.

That’s quite a trick.

One of my favorite examples of double-think or reverse-think is the antibody test. It is given to diagnose diseases. Antibodies are immune-system scouts sent out to identify germ-intruders, which can then be wiped out by other immune-system troops.

Prior to 1985, the prevailing view of a positive antibody test was: the patient is doing well; his body detected the germ and dispensed with it. After 1985, the view was suddenly: this is bad news; the patient is sick or he is on the verge of getting sick; he has the germ in his body; it does harm.

Within the medical community, no one (with very few exceptions) raised hell over this massive switch. It was accepted. It was actually good for business. Now, many more people could be labeled “needs treatment,” whereas before, they would have been labeled “healthy.”

While I was writing my first book, AIDS INC., in 1987-8, I wrote the FDA asking about a possible AIDS vaccine. I was told the following: every person given such a vaccine would, of course, produce antibodies against HIV. That is the whole purpose of a vaccine: to produce antibodies.

However, I was informed, patients receiving this vaccine would be given a letter to carry with them, in case they were ever tested for HIV and came up positive. The letter would explain that the antibodies causing the positive test were the result of the vaccine, not the result of “natural” action inside the patient’s body.

In other words, the very same antibodies were either protective against AIDS (good) or indicative of deadly disease (bad).

This was the contradictory and ridiculous and extraordinary pronouncement of official science.

It carries over into every disease for which an antibody test is administered. If a vaccine against disease X is given, it delivers immunity, because it produces antibodies. But if a diagnostic test for disease X reveals the presence of the same antibodies, naturally produced in the body, this is taken as a sign of illness.

Extrapolated to a more general level, the Word is: synthetic medical treatment is good; the action of the body to heal itself is incompetent.

This is a type of superstition that would astonish even the most “primitive” societies.

It no longer astonishes me. I see it everywhere in official science.

From the medical establishment’s point of view, being alive is a medical condition.

The most useful politicians—as far as official science is concerned—are those who automatically promote its findings. Such politicians are lifted into prominence. They are champions of the Science Matrix. They never ask questions. They never doubt. They never make waves. They blithely travel their merry way into new positions of power, knowing they have enormous elite support behind them. When they need to lie, they lie. They are taught that those who question or reject official science are a tiny ‘demographic’ who can be ignored during election campaigns. ‘Don’t worry about them. They don’t count.’ These politicians are never in the trenches with the people on issues of health.

The elite Plan is universal collectivism, in which all citizens are atoms of a giant molecule. Many lies need to be told in order to make that dream/nightmare come true. If some of those lies are about science, so much the better. People believe in science.

Think about the agendas behind universal vaccination, climate change, universal psychiatric treatment, GMO food, and other ‘science-based’ frauds. They all imply a model, in which individuals give up their power in exchange for ‘doing good’ and becoming members of the largest group in the world: ‘disabled’ people with needs that must be addressed and satisfied.

Instead of supporting the liberation of the individual, the controllers want to squash it. Why? Because they fear individual power. It is forever the unpredictable wild card. They want a society in which every thought an individual thinks connects him to a greater whole—and if that sounds attractive, understand that this Whole is a fiction, intentionally faked to resemble a genuine oceanic feeling. The elite Whole is ultimately a trance-like fiction that will slow down time to a crawl, and shrink space to a sliver, and focus attention on a single mandate: wait for the next instruction from above, content in the knowledge that it will benefit all of humanity.

This program has many agents.

Some of them are agents of official science.

Read More At: JonRappoport.wordpress.com
_______________________________________________________________

Jon Rappoport

The author of three explosive collections, THE MATRIX REVEALED, EXIT FROM THE MATRIX, and POWER OUTSIDE THE MATRIX, Jon was a candidate for a US Congressional seat in the 29th District of California. He maintains a consulting practice for private clients, the purpose of which is the expansion of personal creative power. Nominated for a Pulitzer Prize, he has worked as an investigative reporter for 30 years, writing articles on politics, medicine, and health for CBS Healthwatch, LA Weekly, Spin Magazine, Stern, and other newspapers and magazines in the US and Europe. Jon has delivered lectures and seminars on global politics, health, logic, and creative power to audiences around the world.

The FBI Evidence Lab, A Cesspool

Source: NoMoreFakeNews.com | JonRappoport.wordpress.com
By: Jon Rappoport
August 22, 2017

In these pages, I’ve emphasized that mainstream news outlets often fail to follow up on their own stories.

They publish a shocking account of a scandal, and then they drop it like a hot potato.

Why? There are several reasons, but the most important is: the scandal is too revealing. It indicts an institution or organization that, in the long run, must be protected.

In 2014-15, stories appeared in the press about the phenomenal corruption of the FBI evidence lab. But since then, there has been very little follow-up. I find no compelling evidence that the federal government has fixed the problem.

Here is a sample of the 2014-15 stories:

April 20, 2015, The Atlantic: “…the Washington Post made clear Saturday in an article that begins with a punch to the gut… ‘Nearly every examiner in an elite FBI forensic unit gave flawed testimony in almost all trials in which they offered evidence against criminal defendants over more than a two-decade period before 2000,’ the newspaper reported, adding that ‘the cases include those of 32 defendants sentenced to death’.”

August 12, 2014, New Scientist: “…the initial results were released of an ongoing review of thousands of criminal cases in which FBI scientists’ testimony may have led to wrongful convictions – including for some people now on death row…[an FBI source states] ’we teach these people [lab techs in training] for two weeks, and they would go back to their laboratories with a certificate of completion and be told: Great you’re qualified to do this [analysis of evidence] – here’s your caseload.’”

Washington Post, April 18, 2015: “The Justice Department and FBI have formally acknowledged that nearly every examiner in an elite FBI forensic unit gave flawed testimony in almost all trials in which they offered evidence against criminal defendants over more than a two-decade period before 2000.”

“Of 28 examiners with the FBI Laboratory’s microscopic hair comparison unit, 26 overstated forensic matches in ways that favored prosecutors in more than 95 percent of the 268 trials reviewed so far, according to the National Association of Criminal Defense Lawyers (NACDL) and the Innocence Project, which are assisting the government with the country’s largest post-conviction review of questioned forensic evidence.”

“The cases include those of 32 defendants sentenced to death. Of those, 14 have been executed or died in prison, the groups said under an agreement with the government to release results after the review of the first 200 convictions.”

Giant long-term scandal and corruption. The story is covered. Then it disappears.

Here is one reason why. If the press outlets continued to search out every aspect of the story, they would come upon numerous prosecutors who routinely relied on false FBI lab reports in trials. Some of those prosecutors would be exposed for knowingly accepting fake evidence from the FBI, in order to make their cases.

The scandal would spread like ink on a blotter.

Major media news outlets pick their spots. They choose to pound on certain stories day after day, in an effort to convince the public of certain “facts.” They studiously refuse to dig and keep digging on other stories, hoping the public will forget.

Remember this, forget that.

Journalism schools don’t teach their students that this is the way to do news. After graduating and finding jobs, young reporters catch on.

They catch on and go along.

That’s how their ideals crumble and disintegrate.

That’s how they become agents and blunt weapons for their bosses.

That’s how they become alcoholics and denizens traveling through a dim underworld of lies.

Read More At: JonRappoport.wordpress.com
_______________________________________________________________


Fake Evidence Used In Oklahoma Bombing

How official “science” is deployed to advance a political agenda

Source: NoMoreFakeNews.com | JonRappoport.wordpress.com
By: Jon Rappoport
August 22, 2017

The public wants to buy every official scientific claim the mainstream press pounds into their brains—whether the issue is vaccine safety, global warming, the “overwhelming success” of medical drugs, the Big Bang theory of the universe’s origin…

The notion that a political agenda underlies such scientific pronouncements is unthinkable.

So as an example, a very specific example of fake science, let’s look back at the attack on Oklahoma City.

On April 19, 1995, one-third of the Murrah Federal Building in Oklahoma City blew up, killing 168 people and wounding 680 others.

Three men were arrested and convicted: Tim McVeigh, Terry Nichols, and Michael Fortier. McVeigh was put to death on June 11, 2001; Nichols is currently serving multiple life sentences without the possibility of parole; and Fortier was sentenced to 12 years (he served that term and was released).

The official narrative of the bombing stated: A Ryder truck parked at the curb outside the Murrah Building contained barrels of ammonium nitrate plus fuel oil (ANFO bombs), and their coordinated explosion occurred shortly after 9AM on the morning of April 19th.

In addition to the deaths and the woundings, the explosion impacted 324 buildings and 86 cars in the area.

(In my 1995 book, “The Oklahoma City Bombing, the Suppressed Truth,” I laid to rest the claim that ANFO bombs could have caused that much damage; and more importantly, I showed that an explosion coming out of a Ryder truck at the curb could not have caused the particular profile of damage sustained by the Murrah Building.)

The vaunted FBI lab decided that, indeed, all the damage and death HAD been caused by ANFO bombs in the Ryder truck.

But wait.

Buckle up.

Two years after the bombing, on March 22, 1997, we had this from CNN: “The Justice Department inspector general’s office has determined that the FBI crime laboratory working on the Oklahoma City bombing case made ‘scientifically unsound’ conclusions that were ‘biased in favor of the prosecution,’ The Los Angeles Times reported Saturday.”

“…[FBI] supervisors approved lab reports that they ‘cannot support’ and…FBI lab officials may have erred about the size of the blast, the amount of explosives involved and the type of explosives used in the bombing[!].”

“…harshest criticism was of David Williams, a supervisory agent in the [FBI] explosives unit, the paper [LA Times] said. Those flaws reportedly include the basis of his determination that the main charge of the explosion was ammonium nitrate. The inspector general called such a determination ‘inappropriate,’ the Times said.”

“…FBI officials found a receipt for ammonium nitrate at defendant [Terry] Nichols’ home and, because of that discovery, Williams slanted his conclusion to match the evidence.”

And with those revelations, the case, the investigation, the court trials, and press probes should have taken a whole new direction. But they didn’t.

The fake science was allowed to stand.

Therefore, other paths of investigation were abandoned. If bombs did, in fact, explode in the Ryder truck, but didn’t cause the major damage, then those bombs were a cover for other explosions of separate origin—for example, charges wired inside the columns of the Murrah Building, triggered at the exact moment the Ryder Truck explosion occurred.

Now we would be talking about a very sophisticated operation, far beyond the technical skills of McVeigh, Nichols, and Fortier.

Who knows where an honest in-depth investigation would have led? The whole idea of anti-government militia terrorism in the OKC attack—symbolized by McVeigh—was used by President Bill Clinton to bring the frightened public “back to the federal government” as their ultimate protector and savior.

Instead, the public might have been treated to a true story about a false flag operation, in which case President Clinton’s massaged message would never have been delivered.

But the fake crooked science pushed by the FBI lab was permitted to stand—despite exposure as fraud—and the story of militia terrorists trying to bring down the federal government was allowed to stand as well.

The year 1995 was rife with anti-government sentiment in America. This wasn’t merely a bunch of militias talking about insurrection. This was widespread dissatisfaction, on the part of many Americans, who were seeing federal power expand beyond any semblance of constitutionality.

As an object lesson, the Oklahoma Bombing was: “You see what happens when crazy people are allowed to own guns and oppose the government? Stop listening to anti-government rhetoric. It’s horribly dangerous. We, the government, are here to protect you. Come home to us. Have faith in us. We’ll punish the offenders. We’ll make America safe again. Let’s all come together and oppose these maniacs who want to destroy our way of life…”

The lesson worked.

Many scared Americans signed on to Clinton’s agenda.

And fake FBI science was used to bolster that agenda.

Even when exposed as fake by mainstream press outlets—however briefly, with no determined follow-up—the federal brainwashing held. The myth was stronger than reality.

If the federal government can egregiously lie about an event as huge as the Oklahoma Bombing, using fake science as a cover—what wouldn’t they lie about?

That’s a question which answers itself.

Read More At: JonRappoport.wordpress.com
_______________________________________________________________


The 11-Dimensional Multi-Verse Of The Brain

Source: GizaDeathStar.com
Dr. Joseph P. Farrell Ph.D.
June 21, 2017

A couple of days ago I blogged about the discovery of “memory-wiping” enzymes and its implications for the topic of mind control. In that blog, I also made the connection between the mind and the universe, particularly the version of quantum theory called the “multiverse” hypothesis. I’ve long sensed that there is a connection between the mind and matter, and that this connection is not of the tidy Cartesian variety, where the one (take your choice) gives rise to the other. I suspect, and have suspected for some time, that the situation is rather that of a complex set of feedback loops between the two, and that that complexity can only be described by something “not physical” in the ordinary, three dimensional sense of our everyday experience.

Thus, when Mr. M.H. shared this article this week, I took notice:

Brain Architecture: Scientists Discover 11 Dimensional Structures That Could Help Us Understand How the Brain Works

The following paragraphs leapt out at me:

Scientists studying the brain have discovered that the organ operates on up to 11 different dimensions, creating multiverse-like structures that are “a world we had never imagined.”

By using an advanced mathematical system, researchers were able to uncover architectural structures that appear when the brain has to process information, before they disintegrate into nothing.

In the latest study, researchers honed in on the neural network structures within the brain using algebraic topology—a system used to describe networks with constantly changing spaces and structures. This is the first time this branch of math has been applied to neuroscience.

“Algebraic topology is like a telescope and microscope at the same time. It can zoom into networks to find hidden structures—the trees in the forest—and see the empty spaces—the clearings—all at the same time,” study author Kathryn Hess said in a statement.

In the study, researchers carried out multiple tests on virtual brain tissue to find brain structures that would never appear just by chance.

“We found a world that we had never imagined. There are tens of millions of these objects even in a small speck of the brain, up through seven dimensions. In some networks, we even found structures with up to eleven dimensions.”

The findings indicate the brain processes stimuli by creating these complex cliques and cavities, so the next step will be to find out whether or not our ability to perform complicated tasks requires the creation of these multi-dimensional structures.

Hess says the findings suggest that when we examine brain activity with low-dimensional representations, we only get a shadow of the real activity taking place. This means we can see some information, but not the full picture. “So, in a sense our discoveries may explain why it has been so hard to understand the relation between brain structure and function,” she explains.

Talk about high octane! Let that sink in for a moment: at every moment you are thinking, multi-dimensional structures arise in your very three dimensional brain, and that’s a fancy way of saying your brain is not closed within or upon itself, but is rather an open system interacting with much higher dimensional realities that cannot be encompassed in the material 3-d world. And this is why a merely three-dimensional model, or, if I may be more blunt, a merely materialistic model of the mind-brain relationship, has failed to grasp the complexity, the hyper-dimensional complexity, of what is actually going on. Indeed, higher order topologies are necessary to describe thought at all: thought does not occur in the three dimensional material stuff of life solely or exclusively, but outside it, as something coupled with it. (Regular readers of my books will recognize this as what I’ve been calling the Topological Metaphor of the Medium, and its analogical basis.) For those who’ve read my books Secrets of the Unified Field or The Third Way, the name of Gabriel Kron should also spring to mind, with his theory that all electrical circuits, no matter how simple they are, are in effect hyper-dimensional machines, transducing something “down here” from “up there”.
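
For readers who want the “cliques and cavities” in the quoted passage made concrete: a clique is simply a set of nodes that are all pairwise connected, and the topologists treat a clique of k neurons as a (k-1)-dimensional simplex, which is the sense in which a network can contain structures “up through seven dimensions.” Here is a minimal sketch of that counting step in Python with the networkx library, on a toy random graph; the toy graph and the library are my illustration, not the study’s actual code, which worked on directed cliques in simulated cortical circuits:

    import networkx as nx
    from collections import Counter

    # Toy stand-in for a neural connectivity graph: 60 nodes, 25% wiring chance.
    G = nx.erdos_renyi_graph(60, 0.25, seed=1)

    # A clique of k mutually connected nodes counts as a (k-1)-dimensional simplex.
    dims = Counter(len(clique) - 1 for clique in nx.enumerate_all_cliques(G))

    for dim in sorted(dims):
        print(f"dimension {dim}: {dims[dim]} simplices")

Even this small toy graph typically yields over a thousand simplices across dimensions 0 through 4 or so, which gives some feel for the “tens of millions of these objects” the researchers report in a small speck of simulated brain tissue.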

What is interesting in this article is also the implication of the object or stimulus of brain activity: for consider what that object is, in physics terms. Even at the atomic or, better, sub-atomic quantum level, these “material” entities dissolve – if I may use that term – into packets of information modeled by multi-dimensional mathematical equations. In other words, multi-dimensionality is the bridge of perception because the multi-dimensionality is at the root of the objects themselves.

What’s coming down the pike? Well, I’ve speculated at length about this idea in our numerous members’ vidchats (along with some pretty stimulating speculations from members themselves): the next step is to find the exact nature and structure of those “feedback” loops between the “material” world and the “incorporeal” one: think “quantum neurology” and “neuro-cosmology” for a moment, and you get an intuitive approximation of how the old, tidy, Cartesian dualistic lines are breaking down. We are, I rather suspect, looking at something more akin to the old Neoplatonic spectrum of “fine gradations” from the immaterial world of forms to the increasingly “dense” world of matter.

Funny thing, too, to remember that Plato referred to all of this as “the mathematicals”. Funny thing, too, that in membrane theory, space-time is in 11 dimensions.

See you on the flip side…

Read More At: GizaDeathStar.com
________________________________________________

About Dr. Joseph P. Farrell

Joseph P. Farrell has a doctorate in patristics from the University of Oxford, and pursues research in physics, alternative history and science, and “strange stuff”. His book The Giza DeathStar, for which the Giza Community is named, was published in the spring of 2002, and was his first venture into “alternative history and science”.

40% of Scientists Admit Fraud “Always or Often” Contributes to Irreproducible Research

Source: Nature.com
Monya Baker
May 25, 2016

More than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments. Those are some of the telling figures that emerged from Nature’s survey of 1,576 researchers who took a brief online questionnaire on reproducibility in research.

The data reveal sometimes-contradictory attitudes towards reproducibility. Although 52% of those surveyed agree that there is a significant ‘crisis’ of reproducibility, less than 31% think that failure to reproduce published results means that the result is probably wrong, and most say that they still trust the published literature.

Data on how much of the scientific literature is reproducible are rare and generally bleak. The best-known analyses, from psychology and cancer biology, found rates of around 40% and 10%, respectively. Our survey respondents were more optimistic: 73% said that they think that at least half of the papers in their field can be trusted, with physicists and chemists generally showing the most confidence.

The results capture a confusing snapshot of attitudes around these issues, says Arturo Casadevall, a microbiologist at the Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland. “At the current time there is no consensus on what reproducibility is or should be.” But just recognizing that is a step forward, he says. “The next step may be identifying what is the problem and to get a consensus.”

Failing to reproduce results is a rite of passage, says Marcus Munafo, a biological psychologist at the University of Bristol, UK, who has a long-standing interest in scientific reproducibility. When he was a student, he says, “I tried to replicate what looked simple from the literature, and wasn’t able to. Then I had a crisis of confidence, and then I learned that my experience wasn’t uncommon.”

The challenge is not to eliminate problems with reproducibility in published work. Being at the cutting edge of science means that sometimes results will not be robust, says Munafo. “We want to be discovering new things but not generating too many false leads.”

The scale of reproducibility

But sorting discoveries from false leads can be discomfiting. Although the vast majority of researchers in our survey had failed to reproduce an experiment, less than 20% of respondents said that they had ever been contacted by another researcher unable to reproduce their work. Our results are strikingly similar to another online survey of nearly 900 members of the American Society for Cell Biology (see go.nature.com/kbzs2b). That may be because such conversations are difficult. If experimenters reach out to the original researchers for help, they risk appearing incompetent or accusatory, or revealing too much about their own projects.

A minority of respondents reported ever having tried to publish a replication study. When work does not reproduce, researchers often assume there is a perfectly valid (and probably boring) reason. What’s more, incentives to publish positive replications are low and journals can be reluctant to publish negative findings. In fact, several respondents who had published a failed replication said that editors and reviewers demanded that they play down comparisons with the original study.

Nevertheless, 24% said that they had been able to publish a successful replication and 13% had published a failed replication. Acceptance was more common than persistent rejection: only 12% reported being unable to publish successful attempts to reproduce others’ work; 10% reported being unable to publish unsuccessful attempts.

Survey respondent Abraham Al-Ahmad at the Texas Tech University Health Sciences Center in Amarillo expected a “cold and dry rejection” when he submitted a manuscript explaining why a stem-cell technique had stopped working in his hands. He was pleasantly surprised when the paper was accepted. The reason, he thinks, is that it offered a workaround for the problem.

Others put the ability to publish replication attempts down to a combination of luck, persistence and editors’ inclinations. Survey respondent Michael Adams, a drug-development consultant, says that work showing severe flaws in an animal model of diabetes has been rejected six times, in part because it does not reveal a new drug target. By contrast, he says, work refuting the efficacy of a compound to treat Chagas disease was quickly accepted.

The corrective measures

One-third of respondents said that their labs had taken concrete steps to improve reproducibility within the past five years. Rates ranged from a high of 41% in medicine to a low of 24% in physics and engineering. Free-text responses suggested that redoing the work or asking someone else within a lab to repeat the work is the most common practice. Also common are efforts to beef up the documentation and standardization of experimental methods.

Any of these can be a major undertaking. A biochemistry graduate student in the United Kingdom, who asked not to be named, says that efforts to reproduce work for her lab’s projects double the time and materials used — in addition to the time taken to troubleshoot when some things invariably don’t work. Although replication does boost confidence in results, she says, the costs mean that she performs checks only for innovative projects or unexpected results.

Consolidating methods is a project unto itself, says Laura Shankman, a postdoc studying smooth muscle cells at the University of Virginia, Charlottesville. After several postdocs and graduate students left her lab within a short time, remaining members had trouble getting consistent results in their experiments. The lab decided to take some time off from new questions to repeat published work, and this revealed that lab protocols had gradually diverged. She thinks that the lab saved money overall by getting synchronized instead of troubleshooting failed experiments piecemeal, but that it was a long-term investment.

Irakli Loladze, a mathematical biologist at Bryan College of Health Sciences in Lincoln, Nebraska, estimates that efforts to ensure reproducibility can increase the time spent on a project by 30%, even for his theoretical work. He checks that all steps from raw data to the final figure can be retraced. But those tasks quickly become just part of the job. “Reproducibility is like brushing your teeth,” he says. “It is good for you, but it takes time and effort. Once you learn it, it becomes a habit.”
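
Loladze’s practice of retracing “all steps from raw data to the final figure” usually amounts to one scripted, deterministic path with no hand edits along the way. A minimal sketch of the idea in Python (the seed, the cleaning rule, and the output file name are invented for illustration):

    import numpy as np
    import matplotlib
    matplotlib.use("Agg")  # render straight to a file, no display needed
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(42)  # fixed seed: every rerun gives identical output

    # Stand-in for loading a raw data file (in practice, e.g., np.loadtxt on the raw CSV).
    raw = rng.normal(loc=1.0, scale=2.0, size=500)

    # Every cleaning rule lives in the script, never in a hand-edited spreadsheet.
    cleaned = raw[np.abs(raw - raw.mean()) < 3 * raw.std()]

    plt.hist(cleaned, bins=30)
    plt.title(f"n = {cleaned.size}, mean = {cleaned.mean():.3f}")
    plt.savefig("figure1.png")  # the final figure is rebuilt from raw data, not pasted in

Because nothing depends on interactive state, anyone holding the raw file and the script can retrace the entire chain from data to figure.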

One of the best-publicized approaches to boosting reproducibility is pre-registration, where scientists submit hypotheses and plans for data analysis to a third party before performing experiments, to prevent cherry-picking statistically significant results later. Fewer than a dozen people mentioned this strategy. One who did was Hanne Watkins, a graduate student studying moral decision-making at the University of Melbourne in Australia. Going back to her original questions after collecting data, she says, kept her from going down a rabbit hole. And the process, although time consuming, was no more arduous than getting ethical approval or formatting survey questions. “If it’s built in right from the start,” she says, “it’s just part of the routine of doing a study.”

The cause

The survey asked scientists what led to problems in reproducibility. More than 60% of respondents said that each of two factors — pressure to publish and selective reporting — always or often contributed. More than half pointed to insufficient replication in the lab, poor oversight or low statistical power. A smaller proportion pointed to obstacles such as variability in reagents or the use of specialized techniques that are difficult to repeat.
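
To make “low statistical power” concrete: power is the probability that a study of a given size will detect an effect that is genuinely there. A quick sketch using Python’s statsmodels (my illustration, not part of the survey) shows why small studies tend not to reproduce:

    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Chance of detecting a medium effect (d = 0.5) with 20 subjects per group:
    power = analysis.power(effect_size=0.5, nobs1=20, alpha=0.05)
    print(f"power with n = 20 per group: {power:.2f}")  # roughly 0.34

    # Subjects per group needed to reach the conventional 80% power:
    n = analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05)
    print(f"n per group for 80% power: {n:.0f}")  # roughly 64

A small study that does report an effect has, in a sense, gotten lucky, and the same luck is unlikely on a second run, which is exactly the failure mode the respondents point to.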

But all these factors are exacerbated by common forces, says Judith Kimble, a developmental biologist at the University of Wisconsin–Madison: competition for grants and positions, and a growing burden of bureaucracy that takes away from time spent doing and designing research. “Everyone is stretched thinner these days,” she says. And the cost extends beyond any particular research project. If graduate students train in labs where senior members have little time for their juniors, they may go on to establish their own labs without having a model of how training and mentoring should work. “They will go off and make it worse,” Kimble says.

What can be done?

Respondents were asked to rate 11 different approaches to improving reproducibility in science, and all got ringing endorsements. Nearly 90% — more than 1,000 people — ticked “More robust experimental design,” “better statistics,” and “better mentorship.” Those ranked higher than the option of providing incentives (such as funding or credit towards tenure) for reproducibility-enhancing practices. But even the lowest-ranked item — journal checklists — won a whopping 69% endorsement.

The survey — which was e-mailed to Nature readers and advertised on affiliated websites and social-media outlets as being ‘about reproducibility’ — probably selected for respondents who are more receptive to and aware of concerns about reproducibility. Nevertheless, the results suggest that journals, funders and research institutions that advance policies to address the issue would probably find cooperation, says John Ioannidis, who studies scientific robustness at Stanford University in California. “People would probably welcome such initiatives.” About 80% of respondents thought that funders and publishers should do more to improve reproducibility.

“It’s healthy that people are aware of the issues and open to a range of straightforward ways to improve them,” says Munafo. And given that these ideas are being widely discussed, even in mainstream media, tackling the initiative now may be crucial. “If we don’t act on this, then the moment will pass, and people will get tired of being told that they need to do something.”

Read More At: Nature.com

Memory Erasing 101: Ewen Cameron’s Dream Come True: Memory Wipe Enzyme

Source: GizaDeathStar.com
Dr. Joseph P. Farrell Ph.D.
June 18, 2017

Just in case you read the title of this blog, and don’t know who Ewen Cameron was, a little history is in order: Cameron was a psychologist/psychiatrist involved in the CIA’s infamous mind-control program, MK-Ultra. Cameron had his “laboratory” in a psychiatric hospital in Canada, where he subjected his victims (I won’t use the word “patients” here, because what Cameron did is in my opinion unspeakable) to a regime of drug cocktails, continual sleep (not for hours, but for days at a time), and endless bombardment from tape recorders playing back recorded messages for hours on end. He called all this “psychic driving,” and his goal was to eliminate “bad personality habits” (or even the personality itself) and to replace them with “something else”, that something else being the endlessly repeated recorded messages. If his “procedures” (and we’ve only very briefly summarized them) sound a little like those of the Nazi doctors in World War Two, then you understand the measure of the torture he was inflicting.

But imagine a magic drug that could do the same thing, without the endless weeks of sleep, tape-recorded looped messages, and cocktails. Indeed, if one digs a little bit into the history of the CIA’s various mind-control programs – Projects Bluebird, Artichoke, or MK-Ultra – one of the things being investigated was precisely the use of drugs for memory and behavior alteration.

Which brings us to this rather frightening article that was shared by Mr. C.S. this week:

MEMORY HOLE: U.S. scientists have developed a “memory wipe” enzyme that can erase memories forever

Assuming the article to be true, the implications of the following would fulfill Dr. Cameron’s wildest fantasies of “psychic driving” and memory wipes:

Scientists have long known that creating new memories and storing old ones involve the creation of proteins in the synapse, where two brain cells meet. For this process to be successful, genes must be expressed in the nucleus of the cell, and this is where a key enzyme can turn genes on or off as new memories are formed. It’s also believed that this enzyme, which is known as ACSS2, plays a role in the memory impairment that is seen in neurodegenerative disorders.

In the study, the researchers found that lowering ACSS2 levels in mice reduced the expression of memory genes, thereby stopping the formation of long-term memories. Mice who had reduced enzyme levels showed no interest in a ball they saw the previous day, whereas those with normal levels of the enzyme were interested in the ball.

Now the researchers are hoping to use this knowledge to stop traumatic memories from forming in people with PTSD simply by blocking the brain’s ACSS2. This might sound like a good idea to those of us who are haunted by some sort of trauma, but there’s also the potential for this to be used for more sinister reasons.

As the article goes on to point out, what’s to prevent the “intelligence” agencies of the modern police state from using this capability to erase memories in individuals (or, for that matter, whole populations) it finds “inconvenient”, or from planting completely false memories? In these scenarios, Cameron’s goal of completely wiping one personality and replacing it with another comes close to reality, without the corresponding torture he inflicted.

Which brings us to the high octane speculation of the day: why investigate such things at all? As the article avers, some beneficial uses could be had, but I strongly suspect that all those assurances we were given decades ago from our intelligence agencies during the Church Committee were just that: assurances, nothing more, and that the covert funding and investigation of mind control techniques and technologies continued. With their track record of having given LSD to unsuspecting victims to study their responses – all under the aegis of the mind control programs – one can see where this is going, for in a world where chemicals are sprayed over whole populations without their knowledge (in many cases) and without their consent (in most), it takes little imagination to see that a study of “whole population effects” could be had with the appropriate spraying, or by slipping a little “mind wipe enzyme” into a town’s water supply, and watching and studying the results. Indeed, in 1984 (note the year) American actor Tim Matheson co-starred with Hume Cronyn in the movie Impulse, which was about precisely such a scenario. Add a false news story or two and one has a frightening scenario where whole populations might be induced to “remember” something that didn’t actually happen, or to forget something that did.

I am reminded of the late 1960s and 1970s, when various gurus of the “drug culture” actually viewed psychedelics as a means of accessing “alternate worlds,” and in a universe where one has quantum physicists emphasizing the role of the observer in the creation of reality, and where they are talking about “multiple worlds” hypotheses, such an experiment might even have cosmological implications.

I don’t know about you, but I for one put nothing past them.

See you on the flip side…

Read More At: GizaDeathStar.com
________________________________________________


Immortality & Resurrection Inc.


Source: GizaDeathStar.com
Dr. Joseph P. Farrell Ph.D.
June 13, 2017

Just when you thought the aspirations and plans of modern science couldn’t possibly become more diabolical (or, if one prefers, sacrilegious), an article comes along to renew your hope that the world continues on its path of normalcy, and that many scientists are, indeed, just as wild-eyed-nuts as you always thought them to be. And this week, apparently many people were relieved and reassured that the mad scientist is not a thing of the past or a species that died out, but a real, living creature deserving of our awe and respect. Ms. M.W. and many others found this, and shared it, doubtless because they were concerned that I was losing hope, fearing there were no more mad scientists:

Could we soon REVERSE death? US company to start trials ‘reawakening the dead’ in Latin America ‘in a few months’ – and this is how they’ll do it

Way back when I first started writing about these strange topics in The Giza Death Star, I made the observation that physical immortality might not be such a good thing, without a commensurate and corresponding improvement in human spirituality and morality. In this, I took my cue from an ancient Greek Church Father named St. John Chrysostom, who warned about the same thing, and who stated that it was death, in fact, that formed the crucial condition for the possibility of human repentance and a change of mind, for it cut off further progress in evil. Taking this as my cue, in the final pages of that book, I asked people to imagine, if such immortality, or even a dramatically extended life span, were possible – both of which are now being openly discussed and touted in serious and not-so-serious literature – what it might mean for the resulting civilization. One thing that would result, I pointed out, was a vastly expanded and accelerated scientific and technological development. One individual would, in such a condition, be able to learn and to master several academic disciplines, not just one. The explosion of technology and science would dwarf anything we have seen thus far. But the other consequence would be for moral progress. Imagine, I said back then, an Albert Schweitzer having not a century, but centuries or even millennia to do good things, or, conversely, a Mao Tse-Tung, a Josif Stalin, a Pol Pot or an Adolf Hitler, having that long to “perfect their progress in evil,” and one gets a clear picture of the sharp moral contradictions such a society would be in. And please note: this problem is not, to my knowledge, receiving anything close to the attention it needs in the transhumanism-virtual immortality community. The sole focus is on the science; if we can do it, we should do it.

Now we have this:

Bioquark, a Philadelphia-based company, announced in late 2016 that they believe brain death is not ‘irreversible’.

And now, CEO Ira Pastor has revealed they will soon be testing an unprecedented stem cell method on patients in an unidentified country in Latin America, confirming the details in the next few months.

To be declared officially dead in the majority of countries, you have to experience complete and irreversible loss of brain function, or ‘brain death’.

According to Pastor, Bioquark has developed a series of injections that can reboot the brain – and they plan to try it out on humans this year.

They have no plans to test on animals first.

The first stage, named ‘First In Human Neuro-Regeneration & Neuro-Reanimation’ was slated to be a non-randomized, single group ‘proof of concept’ study.

The team said they planned to examine individuals aged 15-65 declared brain dead from a traumatic brain injury using MRI scans, in order to look for possible signs of brain death reversal.

Specifically, they planned to break it down into three stages.

First, they would harvest stem cells from the patient’s own blood, and inject this back into their body.

Next, the patient would receive a dose of peptides injected into their spinal cord.

Finally, they would undergo a 15-day course of nerve stimulation involving lasers and median nerve stimulation to try and bring about the reversal of brain death, whilst monitoring the patients using MRI scans.

Light, chemistry, and stem cells and DNA. If one didn’t know any better, one would swear one was looking at the broad chronological progression of Genesis 1.

But I digress.

The problem here is, one notices, the almost complete avoidance of the moral question. Let’s assume the technology works and that one can, literally, resurrect the dead scientifically. And let us assume the project reaches the stage of perfection envisioned by the Russian Cosmists, like Nikolai Fedorov. The cosmists, recall, want to extend the resurrection-by-science principle to the entire history of one’s ancestors. But should this occur, then what about resurrecting people like Stalin, Mao, or Hitler? The sad truth is, some people still “revere” those twisted and murderous people as heroes. The sad truth is, some people would attempt to do it, if given the means to do so.

But there’s an even bigger problem. The entire project is predicated on the materialist assumption that “brain function equals the person.” Regular readers here know that I have never subscribed to such a view, nor have I subscribed to the view, conversely, that there is no relationship between a person’s “personhood” and the functions of their soul, which would include, of course, the functions of their will, intellect, emotions, and brain. It is, I suspect, a very complex phenomenon not neatly divided into tidy Cartesian dualisms, with numerous feedback loops between the two. This said, however, the problem then arises that the brain is not the creator of individuality, but rather, its transducer (and, if I may employ a more ancient version of the term, its traducer). Thus, the possibility arises that one might “revive” a brain, and traduce or transduce a different individual than the one “recalled” as being present prior to brain death. Already some psychologists have written – and published – papers suggesting that certain mental disorders such as bipolarity and schizophrenia might not be disorders in any standard sense, but rather a phenomenon where an individual is inhabiting two very different and parallel universes at the same time. In this they draw upon the many worlds hypotheses of quantum mechanics.

In short, for my money, I have no doubt that ultimately, some sort of “scientific” resurrection technique might be possible. But I suspect it will be a Pandora’s box of spiritual phenomena which, once opened, will be difficult if not impossible to close again, and that before we open it, we should give lengthy, and due consideration to all the moral problems it will engender.

See you on the flip side…

Read More At: GizaDeathStar.com
________________________________________________


Global Warming A Myth Say 80 Graphs From 58 Peer-Reviewed Papers

Scientists Increasingly Discarding ‘Hockey Stick’ Temperature Graphs


“[W]hen it comes to disentangling natural variability from anthropogenically affected variability the vast majority of the instrumental record may be biased.”  — Büntgen et al., 2017

Source: Notrickszone.com
Kenneth Richard
May 29, 2017


Last year there were at least 60 peer-reviewed papers published in scientific journals demonstrating that Today’s Warming Isn’t Global, Unprecedented, Or Remarkable.

Just within the last 5 months, 58 more papers and 80 new graphs have been published that continue to undermine the popularized conception of a slowly cooling Earth temperature history followed by a dramatic hockey-stick-shaped uptick, or an especially unusual global-scale warming during modern times.

Yes, some regions of the Earth have been warming in recent decades or at some point in the last 100 years. Some regions have been cooling for decades at a time. And many regions have shown no significant net changes or trends in either direction relative to the last few hundred to thousands of years.

Succinctly, then, scientists publishing in peer-reviewed journals have increasingly affirmed that there is nothing historically unprecedented or remarkable about today’s climate when viewed in the context of long-term natural variability.


Büntgen et al., 2017

“Spanning the period 1186-2014 CE, the new reconstruction reveals overall warmer conditions around 1200 and 1400, and again after ~1850. … Little agreement is found with climate model simulations that consistently overestimate recent summer warming and underestimate pre-industrial temperature changes. … [W]hen it comes to disentangling natural variability from anthropogenically affected variability the vast majority of the instrumental record may be biased. …”




Abrantes et al., 2017

“The transition from warm to colder climatic conditions occurs around 1300 CE associated with the Wolf solar minimum. The coldest SSTs are detected between 1350 and 1850 CE, on Iberia during the well-known Little Ice Age (LIA) (Bradley and Jones, 1993), with the most intense cooling episodes related with other solar minima events, and major volcanic forcing and separated by intervals of relative warmth (e.g. (Crowley and Unterman, 2013; Solanki et al., 2004; Steinhilber et al., 2012; Turner et al., 2016; Usoskin et al., 2011). During the 20th century, the southern records show unusually large decadal scale SST oscillations in the context of the last 2 millennia, in particular after the mid 1970’s, within the Great Solar Maximum (1940 – 2000 (Usoskin et al., 2011)) and the “greater salinity anomaly” event in the northern Atlantic (Dickson et al., 1988), or yet the higher global temperatures of the last 1.4 ky detected by (Ahmed et al., 2013).”


Werner et al., 2017


Deng et al., 2017

“The results indicate that the climate of the Medieval Climate Anomaly (MCA, AD 900–1300) was similar to that of the Current Warm Period (CWP, AD 1850–present) … As for the Little Ice Age (LIA, AD 1550–1850), the results from this study, together with previous data from the Makassar Strait, indicate a cold and wet period compared with the CWP and the MCA in the western Pacific. The cold LIA period agrees with the timing of the Maunder sunspot minimum and is therefore associated with low solar activity.”


Chapanov et al., 2017

“A good agreement exists between the decadal cycles of LOD [length of day], MSL [mean sea level], climate and solar indices whose periods are between 12-13, 14-16, 16-18 and 28-33 years.”


Williams et al., 2017

“Reconstructed SSTs significantly warmed 1.1°C … from 1660s to 1800 (rate of change: 0.008°C/year), followed by a significant cooling of 0.8°C …  until 1840 (rate of change: 0.02°C/year), then a significant warming of 0.8°C from 1860 until the end of reconstruction in 2007 (rate of change: 0.005°C/year).” [The amplitude of sea surface temperature warming and cooling was higher and more rapid from the 1660s to 1800 than from 1860-2007.]
“In fact, the SST reconstruction significantly co-varied with a reconstruction of solar irradiance [Lean, 2000] on the 11-year periodicity only from ~1745 to 1825. In addition, the reconstructed SSTs were cool during the period of lower than usual solar irradiance called the Maunder minimum (1645–1715) but then warmed and cooled during the Dalton minimum (1795–1830), a second period of reduced solar irradiance. … The Dalton solar minimum and increased volcanic activity in the early 1800s could explain the decreasing SSTs from 1800 to 1850.”
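The bracketed comparison is just arithmetic: divide each temperature change by the length of its interval. A quick check (taking “the 1660s” as 1665; endpoints otherwise from the quote):

```python
# Back-of-envelope check of the quoted rates of change:
# temperature change (deg C) divided by interval length (years).
segments = {
    "1660s-1800 warming": (1.1, 1800 - 1665),  # -> ~0.008 deg C/yr
    "1800-1840 cooling":  (0.8, 1840 - 1800),  # -> 0.02 deg C/yr
    "1860-2007 warming":  (0.8, 2007 - 1860),  # -> ~0.005 deg C/yr
}
for label, (delta, years) in segments.items():
    print(f"{label}: {delta / years:.3f} deg C/yr")
```

The 1800-1840 cooling proceeds at roughly four times the rate of the 1860-2007 warming, which is the point of the bracketed note.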


Stenni et al., 2017

“A recent effort to characterize Antarctic and sub-Antarctic climate variability during the last 200 years also concluded that most of the trends observed since satellite climate monitoring began in 1979 CE cannot yet be distinguished from natural (unforced) climate variability (Jones et al., 2016), and are of the opposite sign [cooling, not warming] to those produced by most forced climate model simulations over the same post-1979 CE interval. … (1) Temperatures over the Antarctic continent show an overall cooling trend during the period from 0 to 1900CE, which appears strongest in West Antarctica, and (2) no continent-scale warming of Antarctic temperature is evident in the last century.”


Li et al., 2017


Demezhko et al., 2017

“GST [ground surface temperature] and SHF [surface heat flux] histories differ substantially in shape and chronology. Heat flux changes [lead] temperature changes by 500–1000 years.”


Luoto and Nevalainen, 2017


Li et al., 2017

“The main driving forces behind the Holocene climatic changes in the LYR [Lower Yangtze Region, East China] area are likely summer solar insolation associated with tropical or subtropical macro-scale climatic circulations such as the Intertropical Convergence Zone (ITCZ), Western Pacific Subtropical High (WPSH), and El Niño/Southern Oscillation (ENSO).”


Mayewski et al., 2017


Rydval et al., 2017

“[T]he recent summer-time warming in Scotland is likely not unique when compared to multi-decadal warm periods observed in the 1300s, 1500s, and 1730s.”



Reynolds et al., 2017


Rosenthal et al., 2017

“Here we review proxy records of intermediate water temperatures from sediment cores and corals in the equatorial Pacific and northeastern Atlantic Oceans, spanning 10,000 years beyond the instrumental record. These records suggest that intermediate waters [0-700 m] were 1.5-2°C warmer during the Holocene Thermal Maximum than in the last century. Intermediate water masses cooled by 0.9°C from the Medieval Climate Anomaly to the Little Ice Age. These changes are significantly larger than the temperature anomalies documented in the instrumental record. The implied large perturbations in OHC and Earth’s energy budget are at odds with very small radiative forcing anomalies throughout the Holocene and Common Era. … The records suggest that dynamic processes provide an efficient mechanism to amplify small changes in insolation [surface solar radiation] into relatively large changes in OHC.”
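To get a feel for the scale of the ocean heat content (OHC) perturbation implied by a 1.5°C shift in the 0-700 m layer, one can use the standard heat-content relation Q = ρ·c_p·V·ΔT. The sketch below uses round-figure constants as assumptions; it is an order-of-magnitude illustration, not the paper’s calculation.

```python
# Rough scale of the OHC change implied by warming the global 0-700 m
# layer by 1.5 C, via Q = rho * c_p * V * dT.
# All constants are round-figure assumptions for illustration only.
RHO = 1025.0          # seawater density, kg/m^3
CP = 3990.0           # seawater specific heat, J/(kg K)
OCEAN_AREA = 3.6e14   # global ocean area, m^2
DEPTH = 700.0         # layer thickness, m
DELTA_T = 1.5         # temperature change, K

volume = OCEAN_AREA * DEPTH
heat = RHO * CP * volume * DELTA_T
print(f"Implied heat-content change: {heat:.1e} J")  # ~1.5e24 J
```

That figure dwarfs instrumental-era OHC changes, which is exactly the tension with “very small radiative forcing anomalies” the authors describe.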


Li et al., 2017

“We suggest that solar activity may play a key role in driving the climatic fluctuations in NC [North China] during the last 22 centuries, with its quasi ∼100, 50, 23, or 22-year periodicity clearly identified in our climatic reconstructions. … It has been widely suggested from both climate modeling and observation data that solar activity plays a key role in driving late Holocene climatic fluctuations by triggering global temperature variability and atmospheric dynamical circulation …”


Goursaud et al., 2017


Guillet et al., 2017


Wilson et al., 2017


Tegzes et al., 2017

“Our sortable-silt time series show prominent multi-decadal to multi-centennial variability, but no clear long-term trend over the past 4200 years. … [O]ur findings indicate that variations in the strength of the main branch of the Atlantic Inflow may not necessarily translate into proportional changes in northward oceanic heat transport in the eastern Nordic Seas.”



Tejedor et al., 2017


Fernández-Fernández et al., 2017


Cai and Liu et al., 2017

“2003–2009 was the warmest period in the reconstruction. 1970–2000 was colder than the last stage of the Little Ice Age (LIA).”


Köse et al., 2017

“The reconstruction is punctuated by a temperature increase during the 20th century; yet extreme cold and warm events during the 19th century seem to eclipse conditions during the 20th century. We found significant correlations between our March–April spring temperature reconstruction and existing gridded spring temperature reconstructions for Europe over Turkey and southeastern Europe. … During the last 200 years, our reconstruction suggests that the coldest year was 1898 and the warmest year was 1873. The reconstructed extreme events also coincided with accounts from historical records. … Further, the warming trends seen in our record agree with data presented by Turkes and Sumer (2004), who attributed [20th-century warming] to increased urbanization in Turkey.”


Flannery et al., 2017

“The early part of the reconstruction (1733–1850) coincides with the end of the Little Ice Age, and exhibits 3 of the 4 coolest decadal excursions in the record. However, the mean SST estimate from that interval during the LIA is not significantly different from the late 20th Century SST mean. The most prominent cooling event in the 20th Century is a decade centered around 1965. This corresponds to a basin-wide cooling in the North Atlantic and cool phase of the AMO.”


Steiger et al., 2017

“Through several idealized and real proxy experiments we assess the spatial and temporal extent to which isotope records can reconstruct surface temperature, 500 hPa geopotential height, and precipitation. We find local reconstruction skill to be most robust across the reconstructions, particularly for temperature and geopotential height, as well as limited non-local skill in the tropics.  These results are in agreement with long-held views that isotopes in ice cores have clear value as local climate proxies, particularly for temperature and atmospheric circulation.”
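The “idealized … proxy experiments” mentioned here follow a standard pseudoproxy recipe: corrupt a known climate series with noise, reconstruct the climate from the corrupted version, and score skill against the original. A minimal single-site sketch (the noise level, calibration split, and synthetic series are illustrative assumptions, not Steiger et al.’s setup):

```python
# Minimal pseudoproxy experiment: score "local reconstruction skill" as
# the correlation between a reconstruction and the known truth.
import numpy as np

rng = np.random.default_rng(1)
n = 500
truth = 0.05 * rng.standard_normal(n).cumsum()   # synthetic "true" temperature
proxy = truth + 0.5 * rng.standard_normal(n)     # proxy = signal + noise

# Calibrate a linear model on the first half, reconstruct the second half.
slope, intercept = np.polyfit(proxy[:n // 2], truth[:n // 2], 1)
recon = slope * proxy[n // 2:] + intercept

skill = np.corrcoef(recon, truth[n // 2:])[0, 1]
print(f"Local reconstruction skill (correlation): {skill:.2f}")
```

Real experiments do this over entire fields, with climate-model output as the truth, but the skill metric works the same way.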




Chang et al., 2017

“The chironomid-based record from Heihai Lake shows a summer temperature fluctuation within 2.4°C in the last c. 5000 years from the south-east margin of the QTP [Qinghai–Tibetan Plateau]. … The summer temperature changes in this region respond primarily to the variation in the Asian Summer Monsoon. The variability of solar activity is likely an important driver of summer temperatures, either directly or by modifying the strength and intensity of the Indian Ocean Summer Monsoon. … We observed a relatively long-lasting summer cooling episode (c. 0.8°C lower than the 5000-year average) between c. 270 cal. BP and AD c. 1956. … The record shows cooling episodes occurred at c. 3100, 2600, 2100 and 1600 cal. BP.  This is likely related to the period defined as the Northern Hemisphere Little Ice Age (LIA; c. AD 1350–1850, equivalent to 600–100 cal. BP). These possibly relate to the 500-year quasi-periodic solar cycle. Cooling stages between c. 270 and 100 cal. BP were also recorded and these are possibly linked to the LIA suggesting a hemisphere-wide forcing mechanism for this event.”

 


Krossa et al., 2017


Albot, 2017

“Growing paleoclimatic evidence suggests that the climatic signals of Medieval Warm Period and the Little Ice Age events can be detected around the world (Mayewski et al., 2004; Bertler et al., 2011). … [T]he causes for these events are still debated between changes in solar output, increased volcanic activity, shifts in zonal wind distribution, and changes in the meridional overturning circulation (Crowley, 2000; Hunt, 2006).”


Zhang et al., 2017

“[S]ummer temperature variability at the QTP [Qinghai-Tibetan Plateau] responds rapidly to solar irradiance changes in the late Holocene”




Kotthoff et al., 2017


Li et al., 2017

“Overall, the strong linkage between solar variability and summer SSTs is not only of regional significance, but is also consistent over the entire North Atlantic region.”


Jones et al., 2017


Vachula et al., 2017


Fischel et al., 2017


Li et al., 2017


Anderson et al., 2017


Woodson et al., 2017

“The last ca. 1000 years recorded the warmest SST averaging 28.5°C. We record, for the first time in this region, a cool interval, ca. 1000 years in duration, centered on 5000 cal years BP concomitant with a wet period recorded in Borneo. The record also reflects a warm interval from ca. 1000 to 500 cal years BP that may represent the Medieval Climate Anomaly. Variations in the East Asian Monsoon (EAM) and solar activity are considered as potential drivers of SST trends. However, hydrology changes related to the El Nino-Southern Oscillation (ENSO) variability, shifts of the Western Pacific Warm Pool and migration of the Intertropical Convergence Zone are more likely to have impacted our SST temporal trend. … The SA [solar activity] trends (Steinhilber et al., 2012) are in general agreement with the regional cooling of SST (Linsley et al., 2010) and the SA [solar activity] oscillations are roughly coincident with the major excursions in our SST data.”


Koutsodendris et al., 2017

“Representing one of the strongest global climate instabilities during the Holocene, the Little Ice Age (LIA) is marked by a multicentennial-long cooling (14-19th centuries AD) that preceded the recent ‘global warming’ of the 20th century. The cooling has been predominantly attributed to reduced solar activity and was particularly pronounced during the 1645-1715 AD and 1790-1830 AD solar minima, which are known as Maunder and Dalton Minima, respectively.”


Browne et al., 2017


Perșoiu et al., 2017


Kawahata et al., 2017

“The SST [sea surface temperature] shows a broad maximum (~17.3 °C) in the mid-Holocene (5-7 cal kyr BP), which corresponds to the Jomon transgression. … The SST maximum continued for only a century and then the SST [sea surface temperatures] dropped by 3.5 °C [15.1 to 11.6 °C] within two centuries. Several peaks fluctuate by 2°C over a few centuries.”


Saini et al., 2017


Dechnik et al., 2017


Wu et al., 2017


Sun et al., 2017

“Our findings are generally consistent with other records from the ISM [Indian Summer Monsoon]  region, and suggest that the monsoon intensity is primarily controlled by solar irradiance on a centennial time scale. This external forcing may have been amplified by cooling events in the North Atlantic and by ENSO activity in the eastern tropical Pacific, which shifted the ITCZ further southwards.”


Wu et al., 2017

“The existence of depressed MAAT [mean annual temperatures] (1.3°C lower than the 3200-year average) between 1480 CE and 1860 CE (470–90 cal. yr BP) may reflect the manifestation of the ‘Little Ice Age’ (LIA) in southern Costa Rica. Evidence of low-latitude cooling and drought during the ‘LIA’ has been documented at several sites in the circum-Caribbean and from the tropical Andes, where ice cores suggest marked cooling between 1400 CE and 1900 CE.  Lake and marine records recovered from study sites in the southern hemisphere also indicate the occurrence of ‘LIA’ cooling. High atmospheric aerosol concentrations, resulting from several large volcanic eruptions and sea-ice/ocean feedbacks, have been implicated as the drivers responsible for the ‘LIA’.”


Park, 2017

“Late Holocene climate change in coastal East Asia was likely driven by ENSO variation. Our tree pollen index of warmness (TPIW) shows important late Holocene cold events associated with low sunspot periods such as Oort, Wolf, Spörer, and Maunder Minimum. Comparisons among standard Z-scores of filtered TPIW, ΔTSI, and other paleoclimate records from central and northeastern China, off the coast of northern Japan, southern Philippines, and Peru all demonstrate significant relationships [between solar activity and climate]. This suggests that solar activity drove Holocene variations in both East Asian Monsoon (EAM) and El Niño Southern Oscillation (ENSO).”


Markle et al., 2017


Dong et al., 2017


Nazarova et al., 2017

“The application of transfer functions resulted in reconstructed T July fluctuations of approximately 3 °C over the last 2800 years. Low temperatures (11.0-12.0 °C) were reconstructed for the periods between ca 1700 and 1500 cal yr BP (corresponding to the Kofun cold stage) and between ca 1200 and 150 cal yr BP (partly corresponding to the Little Ice Age [LIA]). Warm periods (modern T[emperatures] July or higher) were reconstructed for the periods between ca 2700 and 1800 cal yr BP, 1500 and 1300 cal yr BP and after 150 cal yr BP.”
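The “transfer functions” here are statistical mappings from fossil assemblages (chironomid head capsules, in this case) to temperature. The classic form is weighted averaging: a sample’s inferred temperature is the abundance-weighted mean of the temperature optima of the taxa present. A minimal sketch with hypothetical taxa and numbers (not Nazarova et al.’s calibration):

```python
# Minimal sketch of a weighted-averaging (WA) transfer function, the
# classic way fossil assemblages are converted to July temperature.
# Optima and abundances below are hypothetical illustration values.
import numpy as np

optima = np.array([10.0, 13.0, 16.0])      # taxon temperature optima, deg C
abundances = np.array([0.5, 0.3, 0.2])     # relative abundances in one sample

t_july = np.sum(abundances * optima) / np.sum(abundances)
print(f"Inferred July temperature: {t_july:.1f} deg C")  # 12.1 deg C
```

In practice the optima come from a modern calibration set of lakes with known July temperatures, and the reconstruction is run sample by sample down the sediment core, which is how a 2,800-year curve like the one quoted above is produced.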


Samartin et al., 2017


Thienemann et al., 2017

“[P]roxy-inferred annual MATs[annual mean air temperatures] show the lowest value at 11,510 yr BP (7.6°C). Subsequently, temperatures rise to 10.7°C at 9540 yr BP followed by an overall decline of about 2.5°C until present (8.3°C).”


Li et al., 2017

“Contrary to the often-documented warming trend over the past few centuries, but consistent with temperature record from the northern Tibetan Plateau, our data show a gradual decreasing trend of 0.3 °C in mean annual air temperature from 1750 to 1970 CE. This result suggests a gradual cooling trend in some high altitude regions over this interval, which could provide a new explanation for the observed decreasing Asian summer monsoon. In addition, our data indicate abruptly increased interannual- to decadal-scale temperature variations of 0.8–2.2 °C after 1970 CE, in terms of both magnitude and frequency, indicating that the climate system in high altitude regions would become more unstable under current global warming.”

Krawczyk et al., 2017



Pendea et al., 2017  (Russia)

“The Holocene Thermal Maximum (HTM) was a relatively warm period that is commonly associated with the orbitally forced Holocene maximum summer insolation (e.g., Berger, 1978; Bartlein et al., 2011). Its timing varies widely from region to region but is generally detected in paleorecords between 11 and 5 cal ka BP (e.g., Kaufman et al., 2004; Bartlein et al., 2011; Renssen et al., 2012). … In Kamchatka, the timing of the HTM varies. Dirksen et al. (2013) find warmer-than-present conditions between 9000 and 5000 cal yr BP in central Kamchatka and between 7000 and 5800 cal yr BP at coastal sites.”

Stivrins et al., 2017  (Latvia)

“Conclusion: Using a multi-proxy approach, we studied the dynamics of thermokarst characteristics in western Latvia, where thermokarst occurred exceptionally late at the Holocene Thermal Maximum. …  [A] thermokarst active phase … began 8500 cal. yr BP and lasted at least until 7400 cal. yr BP. Given that thermokarst arise when the mean summer air temperature gradually increased ca. 2°C beyond the modern day temperature, we can argue that before that point, the local geomorphological conditions at the study site must have been exceptional to secure ice-block from the surficial landscape transformation and environmental processes.”

Bañuls-Cardona et al., 2017  (Spain)

“During the Middle Holocene we detect important climatic events. From 7000 to 6800 [years before present] (MIR 23 and MIR22), we register climatic characteristics that could be related to the end of the African Humid Period, namely an increase in temperatures and a progressive reduction in arboreal cover as a result of a decrease in precipitation. The temperatures exceeded current levels by 1°C, especially in MIR23, where the most highly represented taxon is a thermo-Mediterranean species, M. (T.) duodecimcostatus.”

Reid, 2017 (Global)

“The small increase in global average temperature observed over the last 166 years is the random variation of a centrally biased random walk. It is a red noise fluctuation. It is not significant, it is not a trend and it is not likely to continue.”
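Reid’s claim, whatever its merits, rests on a well-known statistical point: persistent (“red”) noise can produce runs that look like trends. The sketch below simulates an AR(1) process, a common model of red noise with a central bias (mean reversion), with no deterministic trend at all, then fits a naive straight line to it. The coefficients are illustrative assumptions, not Reid’s model.

```python
# Red noise masquerading as a trend: an AR(1) process with no built-in
# trend, plus a naive least-squares line fitted to the output.
# phi and sigma are illustrative choices, not Reid's parameters.
import numpy as np

rng = np.random.default_rng(42)
n_years = 166               # matching the "last 166 years" in the quote
phi = 0.9                   # persistence; pulls values back toward zero
sigma = 0.1                 # innovation standard deviation, deg C

x = np.zeros(n_years)
for i in range(1, n_years):
    x[i] = phi * x[i - 1] + sigma * rng.standard_normal()

slope = np.polyfit(np.arange(n_years), x, 1)[0]
print(f"Apparent trend in pure red noise: {slope * 100:+.2f} deg C/century")
```

The sketch only illustrates the statistical point; it does not by itself show that the real temperature record is such a fluctuation.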

Åkesson et al., 2017 (Norway)

“Reconstructions for southern Norway based on pollen and chironomids suggest that summer temperatures were up to 2 °C higher than present in the period between 8000 and 4000 BP, when solar insolation was higher (Nesje and Dahl, 1991; Bjune et al., 2005; Velle et al., 2005a).”

– See more at: http://notrickszone.com/2017/05/29/80-graphs-from-58-new-2017-papers-invalidate-claims-of-unprecedented-global-scale-modern-warming/

Break Out The CO2 Bubbly; Al Gore Is Crying In His Beer


Source: NoMoreFakeNews.com | JonRappoport.wordpress.com
By: Jon Rappoport
June 2, 2017

“All right, contestants, listen carefully. Here’s the final question. The winner will be awarded three years living in a hut with no electricity or heat and he’ll dig for tubers and roots so he can eat—thus contributing to a decrease in global warming. All right, here is the question: Whose private jet spews more CO2? Al Gore’s or Leo DiCaprio’s?”

With Trump’s historic rejection of the Paris climate treaty, Al Gore is deep in a funk.

But don’t weep for Al. He can still amuse himself counting his money. Yes, Al’s done very well for himself hustling the “settled science” all these years, shilling for an energy-depleted Globalist utopia.

Al knows actual science the way a June bug knows how to pilot a spaceship.

Every movement needs such men.

Consider the facts laid out in an uncritical Washington Post story (October 10, 2012, “Al Gore has thrived as a green-tech investor”):

In 2001, Al was worth less than $2 million. By 2012, it was estimated he’d piled up a nice neat $100 million in his lock box.

How did he do it? Well, he invested in 14 green companies, which inhaled (via loans, grants, and tax relief) somewhere in the neighborhood of $2.5 billion from the federal government to go greener.

Therefore, Gore’s investments paid off, because the federal government was providing massive cash backup to those companies. It’s nice to have friends in high places.

For example, Gore’s investment firm at one point held 4.2 million shares of an outfit called Iberdrola Renovables, which was building 20 wind farms across the United States.

Iberdrola was blessed with $1.5 billion from the Federal government for the work which, by its own admission, saved its corporate financial bacon. Every little bit helps.

Then there was a company called Johnson Controls. It made batteries, including those for electric cars. Gore’s investment company, Generation Investment Management (GIM), doubled its holdings in Johnson Controls in 2008, when shares cost as little as $9 a share. Gore sold when shares cost $21 to $26, before the market for electric-car batteries fell on its head.

Johnson Controls had been bolstered by $299 million dropped at its doorstep by the administration of President Barack Obama.

On the side, Gore had been giving speeches on the end of life as we know it on Earth, for as much as $175,000 a pop. (Gore was constantly on the move from conference to conference, spewing jet fumes in his wake.) Those lecture fees can add up.

So Gore, as of 2012, had $100 million.

The man worked every angle to parlay fear of global-warming catastrophes into a humdinger of a personal fortune. And he didn’t achieve his new status in the free market. The federal government helped out with major, major bucks.

This wasn’t an entrepreneur relying exclusively on his own smarts and hard work. Far from it.

—How many scientists and other PhDs have been just saying no to the theory of manmade global warming?

2012: A letter to The Wall Street Journal signed by 16 scientists said no. Among the luminaries: William Happer, professor of physics at Princeton University; Richard Lindzen, professor of atmospheric sciences at Massachusetts Institute of Technology; William Kininmonth, former head of climate research at the Australian Bureau of Meteorology.

And then there was the Global Warming Petition Project, or the Oregon Petition, that said no. According to Petitionproject.org, the petition has the signatures of “31,487 American scientists,” of whom 9,029 stated they had Ph.D.s.

Global warming is one of the Rockefeller Globalists’ chief issues. Manipulating it entails convincing populations that a massive intervention is necessary to stave off the imminent collapse of life on Earth. Therefore, sovereign nations must be eradicated. Political power and decision-making must flow from above, from “those who are wiser.”

Globalists want all national governments on the planet to commit to lowering energy production by a significant and destructive percentage in the next 15 years—“to save us from a horrible fate.”

Their real agenda is clear: “The only solution to climate change is a global energy-management network. We (the Globalist leaders) are in the best position to manage such a system. We will allocate mandated energy-use levels throughout the world, region by region, nation by nation, and eventually, citizen by citizen.”

This is the long-term goal. This is the Globalists’ Holy Grail.

Slavery imposed through energy.

Al Gore has done admirable work for his bosses. And for himself. As a past politician with large name recognition, he’s promoted fake science, tried to scare the population of Earth, and financially leveraged himself to the hilt in the fear-crevice he helped create.

Ask not for whom the bells toll. They toll with delight. They’re attached to cash registers. And Al has stuck his hands in and removed the cash.

He might be crying in his beer today, after Trump rejected the Paris climate treaty, but Al’s also thinking about how he can play to the Left that’s so outraged at Trump’s decision. More speeches, more “inconvenient truth” films, maybe a summit with Leo DiCaprio and Obama.

Yes, there’s still money in those hills…quite possibly more money than ever.

Read More At: JonRappoport.wordpress.com
_______________________________________________________________

Jon Rappoport

The author of three explosive collections, THE MATRIX REVEALED, EXIT FROM THE MATRIX, and POWER OUTSIDE THE MATRIX, Jon was a candidate for a US Congressional seat in the 29th District of California. He maintains a consulting practice for private clients, the purpose of which is the expansion of personal creative power. Nominated for a Pulitzer Prize, he has worked as an investigative reporter for 30 years, writing articles on politics, medicine, and health for CBS Healthwatch, LA Weekly, Spin Magazine, Stern, and other newspapers and magazines in the US and Europe. Jon has delivered lectures and seminars on global politics, health, logic, and creative power to audiences around the world. You can sign up for his free NoMoreFakeNews emails here or his free OutsideTheRealityMachine emails here.