Quote; "Quantum
computing is computing
using quantum-mechanical
phenomena, such
as superposition
and entanglement.[1]
A quantum computer is a device that performs quantum
computing. They are different from binary
digital
electronic computers based on transistors.
Whereas common digital computing requires that the data be encoded
into binary digits (bits),
each of which is always in one of two definite states (0 or 1),
quantum computation uses quantum
bits, which can be in superpositions
of states.”..
..”Quantum algorithms are often probabilistic, in that they provide the correct solution only with a certain known probability.*[12] Note that the term non-deterministic computing must not be used in that case to mean probabilistic (computing), because the term non-deterministic has a different meaning in computer science.”..
"Quantum entanglement is a physical phenomenon which occurs when pairs or groups of particles are generated or interact in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole.
Measurements of physical properties such as position, momentum, spin, and polarization, performed on entangled particles are found to be correlated. For example, if a pair of particles is generated in such a way that their total spin is known to be zero, and one particle is found to have clockwise spin on a certain axis, the spin of the other particle, measured on the same axis, will be found to be counterclockwise, as to be expected due to their entanglement. However, this behavior gives rise to paradoxical effects: any measurement of a property of a particle can be seen as acting on that particle (e.g., by collapsing a number of superposed states) and will change the original quantum property by some unknown amount; and in the case of entangled particles, such a measurement will be on the entangled system as a whole. It thus appears that one particle of an entangled pair "knows" what measurement has been performed on the other, and with what outcome, even though there is no known means for such information to be communicated between the particles, which at the time of measurement may be separated by arbitrarily large distances.
Such phenomena were the subject of a 1935 paper by Albert Einstein, Boris Podolsky, and Nathan Rosen,[1] and several papers by Erwin Schrödinger shortly thereafter,[2][3] describing what came to be known as the EPR paradox. Einstein and others considered such behavior to be impossible, as it violated the local realist view of causality (Einstein referring to it as "spooky action at a distance")[4] and argued that the accepted formulation of quantum mechanics must therefore be incomplete. Later, however, the counterintuitive predictions of quantum mechanics were verified experimentally.[5]"
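As an aside of my own (not part of the quoted article), the anti-correlation described above can be sketched numerically. The following minimal Python snippet, assuming only numpy, prepares the two-qubit singlet state and samples joint measurement outcomes by the Born rule: the two qubits never agree, mirroring the clockwise/counterclockwise example, yet each qubit on its own looks like a fair coin toss.

```python
# Minimal sketch (mine, not from the quoted article): sampling joint outcomes
# from the two-qubit singlet state (|01> - |10>)/sqrt(2).
import numpy as np

rng = np.random.default_rng(0)

# State vector over the two-qubit basis |00>, |01>, |10>, |11>.
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

# Born rule: outcome probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(singlet) ** 2        # [0, 0.5, 0.5, 0]

outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)
# Every draw is "01" or "10": the two qubits never agree, yet each qubit viewed
# on its own is 0 or 1 with probability 1/2, so neither result is predictable.
```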
Go to: https://en.wikipedia.org/wiki/Quantum_computing
*Italics mine. One might argue that all quantum calculations are probabilistic, since one will also only probably make the required measurement; when dealing with the quantum, the observer cannot be “ruled out” of the equation.
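To make that "known probability" point concrete, here is a minimal, self-contained Python sketch of my own (not taken from the quoted sources): a single qubit prepared in an equal superposition by a Hadamard gate yields 0 or 1 on measurement, each with probability 1/2, so any answer read out of it is inherently statistical and must be confirmed by repetition.

```python
# Minimal illustration (mine, not from the quoted sources): a qubit in an equal
# superposition gives each measurement outcome with known probability 1/2.
import numpy as np

rng = np.random.default_rng(1)

ket0 = np.array([1, 0], dtype=complex)                        # definite state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

psi = H @ ket0                         # equal superposition of |0> and |1>
probs = np.abs(psi) ** 2               # Born rule: [0.5, 0.5]

shots = rng.choice([0, 1], size=1000, p=probs)
print(probs, shots.mean())             # the sample mean settles near 0.5
```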
Reading the "tea-leafs"?
Quote; ""Put simply, quantum computers use a unit of computing called a qubit. While regular semiconductors represent information as a series of 1s and 0s, qubits exhibit quantum properties and can compute as both a 1 and a 0 simultaneously. That means two qubits could represent the sequence 1-0, 1-1, 0-1, 0-0 at the same moment in time. This compute power increases exponentially with each qubit. A quantum computer with as few as 50 qubits could, in theory, pack more computing power than the most powerful supercomputers on earth today.
This comes at a timely juncture. Moore’s Law dictated that computing power per unit would double every 18 months while the price per computing unit would drop by half. While Moore’s Law has largely held true, the amount of money required to squeeze out these improvements is now significantly greater than it was in the past. In other words, semiconductor companies and researchers must spend more and more money in R&D to achieve each jump in speed. Quantum computing, on the other hand, is in rapid ascent.
One company, D-Wave Systems, is selling a quantum computer that it says has 2,000 qubits. However, D-Wave computers are controversial. While some researchers have found good uses for D-Wave machines, these quantum computers have not beaten classical computers and are only useful for certain classes of problems—optimization problems. Optimization problems involve finding the best possible solution from all feasible solutions. So, for example, complex simulation problems with multiple viable outcomes may not be as easily addressable with a D-Wave machine. The way D-Wave performs quantum computing, as well, is not considered to be the most promising for building a true supercomputer-killer.
Google, IBM, and a number of startups are working on quantum computers that promise to be more flexible and likely more powerful because they will work on a wider variety of problems. A few years ago, these flexible machines of two or four qubits were the norm. During the past year, company after company has announced more powerful quantum computers. In November 2017, IBM announced that it has built such a quantum machine that uses 50 qubits, breaking the critical barrier beyond which scientists believe quantum computers will shoot past traditional supercomputers.
The downside? The IBM machine can only maintain a quantum computing state for 90 microseconds at a time. This instability, in fact, is the general bane of quantum computing. The machines must be super-cooled to work, and a separate set of calculations must be run to correct for errors in calculations due to the general instability of these early systems. That said, scientists are making rapid improvements to the instability problem and hope to have a working quantum computer running at room temperature within five years." http://fortune.com/2018/01/17/what-is-quantum-computing/
for full article.
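A quick way to see the quoted claim that compute power "increases exponentially with each qubit" (my own sketch, independent of the Fortune article): the state of n qubits is described by 2^n complex amplitudes, so merely storing the state on a classical machine blows up very fast, and around 50 qubits the memory required is beyond any supercomputer.

```python
# Sketch (mine, not from the article): classical memory needed just to store an
# n-qubit state vector, at 16 bytes per complex128 amplitude.
for n in (2, 10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n:2d} qubits -> {amplitudes:>20,d} amplitudes (~{gib:,.0f} GiB)")
# 50 qubits already needs roughly 16 million GiB, which is one way to read the
# quoted claim that ~50 qubits is where classical simulation stops keeping up.
```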
Quote; "A crucial feature of Bitcoin is its security. Bitcoins have two important security features that prevent them from being stolen or copied. Both are based on cryptographic protocols that are hard to crack. In other words, they exploit mathematical functions, like factorization, that are easy in one direction but hard in the other—at least for an ordinary classical computer.
That raises an urgent question: how secure is Bitcoin to the kinds of quantum attack that will be possible in the next few years?
Today, we get an answer thanks to the work of Divesh Aggarwal at the National University of Singapore and a few pals. These guys have studied the threat to Bitcoin posed by quantum computers and say that the danger is real and imminent.
First some background. Bitcoin transactions are stored in a distributed ledger that collates all the deals carried out in a specific time period, usually about 10 minutes. This collection, called a block, also contains a cryptographic hash of the previous block, which contains a cryptographic hash of the one before that, and so on in a chain. Hence the term blockchain.
(A hash is a mathematical function that turns a set of data of any length into a set of specific length.)
The new block must also contain a number called a nonce that has a special property. When this nonce is hashed, or combined mathematically, with the content of the block, the result must be less than some specific target value.
Given the nonce and the block content, this is easy to show, which allows anybody to verify the block. But generating the nonce is time consuming, since the only way to do it is by brute force—to try numbers one after the other until a nonce is found.
This process of finding a nonce, called mining, is rewarded with Bitcoins. Mining is so computationally intensive that the task is usually divided among many computers that share the reward.
The block is then placed on the distributed ledger and, once validated, incorporated into the blockchain. The miners then start work on the next block.
Occasionally, two mining groups find different nonces and declare two different blocks. The Bitcoin protocol states that in this case, the block that has been worked on more will be incorporated into the chain and the other discarded.
This process has an Achilles’ heel. If a group of miners controls more than 50 percent of the computational power on the network, it can always mine blocks faster than whoever has the other 49 percent. In that case, it effectively controls the ledger.
If it is malicious, it can spend bitcoins twice, by deleting transactions so they are never incorporated into the blockchain. The other 49 percent of miners are none the wiser because they have no oversight of the mining process.
That creates an opportunity for a malicious owner of a quantum computer put to work as a Bitcoin miner. If this computational power breaks the 50 percent threshold, it can do what it likes." Go to: https://www.technologyreview.com/s/609408/quantum-computers-pose-imminent-threat-to-bitcoin-security/
for full article.
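To make the nonce-and-target mechanism described above concrete, here is a toy Python sketch of my own (illustrative only; it is not Bitcoin's actual scheme, which double-SHA-256 hashes an 80-byte block header against a vastly smaller target). It exploits the same one-way asymmetry the article mentions: finding the nonce takes brute force, while verifying it takes a single hash.

```python
# Toy proof-of-work sketch (illustrative only; not Bitcoin's real block format).
import hashlib

def hash_block(content: bytes, nonce: int) -> int:
    """Hash the block content combined with a nonce, read as a big integer."""
    digest = hashlib.sha256(content + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big")

def mine(content: bytes, target: int) -> int:
    """Brute force: try nonces one after another until the hash falls below target."""
    nonce = 0
    while hash_block(content, nonce) >= target:
        nonce += 1
    return nonce

content = b"hash-of-previous-block + this block's transactions"
target = 2 ** 240        # needs ~16 leading zero bits, so ~65,000 tries on average

nonce = mine(content, target)                              # slow: the "mining" step
print("nonce found:", nonce)
print("verifies:", hash_block(content, nonce) < target)    # fast: a single hash
```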
Quote; ""Put simply, quantum computers use a unit of computing called a qubit. While regular semiconductors represent information as a series of 1s and 0s, qubits exhibit quantum properties and can compute as both a 1 and a 0 simultaneously. That means two qubits could represent the sequence 1-0, 1-1, 0-1, 0-0 at the same moment in time. This compute power increases exponentially with each qubit. A quantum computer with as few as 50 qubits could, in theory, pack more computing power than the most powerful supercomputers on earth today.
This comes at a timely juncture. Moore’s Law dictated that computing power per unit would double every 18 months while the price per computing unit would drop by half. While Moore’s Law has largely held true, the amount of money required to squeeze out these improvements is now significantly greater than it was in the past. In other words, semiconductor companies and researchers must spend more and more money in R&D to achieve each jump in speed. Quantum computing, on the other hand, is in rapid ascent.
One company, D-Wave Systems, is selling a quantum computer that it says has 2,000 qubits. However, D-Wave computers are controversial. While some researchers have found good uses for D-Wave machines, these quantum computers have not beaten classical computers and are only useful for certain classes of problems—optimization problems. Optimization problems involve finding the best possible solution from all feasible solutions. So, for example, complex simulation problems with multiple viable outcomes may not be as easily addressable with a D-Wave machine. The way D-Wave performs quantum computing, as well, is not considered to be the most promising for building a true supercomputer-killer.
Google, IBM, and a number of startups are working on quantum computers that promise to be more flexible and likely more powerful because they will work on a wider variety of problems. A few years ago, these flexible machines of two or four qubits were the norm. During the past year, company after company has announced more powerful quantum computers. In November 2017, IBM announced that it has built such a quantum machine that uses 50 qubits, breaking the critical barrier beyond which scientists believe quantum computers will shoot past traditional supercomputers.
The downside? The IBM machine can only maintain a quantum computing state for 90 microseconds at a time. This instability, in fact, is the general bane of quantum computing. The machines must be super-cooled to work, and a separate set of calculations must be run to correct for errors in calculations due to the general instability of these early systems. That said, scientists are making rapid improvements to the instability problem and hope to have a working quantum computer running at room temperature within five years." http://fortune.com/2018/01/17/what-is-quantum-computing/
for full article.
Quote; "A crucial feature of Bitcoin is its security. Bitcoins have two important security features that prevent them from being stolen or copied. Both are based on cryptographic protocols that are hard to crack. In other words, they exploit mathematical functions, like factorization, that are easy in one direction but hard in the other—at least for an ordinary classical computer.
That raises an urgent question: how secure is Bitcoin to the kinds of quantum attack that will be possible in the next few years?
Today, we get an answer thanks to the work of Divesh Aggarwal at the National University of Singapore and a few pals. These guys have studied the threat to Bitcoin posed by quantum computers and say that the danger is real and imminent.
First some background. Bitcoin transactions are stored in a distributed ledger that collates all the deals carried out in a specific time period, usually about 10 minutes. This collection, called a block, also contains a cryptographic hash of the previous block, which contains a cryptographic hash of the one before that, and so on in a chain. Hence the term blockchain.
(A hash is a mathematical function that turns a set of data of any length into a set of specific length.)
The new block must also contain a number called a nonce that has a special property. When this nonce is hashed, or combined mathematically, with the content of the block, the result must be less than some specific target value.
Given the nonce and the block content, this is easy to show, which allows anybody to verify the block. But generating the nonce is time consuming, since the only way to do it is by brute force—to try numbers one after the other until a nonce is found.
This process of finding a nonce, called mining, is rewarded with Bitcoins. Mining is so computationally intensive that the task is usually divided among many computers that share the reward.
The block is then placed on the distributed ledger and, once validated, incorporated into the blockchain. The miners then start work on the next block.
Occasionally, two mining groups find different nonces and declare two different blocks. The Bitcoin protocol states that in this case, the block that has been worked on more will be incorporated into the chain and the other discarded.
This process has an Achilles’ heel. If a group of miners controls more than 50 percent of the computational power on the network, it can always mine blocks faster than whoever has the other 49 percent. In that case, it effectively controls the ledger.
If it is malicious, it can spend bitcoins twice, by deleting transactions so they are never incorporated into the blockchain. The other 49 percent of miners are none the wiser because they have no oversight of the mining process.
That creates an opportunity for a malicious owner of a quantum computer put to work as a Bitcoin miner. If this computational power breaks the 50 percent threshold, it can do what it likes." Go to: https://www.technologyreview.com/s/609408/quantum-computers-pose-imminent-threat-to-bitcoin-security/
for full article.
So are we misidentifying the data set?
Consider the following: if we do indeed (as promised by the advocates of quantum computing) attempt to apply quantum-computed algorithmic profiling to our biodiversity issues (for instance, to the apparently rapid extinction of many of the world's insects), are we not already suffering from something of a paucity within the data-set? After all, this is what "trawling the big data" is for: the greater the range, the more accurate (or "the more probable") the result. Pre-limiting this data-set is clearly insane, yet CERN deliberately misconstrues the influence of Chronos, claiming (mathematically) that a big-enough data-set can represent infinity, whereas any schoolboy knows that the size of any finite number is totally insignificant in relation to infinity. Worse still, they then attempt to apply the notion that it is not to Einstein's "basic" E=mc² in the hope that they can create more energy!
Attempting to put Descartes before the horse?
Quote; “The worldwide loss of insects is simply staggering with some reports of 75% up to 90%, happening much faster than the paleoclimate record rate of the past five major extinction events. It is possible that some insect species may already be close to total extinction!
It’s established that species evolve and then go extinct over thousands and millions of years as part of nature’s course, but the current rate of devastation is simply “off the charts, and downright scary.””..
..”Significantly, insects are the primary source for ecosystem creation and support. The world literally crumbles apart without mischievous burrowing, forming new soil, aerating soil, pollinating food crops, etc. Nutrition for humans happens because insects pollinate.
One of the world’s best and oldest entomological resources is Krefeld Entomological Society (est. 1905) tracking insect abundance at more than 100 nature reserves. They first noticed a significant drop off of insects in 2013 when the total mass of catch fell by 80%. Again, in 2014 the numbers were just as low. Subsequently, the society discovered huge declines in several observation sites throughout Western Europe.
For example, Krefeld data for hoverflies, a pollinator often mistaken for a bee, registered 17,291 hoverflies from 143 species trapped in a reserve in 1989. Whereas by 2014 at the same location, 2,737 individuals from 104 species, down 84%.1
Down Under in Australia anecdotal evidence similarly shows an unusual falloff of insect populations. For example, Jack Hasenpusch, an entomologist and owner of the Australian Insect Farm collects swarms of wild insects but now says: “I’ve been wondering for the last few years why some of the insects have been dropping off … This year has really taken the cake with the lack of insects, it’s left me dumbfounded, I can’t figure out what’s going on.”2”" Go to: https://dissidentvoice.org/2018/03/insect-decimation-upstages-global-warming/
for full article.
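As a quick arithmetic check of my own on the Krefeld hoverfly figures quoted above, the drop in individuals does come out at roughly the stated 84 per cent, while the drop in species is smaller:

```python
# Quick check (mine) of the quoted Krefeld hoverfly figures, 1989 vs 2014.
individuals_1989, individuals_2014 = 17_291, 2_737
species_1989, species_2014 = 143, 104

print(f"individuals down {1 - individuals_2014 / individuals_1989:.1%}")  # ~84.2%
print(f"species down     {1 - species_2014 / species_1989:.1%}")          # ~27.3%
```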
Biodiversity.
Quote; “"Last
week, the Intergovernmental Science-Policy Platform on Biodiversity
and Ecosystem Services (IPBES), an independent intergovernmental body
that monitors biodiversity, sent out a chilling warning — in
destroying the planet’s flora and fauna with such mindless haste,
we are undermining our own future well-being. Compiled by nearly 600
scientists over three years, IPBES’s reports underline how human
activity has driven animals and plants into decline in every region
of the world.
“Without biodiversity, there is no future for humanity,” says Prof David Macdonald, at Oxford University.
The term biodiversity was coined in 1985 — a contraction of “biological diversity” — but the huge global biodiversity losses now becoming apparent represent a crisis equalling — or quite possibly surpassing — climate change.
More formally, biodiversity comprises several levels, starting with genes, then individual species, then communities of creatures and finally entire ecosystems, such as forests or coral reefs, where life interplays with the physical environment. These myriad interactions have made Earth habitable for billions of years.
A more philosophical way of viewing biodiversity is this: it represents the knowledge learned by evolving species over millions of years about how to survive through the vastly varying environmental conditions Earth has experienced. Seen like that, experts warn, humanity is currently “burning the library of life”.”..
..”But the recent revelation that 75 per cent of flying insects were lost in the last 25 years in Germany — and likely elsewhere — indicates the massacre of biodiversity is not sparing creepy crawlies. And insects really matter, not just as pollinators but as predators of pests, decomposers of waste and, crucially, as the base of the many wild food chains that support ecosystems.
“If we lose the insects, then everything is going to collapse,” says Prof Dave Goulson of Sussex University, UK. “We are currently on course for ecological Armageddon.”"...
"What’s destroying biodiversity?
We are. The rise in human population, our food habits, the felling of forests, poaching and unsustainable hunting for food are some of the big causes. More than 300 mammal species, from chimpanzees to hippos to bats, are being eaten into extinction.
Pollution is a killer too, with orcas and dolphins being seriously harmed by long-lived industrial pollutants. Global trade contributes further harm: amphibians have suffered one of the greatest declines of all animals due to a fungal disease thought to be spread around the world by the pet trade.
The hardest hit of all habitats may be rivers and lakes, with freshwater animal populations in these collapsing by 81 per cent since 1970, following huge water extraction for farms and people, plus pollution and dams.
Could biodiversity loss be a greater threat than climate change?
Yes — nothing on Earth is experiencing more dramatic change at the hands of human activity. Changes to the climate are reversible, even if that takes centuries or millennia. But once species become extinct, particularly those unknown to science, there’s no going back.”
Locating the tipping point that moves biodiversity loss into ecological collapse is an urgent priority. However, some researchers say that the missing ingredient is political will.
A global treaty, the Convention on Biological Diversity (CBD), has set many targets. Some are likely to be reached, for example protecting 17 per cent of all land and 10 per cent of the oceans by 2020. Others, such as making all fishing sustainable by the same date, are not. The 196 nations that are members of the CBD next meet in Egypt in November.” Go to: http://gulfnews.com/news/americas/usa/why-does-biodiversity-matter-1.2197149
for full article.
*Italics mine (Nb. which are not the same as a "Bitcoin Mine"!).