Archive for February, 2020

Recovering A Strong American Conception Of Property Rights – The Federalist

Within our constitutional framework, property rights have been relegated to second-class citizenship.

Take the Supreme Court's double standard on the Fifth Amendment's prohibition against the government taking private property unless it's for public use. For alleged infringements of other guarantees in the Bill of Rights, the Court strictly scrutinizes government action. But with the Fifth Amendment's property protections, the Court allows legislatures to interpret their own constitutional boundaries. If only property rights are at stake, then the fox may guard the henhouse.

Or consider the Court's amorphous review for substantive due process, a values-based inquiry into the constitutional legitimacy of state and federal regulatory laws. On this score, the Court candidly concedes that property rights and contractual freedoms enjoy less protection than other, non-economic liberties.

In his new book Property and the Pursuit of Happiness: Locke, the Declaration of Independence, Madison, and the Challenge of the Administrative State, Edward Erler shows how constitutional property rights climbed through the looking glass and came out topsy-turvy. From America's founding era to the present day, property rights flipped from cachet to low-caste, and what's supposed to be up, well, is down.

Erler is a professor of political philosophy, so it's unsurprising that this book's foremost contribution is its discussion of the vital role property rights played in the Framers' constitutional vision. Tracing an arc of political thought from Aristotle through Locke on to the Declaration of Independence, Erler argues that the Founding Fathers put an inherently American gloss on pre-existing conceptions of property, one that merged natural rights and moral obligation into a synthesis they called "the pursuit of happiness."

For the Founders, the right to property was "the comprehensive right that included all other rights." In this spirit, the Supreme Court in 1795 averred that "the right of acquiring and possessing property, and having it protected, is one of the natural, inherent and unalienable rights of man."

Erler explains the decline of property rights from these sanctified heights. As the economy advanced and governments grew, vested property interests came increasingly into conflict with public policy, and it fell to the courts to demarcate the boundaries between public and private spheres.

For much of our nation's history, as courts wrestled with these controversies, they hewed to an understanding of property rights closer to the Framers' than to what we see today. The practical result was that property rights enjoyed considerable constitutional protection from overbearing government.

But the scales of justice shifted early in the twentieth century, when the Progressive forces of history swept first into legislatures and then into the courts. Progressives rejected the Founders' conception of property rights because it impeded the science of economic planning. As Progressive influence waxed, property rights waned.

Although Property and the Pursuit of Happiness overlaps in subject and tone with Richard Epstein's excellent 2008 book, Supreme Neglect: How to Revive Constitutional Protection for Private Property, the two books are complementary but not the same. Discussion of the Founding Fathers is largely absent from the latter (arguably the only flaw in Epstein's seminal work), and this topic is Erler's strongest contribution.

This is not to say that Property and the Pursuit of Happiness is flawless. In the introduction, Erler warns that he "test[s] the patience of the reader on some occasions," and he's not lying. The book is needlessly difficult. Relatedly, he peppers his prose with awkward sentence introductions (e.g., "In a statement that is not entirely hyperbolic . . ."). Further, the book's subtitle, which mentions "the Challenge of the Administrative State," engages in a bit of false advertising, as Erler gives the topic only a cursory examination.

Notwithstanding these drawbacks, Property and the Pursuit of Happiness is an important contribution to a growing body of scholarship pushing for a restoration of property rights to their original place among our individual freedoms, particularly with respect to the Fifth Amendment's Takings Clause.

The good news is that these ideas are taking root. To wit, the Trump administration is reshaping the federal judiciary with a generation of judges affected by Richard Epstein's work. On the other side of the bar, dogged public interest lawyers, most notably those at the Pacific Legal Foundation, have advanced property rights in courts across the country. After decades, all this effort is paying off.

Consider the blowback to the Supreme Court's infamous holding 15 years ago in Kelo v. City of New London, which allows government to condemn people's homes and give their land to a corporation in the name of economic development. As Ilya Somin explains in his book The Grasping Hand, many state courts reacted to Kelo by tightening restrictions on the use of eminent domain.

Last summer, the Court handed down a watershed decision in Knick v. Township of Scott, which basically puts property rights (and Fifth Amendment takings claims, specifically) on the same procedural footing as other guarantees enumerated in the Bill of Rights. The Court's newest members, Justices Neil Gorsuch and Brett Kavanaugh, joined Chief Justice John Roberts's Knick opinion. The holding is a bold step towards ending the inequality of our constitutional rights.

None of these welcome developments would have happened absent the toils of scholars and practitioners who laid the foundations for a resurgence of property rights. With Property and the Pursuit of Happiness, Erler adds a valuable voice to this worthy cause.

William Yeatman is a research fellow at the Cato Institute in Washington, D.C.


New Intel chip could accelerate the advent of quantum computing – RedShark News

The marathon to achieve the promise of quantum computers has edged a few steps forward as Intel unveils a new chip capable, it believes, of accelerating the process.

Called Horse Ridge, and named after one of the coldest places in Oregon, the system-on-chip can control a total of 128 qubits (quantum bits), more than double the number Intel heralded in its Tangle Lake test chip in early 2018.

While companies like IBM and Microsoft have been leapfrogging each other with systems capable of handling ever greater numbers of qubits, the breakthrough in this case is about efficiency: by allowing one chip to handle more control tasks, it should lead to more efficient quantum computers. It is therefore a step toward moving quantum computing out of the lab and into real commercial viability.

Applying quantum computing to practical problems hinges on the ability to scale, and control, thousands of qubits at the same time with high levels of fidelity. Intel suggests Horse Ridge greatly simplifies current complex electronics required to operate a quantum system.

To recap why this is important, let's take it as read that quantum computing has the potential to tackle problems conventional computers can't by leveraging a phenomenon of quantum physics: qubits can exist in multiple states simultaneously. As a result, they are able to conduct a large number of calculations at the same time.
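To make that concrete, here is a minimal Python sketch (an illustration only, not Intel's software) of a single qubit held in an equal superposition of 0 and 1:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes over the basis
# states |0> and |1>. Equal amplitudes put it "in both states at once."
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probabilities = np.abs(qubit) ** 2
print(probabilities)  # [0.5 0.5] -- a 50/50 chance of reading 0 or 1
```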

This can dramatically speed up complex problem-solving from years to a matter of minutes. But in order for these qubits to do their jobs, hundreds of connective wires have to be strung into and out of the cryogenic refrigerator where quantum computing occurs (at temperatures colder than deep space).

The extensive control cabling for each qubit drastically hinders the ability to control the hundreds or thousands of qubits that will be required to demonstrate quantum practicality in the lab, not to mention the millions of qubits that will be required for a commercially viable quantum solution in the real world.

Researchers outlined the capability of Horse Ridge in a paper presented at the 2020 International Solid-State Circuits Conference in San Francisco and co-written by collaborators at Dutch institute QuTech.

The integrated SoC design is described as being implemented using Intel's 22nm FFL (FinFET Low Power) CMOS technology and integrates four radio frequency channels into a single device. Each channel is able to control up to 32 qubits by leveraging frequency multiplexing, a technique that divides the total available bandwidth into a series of non-overlapping frequency bands, each of which is used to carry a separate signal.

With these four channels, Horse Ridge can potentially control up to 128 qubits with a single device, substantially reducing the number of cables and the rack instrumentation previously required.
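As a rough illustration of the arithmetic, a frequency-multiplexed controller gives each qubit its own non-overlapping sub-band within a channel. The sketch below uses hypothetical numbers; Intel's actual frequency plan is not described in this report:

```python
# Hypothetical numbers for illustration; not Intel's real frequency plan.
CHANNELS = 4
QUBITS_PER_CHANNEL = 32
BAND_START_GHZ = 13.0   # assumed base frequency
BAND_WIDTH_GHZ = 1.0    # assumed bandwidth per RF channel

def qubit_carrier_ghz(channel: int, slot: int) -> float:
    """Assign each qubit a non-overlapping sub-band within its channel."""
    sub_band = BAND_WIDTH_GHZ / QUBITS_PER_CHANNEL
    return BAND_START_GHZ + channel * BAND_WIDTH_GHZ + slot * sub_band

print(CHANNELS * QUBITS_PER_CHANNEL)   # 128 qubits from one device
print(qubit_carrier_ghz(2, 5))         # carrier for channel 2, qubit 5
```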

The paper goes on to argue that increases in qubit count trigger other issues that challenge the capacity and operation of the quantum system. One such potential impact is a decline in qubit fidelity and performance. In developing Horse Ridge, Intel optimised the multiplexing technology that enables the system to scale and reduce errors from crosstalk among qubits.

"While developing control systems isn't, evidently, as hype-worthy as the increase in qubit count has been, it is a necessity," says Jim Clarke, director of quantum hardware, Intel Labs. "Horse Ridge could take quantum practicality to the finish line much faster than is currently possible. By systematically working to scale to the thousands of qubits required for quantum practicality, we're continuing to make steady progress toward making commercially viable quantum computing a reality in our future."

Intels own research suggests it will most likely take at least thousands of qubits working reliably together before the first practical problems can be solved via quantum computing. Other estimates suggest it will require at least one million qubits.

Intel is exploring silicon spin qubits, which have the potential to operate at temperatures as high as 1 kelvin. This research paves the way for integrating silicon spin qubit devices and the cryogenic controls of Horse Ridge to create a solution that delivers the qubits and controls in one package.

Quantum computer applications are thought to include drug development (high on the world's list of priorities just now), logistics optimisation (that is, finding the most efficient way from any number of possible travel routes) and natural disaster prediction.


Particle accelerator technology could solve one of the most vexing problems in building quantum computers – Fermi National Accelerator Laboratory

Last year, researchers at Fermilab received over $3.5 million for projects that delve into the burgeoning field of quantum information science. Research funded by the grant runs the gamut, from building and modeling devices for possible use in the development of quantum computers to using ultracold atoms to look for dark matter.

For their quantum computer project, Fermilab particle physicist Adam Lyon and computer scientist Jim Kowalkowski are collaborating with researchers at Argonne National Laboratory, where they'll be running simulations on high-performance computers. Their work will help determine whether instruments called superconducting radio-frequency cavities, also used in particle accelerators, can solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits.

"Fermilab has pioneered making superconducting cavities that can accelerate particles to an extremely high degree in a short amount of space," said Lyon, one of the lead scientists on the project. "It turns out this is directly applicable to a qubit."

Researchers in the field have worked on developing successful quantum computing devices for the last several decades; so far, it's been difficult. This is primarily because quantum computers have to maintain very stable conditions to keep qubits in a quantum state called superposition.

Superconducting radio-frequency cavities, such as the one seen here, are used in particle accelerators. They can also solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits. Photo: Reidar Hahn, Fermilab

Superposition

Classical computers use a binary system of 0s and 1s called bits to store and analyze data. Eight bits combined make one byte of data, which can be strung together to encode even more information. (There are about 31.8 million bytes in the average three-minute digital song.) In contrast, quantum computers aren't constrained by a strict binary system. Rather, they operate on a system of qubits, each of which can take on a continuous range of states during computation. Just as an electron orbiting an atomic nucleus doesn't have a discrete location but rather occupies all positions in its orbit at once in an electron cloud, a qubit can be maintained in a superposition of both 0 and 1.

Since there are two possible states for any given qubit, a pair doubles the amount of information that can be manipulated: 2² = 4. Use four qubits, and that amount of information grows to 2⁴ = 16. With this exponential increase, it would take only 300 entangled qubits to encode more information than there is matter in the universe.
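A quick sanity check of that exponential growth, as a short Python sketch:

```python
# An n-qubit register spans 2**n basis states at once.
for n in (1, 2, 4, 10):
    print(n, 2 ** n)          # 2, 4, 16, 1024

# 300 qubits index more states than the rough number of atoms
# in the observable universe (~10**80).
print(2 ** 300 > 10 ** 80)    # True
```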

Qubits can be in a superposition of 0 and 1, while classical bits can be only one or the other. Image: Jerald Pinson

Parallel positions

Qubits don't represent data in the same way as bits. Because qubits in superposition are both 0 and 1 at the same time, they can similarly represent all possible answers to a given problem simultaneously. This is called quantum parallelism, and it's one of the properties that makes quantum computers so much faster than classical systems.

The difference between classical computers and their quantum counterparts could be compared to a situation in which there is a book with some pages randomly printed in blue ink instead of black. The two computers are given the task of determining how many pages were printed in each color.

"A classical computer would go through every page," Lyon said. Each page would be marked, one at a time, as either being printed in black or in blue. A quantum computer, instead of going through the pages sequentially, would go through them all at once.

Once the computation was complete, a classical computer would give you a definite, discrete answer. If the book had three pages printed in blue, that's the answer you'd get.

"But a quantum computer is inherently probabilistic," Kowalkowski said.

This means the data you get back isn't definite. In a book with 100 pages, the data from a quantum computer wouldn't be just "three." It could also give you, for example, a 1 percent chance of having three blue pages or a 1 percent chance of 50 blue pages.

An obvious problem arises when trying to interpret this data. A quantum computer can perform incredibly fast calculations using parallel qubits, but it spits out only probabilities, which, of course, isn't very helpful. Unless, that is, the right answer could somehow be given a higher probability.
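A toy Python sketch of the blue-pages example (the probabilities here are invented for illustration) shows what "output as a distribution" means in practice:

```python
import random

# Invented probabilities for the 100-page book example: the machine's
# raw output is a distribution over answers, not a single number.
answer_distribution = {3: 0.97, 50: 0.01, 7: 0.01, 12: 0.01}

# One "run" of the computer is a draw from that distribution.
answers = list(answer_distribution)
weights = list(answer_distribution.values())
print(random.choices(answers, weights=weights)[0])  # usually 3
```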

Interference

Consider two water waves that approach each other. As they meet, they may constructively interfere, producing one wave with a higher crest. Or they may destructively interfere, canceling each other so that there's no longer any wave to speak of. Qubit states can also act as waves, exhibiting the same patterns of interference, a property researchers can exploit to identify the most likely answer to the problem they're given.

"If you can set up interference between the right answers and the wrong answers, you can increase the likelihood that the right answers pop up more than the wrong answers," Lyon said. "You're trying to find a quantum way to make the correct answers constructively interfere and the wrong answers destructively interfere."

When a calculation is run on a quantum computer, the same calculation is run multiple times, and the qubits are allowed to interfere with one another. The result is a distribution curve in which the correct answer is the most frequent response.
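A minimal sketch of that idea: amplitudes, unlike ordinary probabilities, can be negative (or complex), so summing the amplitudes of two paths to the same answer can either reinforce or cancel. This is an illustration under simplified assumptions, not any lab's actual algorithm:

```python
import numpy as np

# Two computational paths arriving at the same answer contribute
# amplitudes that add before probabilities are taken.
in_phase = np.array([0.5 + 0j, 0.5 + 0j])       # right answer: constructive
out_of_phase = np.array([0.5 + 0j, -0.5 + 0j])  # wrong answer: destructive

print(abs(in_phase.sum()) ** 2)      # 1.0 -- probability boosted
print(abs(out_of_phase.sum()) ** 2)  # 0.0 -- probability cancelled
```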

When waves meet, they may constructively interfere, producing one wave with a higher crest. Image: Jerald Pinson

Waves may also destructively interfere, canceling each other so that there's no longer any wave to speak of. Image: Jerald Pinson

Listening for signals above the noise

In the last five years, researchers at universities, government facilities and large companies have made encouraging advancements toward the development of a useful quantum computer. Last year, Google announced that it had performed calculations on its quantum processor, called Sycamore, in a fraction of the time it would have taken the world's largest supercomputer to complete the same task.

Yet the quantum devices that we have today are still prototypes, akin to the first large vacuum tube computers of the 1940s.

"The machines we have now don't scale up much at all," Lyon said.

There are still a few hurdles researchers have to overcome before quantum computers become viable and competitive. One of the largest is finding a way to keep delicate qubit states isolated long enough for them to perform calculations.

If a stray photon (a particle of light) from outside the system were to interact with a qubit, its wave would interfere with the qubit's superposition, essentially turning the calculations into a jumbled mess, a process called decoherence. While the refrigerators do a moderately good job at keeping unwanted interactions to a minimum, they can do so only for a fraction of a second.

"Quantum systems like to be isolated," Lyon said, "and there's just no easy way to do that."

When a quantum computer is operating, it needs to be placed in a large refrigerator, like the one pictured here, to cool the device to less than a degree above absolute zero. This is done to keep energy from the surrounding environment from entering the machine. Photo: Reidar Hahn, Fermilab

Which is where Lyon and Kowalkowski's simulation work comes in. If the qubits can't be kept cold enough to maintain an entangled superposition of states, perhaps the devices themselves can be constructed in a way that makes them less susceptible to noise.

It turns out that superconducting cavities made of niobium, normally used to propel particle beams in accelerators, could be the solution. These cavities need to be constructed very precisely and operate at very low temperatures to efficiently propagate the radio waves that accelerate particle beams. Researchers theorize that by placing quantum processors in these cavities, the qubits will be able to interact undisturbed for seconds rather than the current record of milliseconds, giving them enough time to perform complex calculations.

Qubits come in several different varieties. They can be created by trapping ions within a magnetic field or by using nitrogen atoms surrounded by the carbon lattice formed naturally in crystals. The research at Fermilab and Argonne will be focused on qubits made from photons.

Lyon and his team have taken on the job of simulating how well radio-frequency cavities are expected to perform. By carrying out their simulations on high-performance computers, known as HPCs, at Argonne National Laboratory, they can predict how long photon qubits can interact in this ultralow-noise environment and account for any unexpected interactions.

Researchers around the world have used open-source software for desktop computers to simulate different applications of quantum mechanics, providing developers with blueprints for how to incorporate the results into technology. The scope of these programs, however, is limited by the amount of memory available on personal computers. In order to simulate the exponential scaling of multiple qubits, researchers have to use HPCs.
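To see why memory is the bottleneck, consider a back-of-the-envelope sketch: a full n-qubit state vector holds 2^n complex amplitudes, each taking 16 bytes at double precision. The figures below follow from that arithmetic alone:

```python
# Memory for a full n-qubit state vector: 2**n complex amplitudes
# at 16 bytes each (double-precision complex).
def state_vector_gib(n_qubits: int) -> float:
    return (2 ** n_qubits) * 16 / 2 ** 30

for n in (20, 30, 40):
    print(n, state_vector_gib(n), "GiB")   # 0.015625, 16.0, 16384.0
```

At 30 qubits a simulation already needs 16 GiB, past most desktops; every added qubit doubles the requirement, which is what pushes this work onto HPCs.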

"Going from one desktop to an HPC, you might be 10,000 times faster," said Matthew Otten, a fellow at Argonne National Laboratory and collaborator on the project.

Once the team has completed their simulations, the results will be used by Fermilab researchers to help improve and test the cavities for acting as computational devices.

"If we set up a simulation framework, we can ask very targeted questions on the best way to store quantum information and the best way to manipulate it," said Eric Holland, the deputy head of quantum technology at Fermilab. "We can use that to guide what we develop for quantum technologies."

This work is supported by the Department of Energy Office of Science.


Top 10 breakthrough technologies of 2020 – TechRepublic

Between tiny AI and unhackable internet, this decade's tech trends will revolutionize the business world.

MIT Technology Review unveiled its top 10 breakthrough technology predictions on Wednesday. The trends, which include hype-inducing tech like quantum computing and unhackable internet, are expected to become realities in the next decade, changing the enterprise and the world.


While many of the trends have a more scientific background, most can also apply to business, said David Rotman, editor at MIT Technology Review.

"Even though some of these sound science-y or research-y, all really do have important implications and business impacts. [For example], unhackable internet," Rotman said. "It's early, but we can all see why that would be a big deal.

"Digital money will change how we do commerce; satellite mega constellations will potentially change how we do communications and the price of communications," Rotman added.The methodology behind determining the breakthrough technologies focused on what writers, editors, and journalists have been reporting on in the past year. All of the technologies are still being developed and improved in labs, Rotman said.

The MIT Technology Review outlined the following 10 most exciting technologies being created and deployed in the next 10 years.

One of the most exciting technologies of the bunch, according to Rotman, quantum supremacy indicates that quantum computers are not only becoming a reality, but that their functionality is becoming even more advanced.

Murmurs of quantum computer development have floated around the enterprise. The technology is able to solve massive computational problems faster than any supercomputer.

While this form of computing hasn't been widely used yet, it will not only be usable by 2030, but possibly reach quantum supremacy, MIT found.

"Quantum supremacy is the point where a quantum computer can do something that a classical conventional computer cannot do or take hundreds of years for a classical computer to do," Rotman said.

The technology is now getting to the point where people can test it in their businesses and try different applications, and it will become more popular in the coming years, Rotman said.

Quantum computers are especially useful for massive scheduling or logistical problems, which can be particularly useful in large corporations with many moving parts, he added.

"Satellites have become so small and relatively cheap that people are sending up whole clusters of these satellites," Rotman said. "It's going to have an enormous impact on communication and all the things that we rely on satellites for."

Together, these satellites could cover the entire globe with high-speed internet. Satellite mega-constellations are currently being tested by companies including SpaceX, OneWeb, Amazon, and Telesat, according to the report.

Another interesting, and surprising, technology in the study concerned tiny AI. The surprising nature of this comes with how quickly AI is growing, Rotman said.

Starting now, AI will become even more functional, running independently on phones and wearables. This ability would free devices from needing the cloud to use AI-driven features, Rotman said.

"It's not just a first step, but it would be an important step in speeding up the search for new drugs," Rotman said.

Scientists have used AI to find drug-like compounds with specific desirable characteristics. In the next three to five years, new drugs might be commercialized for far less than the $2.5 billion it currently takes to bring a new drug to market, the report found.

Researchers are now able to detect climate change's role in extreme weather conditions. With this discovery, scientists can help people better prepare for severe weather, according to the report.

In less than five years, researchers will find drugs that treat ailments based on the body's natural aging process, the report found. Potentially, diseases including cancer, heart disease and dementia could be treated by slowing age.

Within five years, the internet could be unhackable, the report found.

Researchers are using quantum encryption to try and make an unhackable internet, which is particularly important as data privacy concerns heighten, Rotman said.

Digital money, also known as cryptocurrency, will become more widely used in 2020. However, the rise of this money will also have major impacts on financial privacy, as the need for an intermediary becomes less necessary, according to the report.

Occupying three trends on the list, medicine is proving to be a potentially huge area for innovation. Currently, doctors and researchers are designing novel drugs to treat unique genetic mutations. These specialized drugs could cure some ailments that were previously incurable, the report found.

Differential privacy is a technique currently being used by the US government in collecting data for the 2020 census. The US Census Bureau has struggled to keep the data it collects private; differential privacy helps anonymize that data, typically by adding carefully calibrated statistical noise to published results, an approach other countries may also adopt, according to the report.
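For a flavor of how this works, here is a minimal Python sketch of the textbook Laplace mechanism for a count query. It is a generic construction, not the Census Bureau's actual implementation, and the epsilon value is an assumption:

```python
import random

def dp_count(true_count: int, epsilon: float = 0.5) -> float:
    """Release a count with Laplace noise of scale 1/epsilon.

    Adding or removing one person changes a count by at most 1, so
    this gives epsilon-differential privacy for the query.
    """
    # The difference of two exponential samples is a Laplace sample.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

print(dp_count(1234))  # e.g. 1236.8 -- near the truth, but deniable
```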

For more, check out "Forget quantum supremacy: This quantum-computing milestone could be just as important" on ZDNet.


Quantum Computing Will Have a Huge Impact on Banking, says Deltec Bank Bahamas – MENAFN.COM


When you hear "quantum computing," what do you think of? Mathematical equations swirling around you? Einstein standing at a blackboard? A computer running extensive code? Quantum computing may initially sound confusing, but it is currently a big factor in where banks are moving. According to IBM, quantum computers provide the potential for quite a few developments in the fields of science and finance. From medications to machine learning diagnosis to financial strategies for retirement, these are just some of the ways quantum computing has real-life impacts. It can also drastically impact banking as we know it. Here is what you should know.

Quantum computing 101

Quantum computing, as its name would suggest, is computing based on the principles of quantum theory. A classic computer encodes information in the binary value of 1 or 0, which ultimately restricts its ability. Conversely, quantum computing manipulates information using quantum mechanical phenomena, via units known as "qubits." The difference is that these subatomic particles can exist in more than one state simultaneously, which means a qubit can be both a 1 and a 0 at once. The topic largely relies on the ideas of superposition and entanglement, which are not used in typical computing. By bringing quantum physics into computing, you create new avenues and developments.
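For the curious, here is a minimal Python sketch of entanglement, the simplest two-qubit "Bell state," in which the measured values of the two qubits are perfectly correlated (an illustration only, not banking software):

```python
import numpy as np

# The Bell state: amplitudes over the two-qubit basis |00>, |01>, |10>, |11>.
# Only 00 and 11 carry weight, so measuring one qubit fixes the other.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

for label, p in zip(["00", "01", "10", "11"], np.abs(bell) ** 2):
    print(label, p)   # 00 and 11 each 0.5; 01 and 10 never occur
```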

How does quantum computing serve banks?

Security is one of the most significant problems that banks are faced with. As such, they are constantly reviewing their current systems and seeking new technology that could add to their defenses. Quantum computing is one of those technologies that could change the ways that banks protect themselves.

According to Deltec Bank, Bahamas, "Quantum computing could help build systems that protect vital customer information and transaction details and safeguard against market vulnerability and financial crashes." That said, the technology is not being used to its fullest potential yet. It might take some time before quantum computers have the ability to overtake traditional computers, but when they do, it will be a swift switch because it is a better option overall.

Using quantum computing rather than a traditional computer to run everyday banking tasks is becoming known as the "quantum advantage." It is more efficient and more secure, which has both customers and banks on board.

Are there any other impacts?

The impact of quantum computing on banking is enormous. Big names in banking like JP Morgan and Barclays are preparing to make the switch. IBM has released a full report detailing the potential uses and applications of quantum computing in the financial sector. Yet, even beyond that, there is a prediction that quantum computing may be competition for another well-known method of data protection that is on the rise. Some believe that quantum encryption could actually eclipse blockchain, which is key to the use of cryptocurrencies as it stores information about monetary transactions. Quantum encryption enables banks to send highly secure data over its quantum network.

Final thoughts

"The greatest benefit of quantum computing is that it provides banks a highly secure way to solve problems that were at one point very resource-intensive or entirely impossible to complete," says Deltec Bank, Bahamas. That said, will quantum computing change the face of banking as we know it tomorrow? Probably not.

The technology exists and is being tested to see how it can be practically implemented. Banks must rework their financial models around complex hardware requirements, and that takes time. The most important takeaway is that the technology exists and that banks are both aware of it and working towards it. When a new system is capable of running the same calculations in a matter of seconds and provides the high level of security necessary for financial transactions, it is only a matter of time before it sees massive implementation.

Disclaimer: The author of this text, Robin Trehan, has an undergraduate degree in economics, a master's in international business and finance, and an MBA in electronic business. Trehan is Senior VP at Deltec International (http://www.deltecbank.com). The views, thoughts, and opinions expressed in this text are solely the views of the author, and do not necessarily reflect the views of Deltec International Group, its subsidiaries and/or employees.

About Deltec Bank

Headquartered in The Bahamas, Deltec is an independent financial services group that delivers bespoke solutions to meet clients' unique needs. The Deltec group of companies includes Deltec Bank & Trust Limited, Deltec Fund Services Limited, Deltec Investment Advisers Limited, Deltec Securities Ltd. and Long Cay Captive Management.
