Media Search:



The Successes and Limitations of the First Congressional Report on Jan. 6 – Lawfare

During the days following the attempted insurrection on Jan. 6, with both Democrats and Republicans condemning the riot, it seemed possible, even likely, that Congress might authorize a broad bipartisan investigation of what happened to foster the violence that day. Five months later, though, that hope feels distant. On May 28, the Senate failed to break a filibuster to create an independent commission on the causes of the riot, and overall, the outlook for a robust, definitive investigation seems grim.

In the absence of an outside inquiry, Congress has pursued a variety of investigative approaches. Individual House committees have begun investigations, and Speaker of the House Nancy Pelosi may decide that the best way forward is simply to have those panels continue that work. She has also floated forming a select committee to investigate the attack or designating one specific panel to take the lead in the inquiry. But whatever approach Pelosi supports will likely face opposition from House Republicans.

For the Senate's part, the strategy has been clearer from the start: the Senate Rules and Administration Committee and the Senate Homeland Security and Governmental Affairs Committee conducted a joint probe, and this week, they released their own joint report on the events of Jan. 6. The report is the first public document produced by committees in either chamber of Congress investigating the riot, and it may yet be the only one. Elsewhere on Lawfare, Billy Ford has provided an in-depth summary of the document. Here, we take a look at what the report does and doesn't cover, and what those gaps say about our understanding of what happened on Jan. 6.

The document goes deep on what went wrong on Jan. 6, but it's less deep on the question of why things went wrong. It's focused on a relatively narrow timeframe, digging into how various agencies and the congressional bureaucracy fumbled the ball in the weeks before Jan. 6 and on the day itself. But it doesn't broaden its scope to examine the structural factors that might have led those organizations to fumble the ball, or examine the role of President Trump in whipping up rioters through his lies about a stolen election. Some of these limitations likely stem from the bipartisan nature of the report: Republicans, reporting from the New York Times and Washington Post suggests, were none too eager to delve into Trump's responsibility for the violence. And other limitations trace back to the fact that this report is the product of an investigation by only one chamber of Congress, with limited cooperation from key actors in the House of Representatives.

The document, in other words, is both a useful record and profoundly incomplete.

The committees sketch out a grim picture of the cascading institutional failures both within and beyond Congress. The failures within the congressional bureaucracy laid out by the report are several, and began even before Jan. 6. The U.S. Capitol Police (USCP) did not, the report makes clear, effectively use the intelligence gathered by its three intelligence-related components to track the threat to the Capitol Complex. The report identifies sharing intelligence information as a particular weak spot, both within and beyond the USCP. The entity with primary responsibility for distributing intelligence reports, the Intelligence and Interagency Coordination Division (IICD), produced conflicting products prior to Jan. 6, and some key information, including a now-famous bulletin sent out from the FBI's Norfolk field office on Jan. 5 noting internet posts describing potential violence the next day, did not make its way to all relevant parties within the USCP.

Intelligence agencies other than the Capitol Police also failed to communicate the seriousness of a potential attack, even as the planning for that attack was happening, in part, in plain sight on social media. The bulletin from the FBI's Norfolk field office appears to be the only intelligence document produced by the bureau warning of the danger. The Department of Homeland Security, which has its own intelligence analysis arm, meanwhile, never produced any document flagging the potential violence.

The committees seem to be keenly aware of just how absurd it is for executive agencies to claim ignorance of threats posted prominently online. The report quotes one official at the Department of Homeland Security's Intelligence & Analysis unit saying that he was "not aware of any known direct threat to the Capitol" before January 6, before dryly noting that this was "despite many online posts mentioning violence."

The report also details failures by the USCP to develop sufficient operational and staffing plans for Jan. 6, as well as inadequate training and equipment for officers. On Jan. 6 itself, the report details, there were significant communication failures within the USCP, with rank-and-file officers "receiv[ing] little-to-no communication from senior officers" during the attack, and "at no point did USCP leadership take over the radios to communicate with front-line officers."

But the failures outlined in the report are not limited to the USCP. Among the most troubling sections of the report is the discussion of why it took as long as it did for National Guard troops to arrive at the Capitol after USCP requested support. The members of the Capitol Police Board, the report states bluntly, "did not understand the statutory and regulatory authorities of the Capitol Police Board."

Michael Stenger, the former Senate Sergeant-at-Arms, described the board as a "clearinghouse of information" rather than as an operational body, despite the fact that the board has responsibility for important operational decisions. The board may request support from executive departments and agencies, including the National Guard, but none of the Capitol Police Board members on Jan. 6 "could fully explain in detail the statutory requirements for requesting National Guard assistance," and there was no formal process for such requests. Board members' confusion about the process extended to uncertainty about how many of their votes were required to approve such a request. Stenger asserted that unanimity was needed, while Architect of the Capitol J. Brett Blanton (with whom the possibility of requesting Guard support prior to Jan. 6 was not discussed) posited that only a majority vote was necessary. (Notably, the report itself does not clarify the answer to this question, but among its recommendations is to empower the USCP chief to make independent requests for Guard assistance in emergencies.)

The report outlines how lack of clarity between the Defense Department and the Capitol Police over the procedures for requesting deployment of the Guard contributed to the crucial delays in the Guard's arrival on the scene, and how confusion and delays at the Pentagon resulted in a three-hour gap between when Capitol Police first requested the deployment of the Guard and when the Guard actually showed up at the Capitol. And excerpts from committee interviews with Christopher Miller, the acting secretary of defense on Jan. 6, and Ryan McCarthy, secretary of the Army on that date, suggest that the Pentagon was skittish about deploying military forces to the Capitol after the debacle of the National Guard deployment to Washington, D.C., in summer 2020 to respond to the protests over George Floyd's death.

Meanwhile, the Justice Department, despite having been designated by the White House as the lead agency in charge of coordinating operations to secure Congress that day, appears to have been almost entirely absent from security planning or response. According to one Pentagon official interviewed by the committees, the department failed to conduct any interagency rehearsals or have an integrated security plan, as DOJ did during the summer 2020 protests when it had also been designated as the lead federal agency. Former Acting Defense Secretary Miller told the committees that he convened calls between agencies in the midst of the chaos because the Justice Department was nowhere to be found: "[S]omebody needed to do it." This failure is all the more notable because the Justice Department itself denied to the committees that it was ever placed in charge, and, according to the report, has yet to fully comply with the committees' requests for information.

So the report provides a damning account of security and intelligence failures across the board. But there's also a lot that the document does not do. In emphasizing the immediate period leading up to Jan. 6, it does not discuss the longer history that created the conditions for the operational failures. The report does quote one USCP officer as observing that "1/6 was not only a result of a few months of intelligence not being analyzed and acted upon, but more so decades of failing to take infrastructure, force protection, emergency planning, and training seriously." But the report does not address how the USCP was allowed to fall short for those decades. Is a lack of congressional oversight to blame and, if so, what changes to Congress's own approach to holding its security bureaucracy accountable are needed? The report offers no answers to those important questions.

It is also telling that the report stops short of recommending a full restructuring of the Capitol Police Board, despite previous efforts and recent bipartisan interest in doing so. It is widely believed that congressional leaders, who nominate two of the members of the board, the House and Senate Sergeants-at-Arms, to their positions, are reluctant to change the force's governance structure. But as Congress moves forward, it must consider whether the current bureaucratic arrangements are the most effective ones for keeping the Capitol Hill community safe for the thousands of members and staff who report to work there each day.

And the report demonstrates the inherent shortcomings of an investigation done by, and recommendations for reform made by, committees in a single chamber of Congress. Take, for example, the relatively brief discussion of shortcomings in the security notifications received by senators and Senate staff. Primary responsibility for security notifications to senators and Senate staff, the report notes, resides with the Senate SAA, who did not send any Senate-wide email alerts during the attack itself. The USCP's email notifications were more numerous, but more than half of them were sent prior to the breach of the Capitol; the USCP also sent the same message, directing individuals to shelter in place, four times between 2:18 pm and 6:44 pm without adding any additional information or context. The report is silent, however, on the experience of House members and staff with House-specific communications. Indeed, while the House Sergeant-at-Arms office is included on the list of entities from which "current and former officials" participated in interviews as part of the probe, the office itself did not comply with the Senate committees' request for information.

While the report is damning in its description of how the intelligence agencies did not effectively seek out and use intelligence in advance of the riot, it doesn't provide answers to some of the obvious questions that arise from that description. Why, for example, was the bulletin from the Norfolk field office the only document the FBI produced warning of danger on Jan. 6?

Or, take the statement by then-FBI Assistant Director Jill Sanborn, quoted in the report, that the FBI was not aware of threats made on social media before Jan. 6 because "we cannot collect First Amendment-protected activities" in the absence of a preexisting investigation. When Sanborn made this comment at a Mar. 2 Senate hearing, it was the subject of a great deal of skepticism from commentators familiar with the FBI's investigatory practices. And indeed, internal FBI guidelines state that FBI employees may conduct internet searches of "publicly available information," the definition of which would include public social media posts, prior to the initiation of a formal investigation. But the Senate report quotes Sanborn without addressing this discrepancy or explaining what the bureau's authorities actually are when it comes to monitoring online posts, even though this would seem to be an important factor in understanding the FBI's failure to prepare for Jan. 6.

This points to another, deeper hole in the committees' analysis. The report discusses egregious failures by various agencies, but it doesn't examine the larger structural factors that created an environment where those failures could take place. Why might it be that the FBI, Department of Homeland Security and Capitol Police were so willing to discount the potential threat posed by a group of largely white Trump supporters, especially compared to the federal government's aggression toward peaceful Black Lives Matter protestors during the summer of 2020? To what extent did they overlook that danger because they did not want to cross the president? For that matter, to what extent was the Justice Department's strange silence during the riot itself a result of the department's desire to placate the president?

These questions will be difficult to answer without a more sustained inquiry into, among other things, the role of Trump and the White House in the events surrounding Jan. 6. And that hurdle may be exactly why they aren't addressed in this report. The document is a product of a bipartisan investigation by two Senate committees, and according to the New York Times, that bipartisanship shaped what the committees did and didn't include. As the Times notes, the report does not "chart [Trump's] actions or motivations, state that his election claims were false or explore the implications of a president and elected leaders in his party stoking outrage among millions of supporters." This explains one of the odder design choices in the report's presentation: Trump's remarks at the Ellipse immediately preceding the riot ("And we fight. We fight like hell. And if you don't fight like hell, you're not going to have a country anymore.") are included in an appendix at the end of the report, but they are not discussed at any length in the body of the document. They are referenced with little detail as part of the timeline of events; President Trump began his address just before noon, the report notes, and during the next 75 minutes, the president "continued his claims of election fraud and encouraged his supporters to go to the Capitol." Essentially, the report just tries to stay as far away from Trump as possible, a tricky task when chronicling a riot that the president sparked with his rhetoric and egged on while it was happening.

Given these unanswered questions, Congress must decide what to do next. The Senate committees that produced this report have pledged to keep investigating, including "continu[ing] to pursue responses from the agencies, offices, and individuals who did not cooperate with the committees' prior requests." But recent experience shows that recalcitrant actors can effectively slow-walk committees' efforts to obtain information.

The lack of full cooperation from the House Sergeant-at-Arms also illustrates the need for the House to continue its own inquiry. Up to now, this investigative work has involved hearings by four separate panels (the Legislative Branch subcommittee of the Committee on Appropriations; the Committee on Oversight and Reform; the Committee on Homeland Security; and the Committee on House Administration) and letters sent singly or jointly by these committees and five more (House Intelligence; House Judiciary; House Armed Services; and the Subcommittees on the Department of Defense and on Interior, Environment and Related Agencies of the House Appropriations Committee). The dispersed nature of the House's investigation, particularly in contrast to the joint nature of the Senate's, is one reason many have pushed for Pelosi to create a select panel in the House to serve as a focal point for the inquiry. These calls have intensified in the wake of Senate Republicans blocking legislation to create an independent commission to investigate the insurrection.

While supporters of a commission have made clear that this report is not a substitute for an independent inquiry, getting one approved will remain a steep uphill battle. Senate Minority Leader Mitch McConnell took the occasion of the report's release to reiterate his opposition to such an inquiry, saying that he was confident in the ability of existing investigations to uncover all actionable facts about the events of Jan. 6. The Senate report does show that existing congressional committees are capable of serious investigation and reflection on what happened on Jan. 6, but it also demonstrates the limitations of those investigations as they currently stand.


COVID-19: Moratorium Madness: Will Challenges to the Eviction Order Force the CDC’s Hand? – JD Supra

As detailed in a recent alert,1 U.S. District Judge Dabney L. Friedrich, in Alabama Association of Realtors v. U.S. Department of Health and Human Services,2 found that the Centers for Disease Control and Prevention (CDC) did not have authority to impose a nationwide eviction moratorium (CDC Order) under the Public Health Service Act (PHSA). Since the decision, there has been a spate of activity in that case and throughout the country regarding the CDC Order and its enforceability. With the CDC Order set to expire on 30 June 2021 and increasingly encouraging national COVID-19 data, it seems less likely that the CDC Order will remain in place for an extended period. Individual states, however, may ultimately keep certain pandemic protection measures in place in order to protect residential tenants. While landlords and tenants await an official decision from the CDC about a further extension of the moratorium, courts will continue to rule on the enforceability of the CDC Order.

In Alabama Association of Realtors, Judge Friedrich determined that while COVID-19 created a serious public health crisis with unprecedented challenges for public health officials, the PHSA did not grant the CDC the legal authority to impose a nationwide eviction moratorium.3

As expected, the U.S. Department of Health and Human Services (DHHS) immediately appealed to the D.C. Circuit Court of Appeals.4 In addition, DHHS asked Judge Friedrich, on an emergency basis, to stay her order pending appeal.5 Given its immediate appeal and emergency motion, DHHS confirmed that, despite the improved public health outlook, it remains resolute in its defense of the nationwide eviction moratorium.

On 14 May 2021, after full briefing, the trial court granted DHHS's emergency motion to stay pending appeal. While Judge Friedrich found that DHHS had little chance of success, she nonetheless granted the motion after determining that there was sufficient risk of irreparable harm if the CDC Order did not remain in place.6

Following the decision, both parties filed briefs in the trial court and the D.C. Circuit. The plaintiffs filed a notice stating that they intended not only to file a motion in the D.C. Circuit but also to file an application to vacate the stay in the Supreme Court of the United States,7 while DHHS filed its opposition in both courts, arguing that the plaintiffs' challenge to the CDC Order is meritless and that the balance of equities favors the government.8

On 2 June 2021, the D.C. Circuit declined to lift the stay, finding that the government made a sufficient showing that it is likely to succeed on the merits.9 In response, the plaintiffs asked the U.S. Supreme Court to stay enforcement of the policy during a further appeal. Submitting an application to Chief Justice John Roberts, who handles emergency appeals from the D.C. Circuit on the court's so-called "shadow docket," the plaintiffs urged the justices to intervene and lift the stay of the CDC Order because the stay "will prolong the severe financial burdens borne by landlords under the moratorium" and the "government's sweeping position is contrary to the text and structure of the statute."10 The shadow docket includes cases that do not proceed via the Supreme Court's normal briefing and argument process. The speed with which the Supreme Court and D.C. Circuit handle the cases could impact any further extension to the CDC Order. Also, with the CDC Order set to expire on 30 June 2021, there is a chance that the pending appeal could conceivably become moot before there are any further rulings from any court.

Another court addressed the CDC Order as impacted by a new Consumer Financial Protection Bureau (CFPB) rule. In April 2021, the CFPB introduced an interim rule to, among other things, help residential tenants facing eviction by requiring debt collectors to provide written notice to delinquent tenants informing them that they may be eligible for relief under the CDC Order. The rule states that the disclosure of the CDC Order must be made in any jurisdiction where the CDC Order applies. A group of property managers filed suit in the Middle District of Tennessee challenging this rule as a violation of their First Amendment rights, arguing that it compelled false speech, and sought a temporary restraining order to block it.

In The Property Management Connection, LLC v. The Consumer Financial Protection Bureau,11 the court first noted that several federal courts, including the 6th Circuit,12 have determined that the CDC exceeded its authority in issuing the CDC Order, thus making the CDC Order inapplicable. The court then addressed the CFPB rule, determining that by its own terms it only applies "during the effective period of the CDC Order," only to tenants to whom the CDC Order reasonably might apply, and only "in jurisdictions in which the CDC Order applies."13 The court further highlighted that the CFPB itself opined that the rule does not apply where the CDC Order is inapplicable. Thus, the court concluded that since binding 6th Circuit precedent invalidates the CDC Order, the CFPB rule, by its own terms, does not apply. Because the CDC Order did not apply, the plaintiffs' claimed First Amendment violation was not viable, according to the court, because the inapplicable rule compels no speech. Therefore, the court found that the plaintiffs could not demonstrate a likelihood of success on the merits, and it denied the requested relief.

In the Middle District of Florida, a group of realtors with more than 200,000 members and a real estate business filed their own federal lawsuit challenging the CDC Order on 17 May 2021. Like earlier suits, the plaintiffs in Florida Association of Realtors, Inc. v. Centers for Disease Control and Prevention14 contend that the CDC overstepped its authority in issuing a national eviction moratorium. The plaintiffs allege that the CDC does not have the authority to be the "landlord-in-chief" and that estimated losses from the CDC Order may exceed tens of millions of dollars.15 At this stage, it is unclear whether the CDC will respond to the complaint before deciding whether to extend the moratorium.

Given DHHS's aggressive defense of the CDC Order in Alabama Association of Realtors, the CDC may well be planning to further extend the moratorium even though numerous courts have determined that it does not have the authority to do so. In any event, landlords and tenants alike will be paying close attention to the CDC's decision, as cases addressing the CDC Order will likely take center stage. Regardless of its prospects for longevity, the CDC Order remains in place in most jurisdictions, but its days seem numbered. Also, no matter the fate of the CDC Order, landlords and tenants will need to monitor state-by-state restrictions. Stay tuned for further updates.

1 See Sean R. Higgins, Edward J. Mikolinski, & Scott G. Ofrias, COVID-19: Federal Judge Rules CDC Not Authorized To Issue Nationwide Eviction Moratorium, K&L GATES HUB (May 10, 2021).

2 Memorandum and Order on Plaintiffs' Motion for Expedited Summary Judgment, Defendants' Motion for Summary Judgment and Partial Motion to Dismiss, Ala. Ass'n of Realtors v. U.S. Dep't of Health & Hum. Servs., 1:20-cv-03377 (D.D.C. May 5, 2021).

3 Id.

4 Notice of Appeal, Ala. Ass'n of Realtors, 1:20-cv-03377. Although the emergency motion appears to have been filed on behalf of all defendants, the memorandum opinion only addresses DHHS.

5 Emergency Motion to Stay, Ala. Ass'n of Realtors, 1:20-cv-03377.

6 Judge Friedrich analyzed the four factors necessary to grant a stay: (1) whether the stay applicant has made a strong showing that he or she is likely to succeed on the merits, (2) whether the applicant will be irreparably injured absent a stay, (3) whether issuance of the stay will substantially injure the other parties interested in the proceeding, and (4) where the public interest lies. Judge Friedrich found that while DHHS did not make a strong showing that it was likely to succeed, the court determined that DHHS made a sufficient showing as to the other three factors, and that "the magnitude of these additional financial losses [if a stay is imposed] is outweighed by DHHS's weighty interest in protecting the public." Memorandum Opinion regarding Defendants' Emergency Motion to Stay, Ala. Ass'n of Realtors v. U.S. Dep't of Health & Hum. Servs., 1:20-cv-03377 (D.D.C. May 14, 2021) at 9.

7 Notice, Ala. Ass'n of Realtors v. U.S. Dep't of Health & Hum. Servs., 1:20-cv-03377 (D.D.C. May 17, 2021).

8 Response in Opposition, Ala. Ass'n of Realtors v. U.S. Dep't of Health & Hum. Servs., No. 21-5093 (D.C. Cir. May 24, 2021).

9 Order, Ala. Ass'n of Realtors v. U.S. Dep't of Health & Hum. Servs., No. 21-5093 (D.C. Cir. June 2, 2021).

10 Application (20A169) to vacate stay, Ala. Ass'n of Realtors v. U.S. Dep't of Health & Hum. Servs., No. 20A-____ (U.S. June 3, 2021).

11 3:21-cv-00359 (M.D. Tenn. 2021).

12 Tiger Lily, LLC v. U.S. Dep't of Hous. & Urb. Dev., 992 F.3d 518 (6th Cir. 2021). In this case, the court also found that defendants had little chance of success on appeal, which was enough to deny the stay. That case remains pending.

13 Memorandum Opinion of the Court, The Prop. Mgmt. Connection, LLC v. The Consumer Fin. Prot. Bureau, 3:21-cv-00359 (M.D. Tenn. May 14, 2021) at 6.

14 8:21-cv-01196-WFJ-SPF (M.D. Fla. 2021).

15 Id.


Why Is Quantum Computing So Hard to Explain? – Quanta Magazine

Quantum computers, you might have heard, are magical uber-machines that will soon cure cancer and global warming by trying all possible answers in different parallel universes. For 15 years, on my blog and elsewhere, I've railed against this cartoonish vision, trying to explain what I see as the subtler but ironically even more fascinating truth. I approach this as a public service and almost my moral duty as a quantum computing researcher. Alas, the work feels Sisyphean: The cringeworthy hype about quantum computers has only increased over the years, as corporations and governments have invested billions, and as the technology has progressed to programmable 50-qubit devices that (on certain contrived benchmarks) really can give the world's biggest supercomputers a run for their money. And just as in cryptocurrency, machine learning and other trendy fields, with money have come hucksters.

In reflective moments, though, I get it. The reality is that even if you removed all the bad incentives and the greed, quantum computing would still be hard to explain briefly and honestly without math. As the quantum computing pioneer Richard Feynman once said about the quantum electrodynamics work that won him the Nobel Prize, if it were possible to describe it in a few sentences, it wouldn't have been worth a Nobel Prize.

Not that that's stopped people from trying. Ever since Peter Shor discovered in 1994 that a quantum computer could break most of the encryption that protects transactions on the internet, excitement about the technology has been driven by more than just intellectual curiosity. Indeed, developments in the field typically get covered as business or technology stories rather than as science ones.

That would be fine if a business or technology reporter could truthfully tell readers, "Look, there's all this deep quantum stuff under the hood, but all you need to understand is the bottom line: Physicists are on the verge of building faster computers that will revolutionize everything."

The trouble is that quantum computers will not revolutionize everything.

Yes, they might someday solve a few specific problems in minutes that (we think) would take longer than the age of the universe on classical computers. But there are many other important problems for which most experts think quantum computers will help only modestly, if at all. Also, while Google and others recently made credible claims that they had achieved contrived quantum speedups, this was only for specific, esoteric benchmarks (ones that I helped develop). A quantum computer thats big and reliable enough to outperform classical computers at practical applications like breaking cryptographic codes and simulating chemistry is likely still a long way off.

But how could a programmable computer be faster for only some problems? Do we know which ones? And what does a "big and reliable" quantum computer even mean in this context? To answer these questions, we have to get into the deep stuff.

Let's start with quantum mechanics. (What could be deeper?) The concept of superposition is infamously hard to render in everyday words. So, not surprisingly, many writers opt for an easy way out: They say that superposition means "both at once," so that a quantum bit, or qubit, is just a bit that can be "both 0 and 1 at the same time," while a classical bit can be only one or the other. They go on to say that a quantum computer would achieve its speed by using qubits to try all possible solutions in superposition, that is, at the same time, or "in parallel."

This is what I've come to think of as the fundamental misstep of quantum computing popularization, the one that leads to all the rest. From here it's just a short hop to quantum computers quickly solving something like the traveling salesperson problem by trying all possible answers at once, something almost all experts believe they won't be able to do.

The thing is, for a computer to be useful, at some point you need to look at it and read an output. But if you look at an equal superposition of all possible answers, the rules of quantum mechanics say you'll just see and read a random answer. And if that's all you wanted, you could've picked one yourself.

What superposition really means is "complex linear combination." Here, we mean "complex" not in the sense of "complicated" but in the sense of a real plus an imaginary number, while "linear combination" means we add together different multiples of states. So a qubit is a bit that has a complex number called an "amplitude" attached to the possibility that it's 0, and a different amplitude attached to the possibility that it's 1. These amplitudes are closely related to probabilities, in that the further some outcome's amplitude is from zero, the larger the chance of seeing that outcome; more precisely, the probability equals the distance squared.
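The amplitude-to-probability rule is easy to state in code. Here is a plain-Python sketch (not a real quantum library; the particular amplitude values are invented for illustration):

```python
import math

# A qubit attaches a complex "amplitude" to each of its two outcomes.
# These particular values are invented for the example.
amp0 = complex(3/5, 0)    # amplitude attached to outcome 0
amp1 = complex(0, 4/5)    # amplitude attached to outcome 1 (purely imaginary)

# The probability of seeing an outcome is the squared distance of its
# amplitude from zero: |amplitude|**2.
p0 = abs(amp0) ** 2       # approximately 0.36
p1 = abs(amp1) ** 2       # approximately 0.64

# A valid state's outcome probabilities sum to 1.
assert math.isclose(p0 + p1, 1.0)
```

Note that the second amplitude is purely imaginary yet still yields an ordinary probability, which is exactly why amplitudes carry more information than probabilities do.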

But amplitudes are not probabilities. They follow different rules. For example, if some contributions to an amplitude are positive and others are negative, then the contributions can interfere destructively and cancel each other out, so that the amplitude is zero and the corresponding outcome is never observed; likewise, they can interfere constructively and increase the likelihood of a given outcome. The goal in devising an algorithm for a quantum computer is to choreograph a pattern of constructive and destructive interference so that for each wrong answer the contributions to its amplitude cancel each other out, whereas for the right answer the contributions reinforce each other. If, and only if, you can arrange that, you'll see the right answer with a large probability when you look. The tricky part is to do this without knowing the answer in advance, and faster than you could do it with a classical computer.
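A minimal sketch of such interference: apply the standard Hadamard gate twice to a qubit that starts as 0. After the first application both outcomes are equally likely; on the second, the two contributions to the amplitude for outcome 1 have opposite signs and cancel, so outcome 1 is never observed:

```python
import math

# The Hadamard gate maps the amplitude pair (a0, a1) to
# ((a0 + a1)/sqrt(2), (a0 - a1)/sqrt(2)).
def hadamard(a0, a1):
    s = 1 / math.sqrt(2)
    return s * (a0 + a1), s * (a0 - a1)

state = (1.0, 0.0)        # start: amplitude 1 for outcome 0, amplitude 0 for 1
state = hadamard(*state)  # equal superposition: both amplitudes 1/sqrt(2)
state = hadamard(*state)  # the two contributions to outcome 1 cancel exactly

p1 = abs(state[1]) ** 2
assert p1 == 0.0          # destructive interference: outcome 1 is never seen
```

A machine governed by probabilities alone could never do this: mixing two 50/50 coins again gives a 50/50 coin, whereas here the second "mix" restores certainty.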

Twenty-seven years ago, Shor showed how to do all this for the problem of factoring integers, which breaks the widely used cryptographic codes underlying much of online commerce. We now know how to do it for some other problems, too, but only by exploiting the special mathematical structures in those problems. It's not just a matter of trying all possible answers at once.

Compounding the difficulty is that, if you want to talk honestly about quantum computing, then you also need the conceptual vocabulary of theoretical computer science. I'm often asked how many times faster a quantum computer will be than today's computers. A million times? A billion?

This question misses the point of quantum computers, which is to achieve better scaling behavior, or running time as a function of n, the number of bits of input data. This could mean taking a problem where the best classical algorithm needs a number of steps that grows exponentially with n, and solving it using a number of steps that grows only as n². In such cases, for small n, solving the problem with a quantum computer will actually be slower and more expensive than solving it classically. It's only as n grows that the quantum speedup first appears and then eventually comes to dominate.
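
The crossover described above is easy to see with raw step counts. These are illustrative cost functions only, not timings of any real machine: an exponential classical cost of 2ⁿ steps against a hypothetical quantum cost of n² steps.

```python
def classical_steps(n):
    # Hypothetical classical algorithm: exponential in the input size.
    return 2 ** n

def quantum_steps(n):
    # Hypothetical quantum algorithm: polynomial (quadratic) in n.
    return n ** 2

for n in (4, 5, 10, 30, 60):
    print(n, classical_steps(n), quantum_steps(n))

# At n = 4 the two are tied (16 steps each); for every n >= 5 the
# exponential cost pulls ahead and never looks back. Any constant
# per-step overhead on the quantum side only shifts the crossover
# to a larger n; it cannot prevent it.
```

This is why "how many times faster?" has no single answer: the ratio between the two columns is itself a function of n, growing without bound.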

But how can we know that there's no classical shortcut, a conventional algorithm that would have similar scaling behavior to the quantum algorithm's? Though typically ignored in popular accounts, this question is central to quantum algorithms research, where often the difficulty is not so much proving that a quantum computer can do something quickly, but convincingly arguing that a classical computer can't. Alas, it turns out to be staggeringly hard to prove that problems are hard, as illustrated by the famous P versus NP problem (which asks, roughly, whether every problem with quickly checkable solutions can also be quickly solved). This is not just an academic issue, a matter of dotting i's: Over the past few decades, conjectured quantum speedups have repeatedly gone away when classical algorithms were found with similar performance.

Note that, after explaining all this, I still haven't said a word about the practical difficulty of building quantum computers. The problem, in a word, is decoherence, which means unwanted interaction between a quantum computer and its environment: nearby electric fields, warm objects, and other things that can record information about the qubits. This can result in premature measurement of the qubits, which collapses them down to classical bits that are either definitely 0 or definitely 1. The only known solution to this problem is quantum error correction: a scheme, proposed in the mid-1990s, that cleverly encodes each qubit of the quantum computation into the collective state of dozens or even thousands of physical qubits. But researchers are only now starting to make such error correction work in the real world, and actually putting it to use will take much longer. When you read about the latest experiment with 50 or 60 physical qubits, it's important to understand that the qubits aren't error-corrected. Until they are, we don't expect to be able to scale beyond a few hundred qubits.
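
A classical analogy conveys the basic idea of encoding one logical bit into many physical ones. Real quantum codes are far subtler, since qubits cannot simply be copied and amplitudes must be protected without being measured, but the classical three-bit repetition code shows the core trade: redundancy buys tolerance of a single fault.

```python
def encode(bit):
    # One logical bit stored redundantly in three physical bits.
    return [bit, bit, bit]

def decode(bits):
    # Majority vote recovers the logical bit despite any single flip.
    return 1 if sum(bits) >= 2 else 0

word = encode(1)
word[0] = 0                 # noise flips one physical bit
assert decode(word) == 1    # the logical bit survives
```

The cost is the same one the article cites for quantum hardware: each protected unit of information consumes several (for quantum codes, potentially thousands of) physical carriers.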

Once someone understands these concepts, I'd say they're ready to start reading, or possibly even writing, an article on the latest claimed advance in quantum computing. They'll know which questions to ask in the constant struggle to distinguish reality from hype. Understanding this stuff really is possible; after all, it isn't rocket science. It's just quantum computing!

See the article here:
Why Is Quantum Computing So Hard to Explain? - Quanta Magazine

With cyberattacks on the rise, organizations are already bracing for devastating quantum hacks – CNBC

Amidst the houses and the car parks sits GCHQ, the Government Communications Headquarters, in this aerial photo taken on October 10, 2005.

David Goddard | Getty Images

LONDON A little-known U.K. company called Arqit is quietly preparing businesses and governments for what it sees as the next big threat to their cyber defenses: quantum computers.

It's still an incredibly young field of research, but some in the tech industry, including the likes of Google, Microsoft and IBM, believe quantum computing will become a reality in the next decade. And that could be worrying news for organizations' cyber security.

David Williams, co-founder and chairman of Arqit, says quantum computers will be millions of times faster than classical computers, and would be able to break into one of the most widely used methods of cryptography.

"The legacy encryption that we all use to keep our secrets safe is called PKI," or public-key infrastructure, Williams told CNBC in an interview. "It was invented in the 70s."

"PKI was originally designed to secure the communications of two computers," Williams added. "It wasn't designed for a hyper-connected world where there are a billion devices all over the world communicating in a complex round of interactions."

Arqit, which is planning to go public via a merger with a blank-check company, counts the likes of BT, Sumitomo Corporation, the British government and the European Space Agency as customers. Some of its team previously worked for GCHQ, the U.K. intelligence agency. The firm only recently came out of "stealth mode," a temporary state of secrecy, and its stock market listing couldn't be more timely.

The past month has seen a spate of devastating ransomware attacks on organizations from Colonial Pipeline, the largest fuel pipeline in the U.S., to JBS, the world's largest meatpacker.

Microsoft and several U.S. government agencies, meanwhile, were among those affected by an attack on IT firm SolarWinds. President Joe Biden recently signed an executive order aimed at ramping up U.S. cyber defenses.

Quantum computing aims to apply the principles of quantum physics, a body of science that seeks to describe the world at the level of atoms and subatomic particles, to computers.

Whereas today's computers use ones and zeroes to store information, a quantum computer relies on quantum bits, or qubits, which can consist of a combination of ones and zeroes simultaneously, something that's known in the field as superposition. These qubits can also be linked together through a phenomenon called entanglement.

Put simply, it means quantum computers are far more powerful than today's machines and are able to solve complex calculations much faster.

Kasper Rasmussen, associate professor of computer science at the University of Oxford, told CNBC that quantum computers are designed to do "certain very specific operations much faster than classical computers."

That is not to say they'll be able to solve every task. "This is not a case of: 'This is a quantum computer, so it just runs whatever application you put on there much faster.' That's not the idea," Rasmussen said.

This could be a problem for modern encryption standards, according to experts.

"When you and I use PKI encryption, we do halves of a difficult math problem: prime factorisation," Williams told CNBC. "You give me a number and I work out what are the prime numbers to work out the new number. A classic computer can't break that but a quantum computer will."
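
The asymmetry Williams describes, easy to multiply, hard to factor, can be sketched with the most naive classical attack, trial division. The primes here are tiny and chosen only for illustration; real public-key moduli are hundreds of digits long, which is what puts them beyond every known classical method (though not beyond Shor's algorithm on a large enough quantum computer).

```python
def factor(n):
    # Naive trial division: try divisors up to sqrt(n).
    # The number of steps grows rapidly with the size of n's
    # smallest prime factor, which is the whole point of RSA-style keys.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

p, q = 10007, 10009       # small primes, for illustration only
n = p * q                 # the "public" number: trivial to compute
assert factor(n) == (p, q)  # recovering p and q takes ~p steps
```

Doubling the number of digits in the key roughly squares the work for trial division, and even the best known classical factoring algorithms still scale super-polynomially, which is why a working quantum computer running Shor's algorithm would change the picture.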

Williams believes his company has found the solution. Instead of relying on public-key cryptography, Arqit sends out symmetric encryption keys, long, random numbers, via satellites, something it calls "quantum key distribution." Virgin Orbit, which invested in Arqit as part of its SPAC deal, plans to launch the satellites from Cornwall, England, by 2023.

Some experts say it will take some time before quantum computers finally arrive in a way that could pose a threat to existing cyber defenses. Rasmussen doesn't expect them to exist in any meaningful way for at least another 10 years. But he's not complacent.

"If we accept the fact that quantum computers will exist in 10 years, anyone with the foresight to record important conversations now might be in a position to decrypt them when quantum computers come about," Rasmussen said.

"Public-key cryptography is literally everywhere in our digitized world, from your bank card, to the way you connect to the internet, to your car key, to IOT (internet of things) devices," Ali Kaafarani, CEO and founder of cybersecurity start-up PQShield, told CNBC.

The U.S. Commerce Department's National Institute of Standards and Technology is looking to update its standards on cryptography to include what's known as post-quantum cryptography, algorithms that could be secure against an attack from a quantum computer.

Kaafarani expects NIST will decide on new standards by the end of 2021. But, he warns: "For me, the challenge is not the quantum threat and how can we build encryption methods that are secure. We solved that."

"The challenge now is how businesses need to prepare for the transition to the new standards," Kaafarani said. "Lessons from the past prove that it's too slow and takes years and decades to switch from one algorithm to another."

Williams thinks firms need to be ready now, adding that post-quantum algorithms, which take public-key cryptography and make it "even more complex," are not the solution. He alluded to a report from NIST which noted challenges with post-quantum cryptographic solutions.

Read more here:
With cyberattacks on the rise, organizations are already bracing for devastating quantum hacks - CNBC

The ‘second quantum revolution’ is almost here. We need to make sure it benefits the many, not the few – The Conversation AU

Over the past six years, quantum science has noticeably shifted, from the domain of physicists concerned with learning about the universe on extremely small scales, to a source of new technologies we all might use for practical purposes. These technologies make use of quantum properties of single atoms or particles of light. They include sensors, communication networks, and computers.

Quantum technologies are expected to impact many aspects of our society, including health care, financial services, defence, weather modelling, and cyber security. Clearly, they promise exciting benefits. Yet the history of technology development shows we cannot simply assume new tools and systems will automatically be in the public interest.

We must look ahead to what a quantum society might entail and how the quantum design choices made today might impact how we live in the near future. The deployment of artificial intelligence and machine learning over the past few years provides a compelling example of why this is necessary.

Let's consider an example. Quantum computers are perhaps the best-known quantum technology, with companies like Google and IBM competing to achieve quantum computation. The advantage of quantum computers lies in their ability to tackle incredibly complex tasks that would take a normal computer millions of years. One such task is simulating molecules' behaviour to improve predictions about the properties of prospective new drugs and accelerate their development.

One conundrum posed by quantum computing is the sheer expense of investing in the physical infrastructure of the technology. This means ownership will likely be concentrated among the wealthiest countries and corporations. In turn, this could worsen uneven power distribution enabled by technology.

Other considerations for this particular type of quantum technology include concerns about reduced online privacy.

How do we stop ourselves blundering into a quantum age without due forethought? How do we tackle the societal problems posed by quantum technologies, while nations and companies race to develop them?

Last year, CSIRO released a roadmap that included a call for quantum stakeholders to explore and address social risks. An example of how we might proceed with this has begun at the World Economic Forum (WEF). The WEF is convening experts from industry, policy-making, and research to promote safe and secure quantum technologies by establishing an agreed set of ethical principles for quantum computing.

Australia should draw on such initiatives to ensure the quantum technologies we develop work for the public good. We need to diversify the people involved in quantum technologies, in terms of the types of expertise employed and the social contexts we work from, so we don't reproduce and amplify existing problems or create new ones.


While we work to shape the impacts of individual quantum technologies, we should also review the language used to describe this second quantum revolution.

The rationale most commonly used to advocate for the field narrowly imagines public benefit of quantum technologies in terms of economic gain and competition between nations and corporations. But framing this as a race to develop quantum technologies means prioritising urgency, commercial interests and national security at the expense of more civic-minded concerns.

It's still early enough to do something about the challenges posed by quantum technologies. It's also not all doom and gloom, with a variety of initiatives and national research and development policies setting out to tackle these problems before they are set in stone.

We need discussions involving a cross-section of society on the potential impacts of quantum technologies on society. This process should clarify societal expectations for the emerging quantum technology sector and inform any national quantum initiative in Australia.


Continue reading here:
The 'second quantum revolution' is almost here. We need to make sure it benefits the many, not the few - The Conversation AU