Editorial: Pence and the Constitution survive Jan. 6 – San Antonio Express-News

The man was close to danger; the document was close to destruction.

Forty feet. That was the distance between the mob and the man and the document that governs America.

But on Jan. 6, 2021, America did not look like America. It looked like a country on the verge of anarchy, its ideals upended by a mob that stormed the Capitol and attacked the democracy the building represents.

Here we are, more than a year later. The man was Vice President Mike Pence. The document was the U.S. Constitution. Both survived the Big Lie that voter fraud cost former President Donald Trump the 2020 presidential election. And now a House committee is investigating how close both came to collapsing.

Had either succumbed, whether Pence or the Constitution he swore to protect, it would have thrust this nation and our world into unimaginable upheaval and chaos. America, even in these dark, troubled times, is held up as a model of democracy. The rioters, motivated by the lie of election fraud, sought to destroy that model, replacing it with a man who lost the election but refused to accept defeat and was willing to wrest power from the people, where it belongs.

Of all the heinous acts on Jan. 6, the most chilling may have been the rioters' murderous rage directed at Pence. His crime, in their eyes, was defying Trump, who pressured him to reject the electoral votes Joe Biden won. Rioters chanted "Hang Pence," and Trump agreed, according to U.S. Rep. Liz Cheney, R-Wyo., committee vice chair.

"Make no mistake about the fact that the vice president's life was in danger," said Rep. Pete Aguilar, D-Calif., a committee member.

The pressure on Pence was part of an effort that included attempts to falsify the vote counts in swing states, according to committee findings.

Two Georgia election workers, Ruby Freeman and her daughter, Wandrea "Shaye" Moss, were threatened by conspiracy theorists after the 2020 election.

"There is nowhere I feel safe, nowhere," Freeman said during taped testimony.

Pence has been commended for his integrity, and in normal times, the praise would elevate his political standing. But these are not normal times.

Pence, wary of offending the Trump base should he run for president in 2024, has said Biden is a bigger liar than the president who almost sent him to his death. Such is the sorry state of politics in this era.


Trump’s coup was much more organized than we knew – Salon

"What's the downside for humoring him for this little bit of time? No one seriously thinks the results will change," said one senior Republican official. "He went golfing this weekend. It's not like he's plotting how to prevent Joe Biden from taking power on Jan. 20. He's tweeting about filing some lawsuits, those lawsuits will fail, then he'll tweet some more about how the election was stolen, and then he'll leave."--- November 9, 2020,Washington Post

That senior Republican official is very lucky the journalist agreed to grant anonymity. It may be the most laughably incorrect prediction in history. The January 6 committee hearings are proving in meticulously laid-out detail that Donald Trump plotted to prevent Joe Biden from taking power from the moment he lost the 2020 election. (Actually, he was laying the groundwork long before the election.)

Thursday's revelations came even before the fourth hearing began, when news broke that the FBI had raided the home of Jeffrey Clark, the former Department of Justice (DOJ) official and fierce Trump loyalist whom Trump had wanted to install as acting attorney general in the days before the Capitol riot. It's unclear what crime they suspect Clark of committing, but their suspicions were apparently strong enough to get a judge to issue a search warrant and, presumably, the go-ahead from the highest levels of the DOJ. It may or may not be a coincidence that this warrant was served the day before the Jan. 6 committee was scheduled to publicly devote several hours to Trump's effort to enlist the DOJ in his corrupt plot to overturn the 2020 election, a plot in which Clark was intimately involved.

Once again the witness testimony in the hearing came from Republicans who had been appointed by Trump and had previously demonstrated fealty to him.

RELATED: Trump's unrelenting attacks against dissident Republicans continue with Rusty Bowers

Former Acting Attorney General Jeffrey Rosen, former Deputy Attorney General Richard Donohue and former assistant attorney general for the Office of Legal Counsel Steven Engel all testified on Thursday. Others from the administration appeared via video depositions, including members of the White House Counsel's office. They all testified to the fact that Donald Trump spent weeks pressuring, harassing and threatening them in an attempt to get them to investigate conspiracy theories and issue false statements about the 2020 election. These Republicans resisted every step of the way, making Trump more and more agitated.

"Just say the election was corrupt+ leave the rest to me and the R. Congressmen," Trump exasperatedly instructed on one call,according to Donohue's handwritten notes.If that sounds familiar, it is a deafening echo ofthe demand Trump made to Ukrainian President VolodymyrZelenskywas that all he wanted was for him to hold a press conference announcing an investigation into Joe Biden. That's Trump's M.O.


Jeffrey Clark was an obscure DOJ official who, for most of Trump's term, served as assistant attorney general for the Environment and Natural Resources Division. He was appointed acting head of the Civil Division in September 2020. He was hardly a TV-star lawyer whom Trump would describe as from "central casting," but he was a hardcore Trumpist who suffers from a bad case of Fox News brain rot, so he happily found himself in the middle of the coup plotting after being brought to Trump's attention by a fellow Pennsylvanian, Republican Rep. Scott Perry.

Rosen testified that he was bewildered when Trump brought Clark up in passing on one of his phone calls, but he soon found out that Clark was being groomed by the president to replace him if he didn't do Trump's bidding. Clark broke protocol by scheming with the White House throughout this period rather than going through the proper channels. Another lawyer whom no one had mentioned before the hearing, Ken Klukowski, was scheming with Trump attorney John Eastman and Clark from within the Justice Department, where he had evidently been placed on December 15th to work under the radar. Klukowski drafted a letter the coup plotters wanted Attorney General Bill Barr to send to Georgia and other states saying the DOJ was "investigating various irregularities in the 2020 election for President of the United States," which was misleading at best. (At that point the department would perfunctorily follow up on some of Trump's wild ravings but had long since determined there was no fraud that could have changed the outcome of the election.) The letter also recommended that the Georgia General Assembly convene a special session to approve a new slate of electors, and it indicated that a set of fake Trump electors had already been transmitted to the U.S. Capitol.

What this means is that the plot was not really operating on separate tracks, as previously assumed. We now know that the Department of Justice scheme was entwined with the John Eastman fake-electors and Mike Pence plot. The coup was more organized than we knew.

RELATED: Ringing the alarm for Merrick Garland: Department of Justice stands in real peril because of Trump

Committee Vice Chair Liz Cheney said:

Had this letter been released on official Department of Justice letterhead, it would have falsely informed all Americans, including those who might be inclined to come to Washington on January 6th, that President Trump's election fraud allegations were likely very real.

At the time Klukowski and Clark drafted the letter, Rosen and Donohue were listed as signatories. But they refused to sign it, and they, Engel and several others were called to a meeting at the White House on January 3rd, during which Trump said he planned to replace Rosen with Clark. In fact, the committee displayed White House call logs from that day in which Clark was already being referred to as acting attorney general. The DOJ honchos all told Trump that Clark was unqualified for the job. Needless to say, Trump would not care about that --- he's the president who named Matthew Whitaker, a man much less qualified than Clark, to be acting attorney general after he fired Jeff Sessions. He has long shown that his only criterion for hiring is loyalty to him. (Since they had all been Trump loyalists themselves, perhaps that was an awkward realization.)

They had all agreed prior to the meeting that if Trump carried out this "Sunday Afternoon Massacre" they would quit en masse, taking a whole bunch of top DOJ officials with them. White House Counsel Pat Cipollone was quoted telling the president it was a "murder-suicide pact." Engel said the department would be a "graveyard." Trump would hardly care about any of that, of course. What likely caused him to back off was Engel's argument that installing Clark would trigger mass resignations and make the story the implosion of the Justice Department rather than election fraud.

So much for the M.O. The optics just wouldn't work.

That was the end of the DOJ portion of the plot, but it didn't stop Trump from calling up Donohue shortly after the meeting to ask him to investigate a cockamamie rumor about a truck full of shredded ballots that was in the custody of an ICE agent down in Georgia.

The Clark coup plot may have been thwarted but nothing was going to stop Trump from pushing the Big Lie, no matter what. After all, January 6 was coming up and Trump knew it was going to be wild.


IDC Perspective on Integration of Quantum Computing and HPC – HPCwire

The insatiable need to compress time to insights from massive and complex datasets is fueling demand for the integration of quantum computing into high performance computing (HPC) environments. Such an integration would allow enterprises to accelerate and optimize current HPC applications and processes by simulating and emulating them on today's noisy intermediate-scale quantum (NISQ) computers.

Currently, enterprises are reliant on the advantages that can be achieved using only classical accelerator technology such as GPUs and FPGAs. However, HPC systems are limited in their ability to process and analyze large amounts of data needed to execute multiple workflows, even with the added compute power of classical accelerators. Using quantum computing technologies, not only will enterprises be able to accelerate current HPC processes, but they will also be empowered to solve intractable industry problems beyond the scope of the most advanced classical compute systems.

Today, quantum computing systems are still in early development and far from commercial maturity. Quantum computing hardware vendors are challenged in their ability to stabilize and scale the large number of qubits needed to solve complex problems and to correct the errors caused by decoherence. As a result, NISQ machines cannot yet provide a means for enterprises to realize a quantum advantage, defined by IDC as the ability to solve a problem that has actual value to a business, humanity, or otherwise.

Despite these challenges, enterprises are investing in quantum initiatives to identify use cases and develop algorithms so that they are quantum ready when a fault-tolerant universal machine is realized. As a result, governments, such as those of China, Germany and the US; IT industry leaders such as IBM, Google, Microsoft, and Amazon Web Services (AWS); and private investors are escalating funding for quantum computing to push this technology to new levels of maturity.

IDC expects investments in the quantum computing market will reach nearly $16.4 billion by the end of 2027. IDC believes that these investments will lead to waves of technology innovation and breakthroughs that will allow organizations to apply quantum computing to a diverse and expanding group of use cases that involve the analysis of huge amounts of diverse datasets, exponentially large numbers of variables, and an inexhaustible number of possible outcomes.

The ability to address large-scale use cases with quantum computing rests on the qubit's unique superposition and entanglement properties. Quantum and classical computers both store and compute data as a series of 0s and 1s. In classical computing, this is done using a bit, which can hold the value 0 or 1 but never both simultaneously. A qubit does have this capability, a property referred to as superposition. Through entanglement, a pair of qubits is connected or linked: a change in the state of one qubit results in a simultaneous, predictable change in the other. Combined, superposition and entanglement give qubits the ability to process more data faster, cheaper, and better (more accurately or precisely) than a classical computer. As a result, enterprises can use quantum computing systems to explore new and unique use cases that can accelerate current business processes and workloads.
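
To make the bit-versus-qubit distinction concrete, here is a minimal sketch, not drawn from the article, that represents superposition and entanglement as plain NumPy state vectors; the state names and amplitudes are illustrative only.

```python
# A minimal sketch (not vendor code): superposition and entanglement
# represented as plain state vectors in NumPy.
import numpy as np

# Classical bit: exactly 0 or 1. Qubit: a unit vector a|0> + b|1>.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Superposition: equal amplitudes for |0> and |1> at the same time.
plus = (ket0 + ket1) / np.sqrt(2)

# Entanglement: a two-qubit Bell state (|00> + |11>) / sqrt(2).
# Measuring one qubit immediately fixes the other's outcome.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Outcome probabilities for the basis states |00>, |01>, |10>, |11>:
probs = np.abs(bell) ** 2
print(probs)  # [0.5 0.  0.  0.5] -- only correlated outcomes ever appear
```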

The list of use cases is growing at a rapid pace. It includes performance-intensive computing (PIC) specific use cases that address newly defined problems, refine solutions generated and iterated in the PIC environment, simulate quantum algorithms, and more. Energized by this innovative technology, many enterprises don't want to delay the start of their quantum journey. Approximately 8 out of 10 enterprises that are currently investing, or planning to invest, in quantum computing expect to integrate quantum computing technologies as a hybrid model to enhance their current PIC capabilities. Because of this trend, IDC anticipates that several performance-intensive computing workloads will initially be turbocharged by quantum computing-based accelerators. Yet, in the long term, many of these workloads will eventually cross the computing paradigm and become quantum only.

Quantum and classical hardware vendors are working to develop quantum and quantum-inspired computing systems dedicated to solving HPC problems. For example, using a co-design approach, quantum start-up IQM is mapping quantum applications and algorithms directly to the quantum processor to develop an application-specific superconducting computer. The result is a quantum system optimized to run particular applications such as HPC workloads. In collaboration with Atos, quantum hardware start-up Pasqal is working to incorporate its neutral-atom quantum processors into HPC environments. NVIDIA's cuQuantum Appliance and cuQuantum software development kit provide enterprises the quantum simulation hardware and developer tools needed to integrate and run quantum simulations in HPC environments.

At a more global level, the European High Performance Computing Joint Undertaking (EuroHPC JU) announced its funding for the High-Performance Computer and Quantum Simulator (HPCQS) hybrid project. According to the EuroHPC JU, the goal of the project is to prepare Europe for the post-exascale era by integrating two 100+ qubit quantum simulators into two supercomputers and developing the quantum computing platform, both of which will be accessible via the cloud.

Due to the demand for hybrid quantum-HPC systems, other classical and quantum hardware and software vendors have announced that they too are working to develop hybrid quantum-HPC solutions. For example, compute infrastructure vendor HPE is extending its R&D focus into quantum computing by specializing in the co-development of quantum accelerators. And because quantum software vendor Zapata foresees quantum computing, HPC, and machine learning converging, the company is creating the Orquestra Universal Scheduler to manage task executions on HPC clusters and current HPC resources.

Yet recent results from an IDC survey indicate that approximately 15% of enterprises are still deterred from adopting quantum computing. For quantum computing to take off, a quantum computing workforce made up of quantum scientists, physicists, engineers, developers, and operators needs to evolve. However, this should not deter enterprises from beginning their quantum computing journeys. Instead, hesitant adopters should take advantage of the development and consulting services offered by quantum hardware and software vendors, as well as IT consultants that specialize in quantum computing technologies. The choice is clear: become quantum ready or be left behind. IDC projects that worldwide customer spend for quantum computing will grow to $8.6 billion in 2027.

Authors

Heather West, Ph.D., Senior Research Analyst, Infrastructure Systems, Platforms and Technologies Group, IDC

Ashish Nadkarni, Group Vice President, Infrastructure Systems, Platforms and Technologies Group, IDC

Sample of IDC Reports

Worldwide Quantum Computing Forecast, 2021-2025: Imminent Disruption for the Next Decade

IDC's Worldwide Quantum Computing Taxonomy, 2022

Emerging Trends in End-User Adoption of Quantum Computing-as-a-Service Solutions

2021 Worldwide Quantum Technologies Use Case Report


The Spooky Quantum Phenomenon You’ve Never Heard Of – Quanta Magazine

Perhaps the most famously weird feature of quantum mechanics is nonlocality: Measure one particle in an entangled pair whose partner is miles away, and the measurement seems to rip through the intervening space to instantaneously affect its partner. This "spooky action at a distance" (as Albert Einstein called it) has been the main focus of tests of quantum theory.

"Nonlocality is spectacular. I mean, it's like magic," said Adán Cabello, a physicist at the University of Seville in Spain.

But Cabello and others are interested in investigating a lesser-known but equally magical aspect of quantum mechanics: contextuality. Contextuality says that properties of particles, such as their position or polarization, exist only within the context of a measurement. Instead of thinking of particles' properties as having fixed values, consider them more like words in language, whose meanings can change depending on the context: "Time flies like an arrow. Fruit flies like bananas."

Although contextuality has lived in nonlocality's shadow for over 50 years, quantum physicists now consider it more of a hallmark feature of quantum systems than nonlocality is. A single particle, for instance, is a quantum system "in which you cannot even think about nonlocality," since the particle is only in one location, said Bárbara Amaral, a physicist at the University of São Paulo in Brazil. "So [contextuality] is more general in some sense, and I think this is important to really understand the power of quantum systems and to go deeper into why quantum theory is the way it is."

Researchers have also found tantalizing links between contextuality and problems that quantum computers can efficiently solve that ordinary computers cannot; investigating these links could help guide researchers in developing new quantum computing approaches and algorithms.

And with renewed theoretical interest comes a renewed experimental effort to prove that our world is indeed contextual. In February, Cabello, in collaboration with Kihwan Kim at Tsinghua University in Beijing, China, published a paper in which they claimed to have performed the first loophole-free experimental test of contextuality.

The Northern Irish physicist John Stewart Bell is widely credited with showing that quantum systems can be nonlocal. By comparing the outcomes of measurements of two entangled particles, he showed with his eponymous theorem of 1964 that the high degree of correlation between the particles can't possibly be explained in terms of local hidden variables defining each one's separate properties. The information contained in the entangled pair must be shared nonlocally between the particles.
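
For a sense of what these correlations look like numerically, here is a small sketch using the later CHSH form of the argument rather than Bell's original inequality. It computes the quantum correlation value for a pair of entangled qubits and compares it with the local-hidden-variable bound of 2; the measurement angles and state are textbook choices, not anything taken from the article.

```python
import numpy as np

# Pauli X and Z; measurements are spin components in the X-Z plane.
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

def meas(angle):
    # Observable for a +/-1-valued spin measurement at `angle` in the X-Z plane.
    return np.cos(angle) * Z + np.sin(angle) * X

# Entangled Bell state (|00> + |11>)/sqrt(2)
psi = np.array([1, 0, 0, 1], dtype=float) / np.sqrt(2)

def corr(a, b):
    # Quantum correlation <psi| A(a) (x) B(b) |psi> between the two particles.
    return psi @ np.kron(meas(a), meas(b)) @ psi

a1, a2 = 0.0, np.pi / 2          # Alice's two measurement settings
b1, b2 = np.pi / 4, -np.pi / 4   # Bob's two measurement settings

S = corr(a1, b1) + corr(a1, b2) + corr(a2, b1) - corr(a2, b2)
print(S)  # ~2.828 = 2*sqrt(2); any local-hidden-variable model obeys |S| <= 2
```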

Bell also proved a similar theorem about contextuality. He and, separately, Simon Kochen and Ernst Specker showed that it is impossible for a quantum system to have hidden variables that define the values of all its properties in all possible contexts.

In Kochen and Specker's version of the proof, they considered a single particle with a quantum property called spin, which has both a magnitude and a direction. Measuring the spin's magnitude along any direction always results in one of two outcomes: 1 or 0. The researchers then asked: Is it possible that the particle secretly knows what the result of every possible measurement will be before it is measured? In other words, could they assign a fixed value, a hidden variable, to all outcomes of all possible measurements at once?

Quantum theory says that the magnitudes of the spins along three perpendicular directions must obey the 101 rule: The outcomes of two of the measurements must be 1 and the other must be 0. Kochen and Specker used this rule to arrive at a contradiction. First, they assumed that each particle had a fixed, intrinsic value for each direction of spin. They then conducted a hypothetical spin measurement along some unique direction, assigning either 0 or 1 to the outcome. They then repeatedly rotated the direction of their hypothetical measurement and measured again, each time either freely assigning a value to the outcome or deducing what the value must be in order to satisfy the 101 rule together with directions they had previously considered.

They continued until, in the 117th direction, the contradiction cropped up. While they had previously assigned a value of 0 to the spin along this direction, the 101 rule was now dictating that the spin must be 1. The outcome of a measurement could not possibly return both 0 and 1. So the physicists concluded that there is no way a particle can have fixed hidden variables that remain the same regardless of context.
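
The argument boils down to a bookkeeping constraint: every orthogonal triad of directions must receive exactly one 0 and two 1s. The sketch below checks that constraint by brute force over a small, made-up set of overlapping triads; the direction labels are hypothetical, not the 117 directions Kochen and Specker actually used, and are only meant to show the kind of consistency search involved.

```python
# A toy sketch of the bookkeeping behind the Kochen-Specker argument
# (hypothetical direction labels, not the real 117-direction construction).
from itertools import product

# Orthogonal triads of directions; reusing a direction across triads is
# what eventually forces a contradiction in the full construction.
triads = [
    ("x", "y", "z"),
    ("x", "u", "v"),
    ("y", "u", "w"),
]
directions = sorted({d for t in triads for d in t})

def satisfies_101(assignment):
    # The "101 rule": exactly one 0 (hence two 1s) in every orthogonal triad.
    return all(sum(assignment[d] for d in t) == 2 for t in triads)

# Brute-force search over every fixed 0/1 assignment to the directions.
valid = 0
for values in product((0, 1), repeat=len(directions)):
    if satisfies_101(dict(zip(directions, values))):
        valid += 1

print(valid, "consistent 0/1 assignments exist for this small toy set")
# Kochen and Specker found 117 real directions for which this count is zero,
# so no fixed hidden-variable assignment can exist.
```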

While the proof indicated that quantum theory demands contextuality, there was no way to actually demonstrate this through 117 simultaneous measurements of a single particle. Physicists have since devised more practical, experimentally implementable versions of the original Bell-Kochen-Specker theorem involving multiple entangled particles, where a particular measurement on one particle defines a context for the others.

In 2009, contextuality, a seemingly esoteric aspect of the underlying fabric of reality, got a direct application: One of the simplified versions of the original Bell-Kochen-Specker theorem was shown to be equivalent to a basic quantum computation.

The proof, named Mermin's star after its originator, David Mermin, considered various combinations of contextual measurements that could be made on three entangled quantum bits, or qubits. The logic of how earlier measurements shape the outcomes of later measurements has become the basis for an approach called measurement-based quantum computing. The discovery suggested that contextuality might be key to why quantum computers can solve certain problems faster than classical computers, an advantage that researchers have struggled mightily to understand.
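
For a flavor of how three-qubit measurements produce predictions that no fixed assignment of outcomes can match, here is a minimal NumPy sketch of the GHZ form of Mermin's argument, a standard simplification related to the star construction rather than the construction itself.

```python
# A minimal sketch of the three-qubit (GHZ) form of Mermin's argument.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

# GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def expect(ops):
    # Expectation value of a tensor product of single-qubit observables.
    op = np.kron(np.kron(ops[0], ops[1]), ops[2])
    return np.real(ghz.conj() @ op @ ghz)

print(expect([X, Y, Y]))  # -1
print(expect([Y, X, Y]))  # -1
print(expect([Y, Y, X]))  # -1
print(expect([X, X, X]))  # +1
# Pre-assigned +/-1 outcomes cannot reproduce all four at once: multiplying
# the first three predictions forces X X X = -1, yet quantum theory gives +1.
```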

Robert Raussendorf, a physicist at the University of British Columbia and a pioneer of measurement-based quantum computing, showed that contextuality is necessary for a quantum computer to beat a classical computer at some tasks, but he doesn't think it's the whole story. "Whether contextuality powers quantum computers is probably not exactly the right question to ask," he said. "But we need to get there question by question. So we ask a question that we understand how to ask; we get an answer. We ask the next question."

Some researchers have suggested loopholes around Bell, Kochen and Specker's conclusion that the world is contextual. They argue that context-independent hidden variables haven't been conclusively ruled out.

In February, Cabello and Kim announced that they had closed every plausible loophole by performing a loophole-free Bell-Kochen-Specker experiment.

The experiment entailed measuring the spins of two entangled trapped ions in various directions, where the choice of measurement on one ion defined the context for the other ion. The physicists showed that, although making a measurement on one ion does not physically affect the other, it changes the context and hence the outcome of the second ion's measurement.

Skeptics would ask: How can you be certain that the context created by the first measurement is what changed the second measurement outcome, rather than other conditions that might vary from experiment to experiment? Cabello and Kim closed this sharpness loophole by performing thousands of sets of measurements and showing that the outcomes don't change if the context doesn't. After ruling out this and other loopholes, they concluded that the only reasonable explanation for their results is contextuality.

Cabello and others think that these experiments could be used in the future to test the level of contextuality and hence, the power of quantum computing devices.

"If you want to really understand how the world is working," said Cabello, "you really need to go into the detail of quantum contextuality."


AWS sent edgy appliance to the ISS and it worked just like all the other computers up there – The Register

Amazon Web Services has proudly revealed that the first completely private expedition to the International Space Station carried one of its Snowcone storage appliances, and that the device worked as advertised.

The Snowcone is a rugged shoebox-sized unit packed with storage (specifically, 14 terabytes of solid-state disk), a pair of vCPUs and 4GB of RAM. The latter two components mean the Snowcone can run either EC2 instances or apps written with AWS's Greengrass IoT product. In either case, the idea is that you take a Snowcone into out-of-the-way places where connectivity is limited, collect data in situ and do some pre-processing on location. Once you return to a location where bandwidth is plentiful, it's assumed you'll upload the contents of the Snowcone into AWS and do real work on it there.
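
The collect-locally, upload-later pattern described above looks roughly like the sketch below. It is an illustrative script against a hypothetical bucket and directory layout using boto3, not AWS's actual Snowcone transfer tooling, which has its own sync mechanisms.

```python
# A rough sketch of the edge pattern described above (not AWS Snowcone tooling):
# pre-process data locally while disconnected, then push the reduced result to
# S3 once bandwidth is available. Bucket name, paths and fields are hypothetical.
import gzip
import json
from pathlib import Path

import boto3  # assumes AWS credentials are configured in the environment

RAW_DIR = Path("/data/raw")            # hypothetical local capture directory
OUT_FILE = Path("/data/summary.json.gz")
BUCKET = "example-mission-results"     # hypothetical bucket

def preprocess():
    # Reduce raw readings to a small summary so only the result needs uplink.
    readings = [json.loads(p.read_text()) for p in RAW_DIR.glob("*.json")]
    summary = {
        "samples": len(readings),
        "mean_temp": sum(r["temp"] for r in readings) / max(len(readings), 1),
    }
    with gzip.open(OUT_FILE, "wt") as f:
        json.dump(summary, f)

def upload_when_connected():
    # Run this step only once the device is back on a well-connected network.
    s3 = boto3.client("s3")
    s3.upload_file(str(OUT_FILE), BUCKET, OUT_FILE.name)

if __name__ == "__main__":
    preprocess()
    upload_when_connected()
```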

AWS sent this Snowcone aloft with the crewed Axiom Space mission to the ISS in April 2022. The four astronauts conducted a variety of experiments during their 17-day rotation and stored the resulting data on the Snowcone.

AWS hardened the device to ensure it could survive the trip. Axiom and AWS were able to communicate with the device, which worked as intended and processed data it stored. The cloud colossus has hailed this achievement as proving that processing data on Snowcones can work even in edge locations as extreme as the ISS.

Which is true and yay and all. But let's not forget that the ISS houses myriad computers and has done for years. Running a computer up there does require a combination of rocket science and computer science, but humanity has already well and truly proven it can put them both to work on the space station.

Even for computers that are far more modest than an AWS Snowcone such as the Raspberry Pi.

The Pi Foundation and the European Space Agency have sent several AstroPi machines to the ISS. Just like the AWS device, those units were prepared especially for the rigors of space travel and were used to run multiple workloads.

The Pi guys even revealed an updated design last year, and this week reported that the two units sent aloft in late 2021 have now run 17,168 programs written by young people from 26 countries.

The Register leaves the decision about which is the more impressive and/or inspiring achievement to you.
