Archive for March, 2022

Graph + AI Summit 2022: Industry's Only Open Conference For Accelerating Analytics and AI With Graph to Feature Speakers, Use Cases from World's Most…

TigerGraph, Inc.

Virtual Global Event to Take Place May 24-25, 2022; Call for Papers Open Through April 11

REDWOOD CITY, Calif., March 22, 2022 (GLOBE NEWSWIRE) -- TigerGraph, provider of a leading graph analytics platform, today announced the return of Graph + AI Summit, the only open industry conference devoted to democratizing and accelerating analytics, AI, and machine learning with graph algorithms. The virtual global event will take place May 24-25, 2022 and the call for speakers is open through April 11, 2022.

"Graph + AI Summit is a global celebration of the power of graph and AI, bringing together business leaders, domain experts, and developers to explore creative ways to solve problems with graph technology," said Yu Xu, CEO and Founder, TigerGraph. "We will be showcasing real-world examples of graph with AI and machine learning use cases from world-leading banks, retailers, and fintechs. We'll also be revealing all 15 winners of the Graph for All Million Dollar Challenge, an exciting initiative seeking world-changing graph implementations from around the globe. We're looking forward to connecting with global graph enthusiasts this year and hope you'll join us."

Past Graph + AI Summits have attracted thousands of attendees from 70+ countries. Data scientists, data engineers, architects, and business and IT executives from over 182 of the Fortune 500 companies participated in the last event alone. Past speakers from Amazon, Capgemini, Gartner, Google, Microsoft, UnitedHealth Group, JPMorgan Chase, Mastercard, NewDay, Intuit, Jaguar Land Rover, Pinterest, Stanford University, Forrester Research, Accenture, KPMG, Intel, Dell, and Xilinx along with many innovative startups shared how their organizations reaped the benefits of graph.

Graph + AI Summit 2022 Call for Papers Open Through April 11, 2022

Are you building cutting-edge graph technology solutions to help your organization adapt to an uncertain world? Maybe you're an expert in supercharging machine learning and artificial intelligence using graph algorithms. Or maybe you're a business leader who knows the value of overcoming the data silos created by legacy enterprise solutions. If any of these scenarios describe you, or if you have deep knowledge of graph technology, we want you to be a speaker at this year's Graph + AI Summit.

The conference will include keynote presentations from graph luminaries as well as industry and technology tracks. Each track will include beginner, intermediate, and advanced-level sessions. Our audience will benefit from a mix of formal presentations and interactive panel participation. Case studies are particularly welcome. Your submission may include one or more of the following topics:

Artificial intelligence use cases and case studies

Machine learning use cases and case studies

Graph neural networks

Combining Natural Language Processing (NLP) with graph

First-of-a-kind solutions combining AI, machine learning, and graph algorithms

Predictive analytics

Customer 360 and customer journey

Hyper-personalized recommendation engine

Fraud detection, anti-money laundering

Supply chain optimization

Cybersecurity

Industry-specific applications in the internet, eCommerce, banking, insurance, fintech, media, manufacturing, transportation, and healthcare industries.

Please submit your proposal by April 11, 2022 at 12:00 A.M./midnight PT here.

Registration

To register for the event, please visit https://www.tigergraph.com/graphaisummit/.

Graph for All Million Dollar Challenge Winners to be Featured at Graph + AI Summit 2022

Last month, TigerGraph launched the Graph for All Million Dollar Challenge, a global search for innovative ways to harness the power of graph technology and machine learning to solve real-world problems. The challenge brings together brilliant minds to build innovative solutions to better our future with one question: How will you change the world with graph? Since the launch, the challenge has gained major traction worldwide with over 1,000 registrations from 90+ countries so far. TigerGraph will reveal and feature all 15 winners of the challenge at the Graph + AI Summit 2022 event. For more information or to register for the challenge, please visit https://www.tigergraph.com/graph-for-all/.

About TigerGraph

TigerGraph is a platform for advanced analytics and machine learning on connected data. Based on the industry's first and only distributed native graph database, TigerGraph's proven technology supports advanced analytics and machine learning applications such as fraud detection, anti-money laundering (AML), entity resolution, customer 360, recommendations, knowledge graph, cybersecurity, supply chain, IoT, and network analysis. The company is headquartered in Redwood City, California, USA. Start free with tigergraph.com/cloud.

Media Contacts:

North America: Tanya Carlsson, Offleash PR, tanya@offleashpr.com, +1 (707) 529-6139

EMEA: Anne Harding, The Message Machine, anne@themessagemachine.com, +44 7887 682943

The rest is here:
Graph + AI Summit 2022: Industry's Only Open Conference For Accelerating Analytics and AI With Graph to Feature Speakers, Use Cases from World's Most...

Behind the scenes at the Dietrich School’s machine shop – University of Pittsburgh

Take the collaboration between Strang and Assistant Professor Michael Hatridge in the Department of Physics and Astronomy. The two have been especially close partners in the Hatridge lab's years-long effort to create more efficient quantum computers.

"A lot of the things we need are weird enough that they don't exist as commercial objects," said Hatridge. Instead, he works with Strang to experiment with materials, finishings, machining techniques and binding substances to meet the exacting needs of the lab's quantum computer. Those details have a direct influence on the final product, where temperatures are measured in nanokelvin and the computer's operation in microseconds.

"This is a collaboration. It's a conversation back and forth between us and the machine shop," said Hatridge.

And as the Hatridge lab breaks new ground in quantum computing, the shop's machinists are alongside them, learning about new materials and techniques to help those advances happen.

The shop's portfolio also reaches far beyond the campus's physics labs. Artman once helped assemble an entire pontoon raft for geology researchers, and the group's past projects also include a skeleton key for the Allegheny Observatory and camera-filter mounts for volcano photography.

The flexibility and creativity required of the shop's machinists means that Artman has his work cut out for him when trying to hire new machinists. Speaking of Strang and Tomaszewski, "It takes a special person to do this," he said. "Both of these guys could go out into industry and run entire businesses themselves."

But the same traits are what allow the team to contribute to cutting-edge Pitt research.

When Artman's work is part of a scientific breakthrough, he gets to tell his kids that he and his team are doing things that have never been done before. Now, his own daughter is a Pitt psychology major. And, after years of reading physics books his dad brought home, his 14-year-old son aspires to be a physicist.

"No idea where he got that from," said Artman.

Patrick Monahan

See the rest here:
Behind the scenes at the Dietrich School's machine shop - University of Pittsburgh

Elderly care? Bring in the robots! – Modern Diplomacy

What is quantum computing? Why do we need quantum computing? According to Moore's law (the complexity of a microcircuit, measured for example by the number of transistors per chip, doubles every 18 months and hence quadruples every 3 years), the density of transistors per unit area on a computing chip doubles every year and a half, which poses two main problems for traditional computers. Firstly, as to computation, high-density transistors will face the problem of power consumption and thermal effects. Secondly, the reduction in size will cause the failure of the classic theory of transistors, and their performance will deviate from the original design.

Both of these problems will limit the further shrinkage of transistors, thus putting an end to Moore's law. However, even if the traditional computer develops until the end of Moore's law, it will still be unable to cope with many problems that need to be solved. Say we want to calculate the ground-state energy of N coupled two-level systems: the number of unknowns is proportional to 2^N. For a specific computation that Google's 53-qubit quantum computer completes in about 200 seconds, the simulation on IBM's supercomputer currently requires about 2.5 days. "Qubit" is the contraction of "quantum bit", the term coined by Benjamin Schumacher to denote the basic unit of quantum information.
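
To make that exponential scaling concrete, here is a short Python sketch (the byte count per amplitude is an illustrative assumption, not a figure from the article) showing how the number of amplitudes needed to describe N coupled two-level systems grows as 2^N:

```python
# The number of complex amplitudes needed to describe N coupled
# two-level systems grows as 2**N; at an assumed 16 bytes per
# double-precision complex amplitude, the memory for a dense state
# vector quickly becomes unmanageable on a classical computer.
for n in (10, 20, 30, 40, 53):
    amplitudes = 2 ** n
    memory_gb = amplitudes * 16 / 1e9
    print(f"N = {n:2d}: {amplitudes:,} amplitudes (~{memory_gb:,.1f} GB)")
```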

As the number of qubits continues to increase, conventional computers will soon reach a bottleneck, and almost all conventional computations involving quantum mechanics face the same problem. Hence many researchers started thinking about how to use quantum properties themselves as computational resources as early as the 1970s, an idea that Richard Feynman then summarised in 1982.

Hence, what advantages do qubits have over traditional computing? The most surprising are none other than the properties of quantum superposition and quantum entanglement. Quantum superposition is a non-classical state that contrasts with empirical intuition; the usual metaphor is Schrödinger's Cat, which is both alive and dead.

The superposition state, however, is a real state for qubits on microscopic or mesoscopic scales (spatial scales, viewpoints and the like that are intermediate between macroscopic and microscopic scales). Qubits can be found in a superposition of two characteristic quantum states, and this superposition is a non-classical state in which being and non-being coexist in the quantum world. In this state, the qubit is neither definitely 0 nor definitely 1: both outcomes are present, with equal probability, like a coin before it lands on the palm of the hand.
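
As a rough illustration of the coin analogy (a plain NumPy sketch, not tied to any particular quantum SDK), the state (|0⟩ + |1⟩)/√2 assigns equal probability to both outcomes, and repeated simulated measurements come out 0 or 1 about half the time each:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Equal superposition of |0> and |1>, like the coin before it lands.
plus = (ket0 + ket1) / np.sqrt(2)

probabilities = np.abs(plus) ** 2          # Born rule: |amplitude|^2
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probabilities)
print("P(0) ~", (samples == 0).mean(), " P(1) ~", (samples == 1).mean())
```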

While in the visible world it is possible to observe a phenomenon without perceptibly influencing it by observation alone (i.e. only by looking at the phenomenon), in atomic physics and quantum mechanics a finite and, up to a certain point, uncontrollable perturbation accompanies every observation. The uncertainty principle is the recognition of absolute chance and arbitrariness in natural phenomena. On the other hand, as will become clear later, quantum mechanics does not predict a single, well-defined result for the observation or for any observer.

The fact that qubits can undergo quantum evolution in a set of superposition states which are neither 0 nor 1 implies quantum parallelism in the relevant computation. The evolution of each qubit on its own, however, is not sufficient to construct all possible evolutions of a multi-qubit system. We must therefore also let different qubits interact so that they become intertwined, in order to construct a satisfactory algorithm for such a computation. This special superposition is precisely what is called an entangled quantum state.

Take two qubits in a typical entangled state as an example. The state of the first qubit is connected to the state of the second, and the two joint possibilities are themselves in quantum superposition; we therefore cannot speak of the definite state either qubit is in at that moment, and this is why we speak of entanglement.

There is a more practical view of entanglement in quantum computing: entangled states usually arise from the control of one qubit (the control qubit) over another (the target qubit). The relationship between the control qubit and the target qubit is similar to the aforementioned Schrödinger's Cat. According to this view, if the controlling part is in a state of superposition, the controlled part will be in a superposition of the different controlled situations.
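
A minimal sketch of that control/target picture, assuming the standard Hadamard-plus-CNOT construction (not spelled out in the article): putting the control qubit into superposition and then letting it control the target yields the entangled Bell state (|00⟩ + |11⟩)/√2, whose joint state can no longer be written as two separate single-qubit states.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)                 # control = first qubit

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0

# Superpose the control qubit, then entangle it with the target.
state = CNOT @ np.kron(H, np.eye(2)) @ ket00
print(np.round(state, 3))   # [0.707, 0, 0, 0.707] -> (|00> + |11>)/sqrt(2)
```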

This entanglement process is an important element in quantum computing. We can say that superposition and entanglement synergistically weave the varied parallel evolution of quantum computing. Each measurement can only read out one of the possible states, and the superposition state no longer exists after the first measurement. Hence, to obtain the statistical information we need about the superposition state, we have to repeat the computation and the measurement.

Therefore, in many quantum algorithms (such as Shor's factoring algorithm, which solves the problem of decomposing integers into prime factors, and digital quantum simulation), we need to apply interference mechanisms after the computation, so that the amplitude carrying the answer in the superposition state is reinforced by constructive interference while the remaining components are eliminated by destructive interference. In this way, the answer can be obtained with fewer measurements. Most quantum algorithms rely heavily on such interference phenomena, and hence the relative phase, whose preservation is called quantum coherence, is very important for quantum computing. In the hardware design of quantum computers, many considerations relate to how to protect the quantum state so as to prolong the coherence lifetime.
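
The interference idea can be seen in miniature with a single qubit (a toy illustration, not Shor's algorithm itself): two Hadamards in a row make the amplitudes of the unwanted outcome cancel destructively while those of the wanted outcome add constructively, so the result is deterministic again.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

after_one = H @ ket0          # equal superposition of |0> and |1>
after_two = H @ after_one     # |1> amplitudes cancel, |0> recombines to 1

print(np.round(after_one, 3))   # [0.707, 0.707]
print(np.round(after_two, 3))   # [1., 0.]  -- constructive/destructive interference
```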

Quantum computers have a variety of hardware implementations, but the design considerations are similar. There are three common considerations: qubit operability, measurability, and protection of quantum states. In response to these considerations, a cavity quantum electrodynamics (cQED) system has been developed. A superconducting quantum system can be taken as an example to introduce the implementation of quantum computers. The difference in frequency between the resonant cavity and the qubit means that the coupling between the resonant cavity and the qubit tends not to exchange energy quanta, but only to generate entanglement, which means that the frequency of the resonant cavity will shift with the state of the qubit. Hence the state of the qubit can be deduced by measuring the microwave penetration or reflection spectrum near the resonant frequency with the bit readout line.
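
A rough numerical sketch of that state-dependent cavity shift, assuming the standard dispersive approximation χ ≈ g²/Δ (the coupling g and bare cavity frequency below are invented illustrative values; only the ~718 MHz detuning echoes a figure quoted later in the article):

```python
# Dispersive readout sketch: the cavity frequency is pulled up or down
# by chi depending on whether the qubit is in |0> or |1>.
g_mhz = 100.0          # assumed qubit-cavity coupling (illustrative)
delta_mhz = 718.0      # qubit-cavity detuning, order of the figure quoted below
cavity_mhz = 7_000.0   # assumed bare cavity frequency (illustrative)

chi_mhz = g_mhz ** 2 / delta_mhz
print(f"chi ~ {chi_mhz:.1f} MHz")
print(f"cavity reads ~ {cavity_mhz + chi_mhz:.1f} MHz with the qubit in one state, "
      f"~ {cavity_mhz - chi_mhz:.1f} MHz in the other")
```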

The entanglement mechanism between adjacent qubits is provided by the capacitive coupling between the cross-shaped capacitors. The coupling effect is controlled by the frequency difference between adjacent qubits. The oscillating behaviour reflects the quantum interference effect, and its gradual disappearance leads to the decay of coherence and quantum energy.

The coherence lifetime of qubits is influenced by two factors, one intrinsic and one extrinsic. The extrinsic influence comes mainly from the coupling between the qubit and the quantum-state readout circuit. The microwave cavity sitting between the qubit and the readout line provides a filter-like protection mechanism for the qubit, because the cavity and the qubit have a frequency difference of about 718 MHz. The intrinsic influence comes mainly from the loss of the qubit itself and the sensitivity of its frequency to various types of noise, which can usually be suppressed by improved materials and processes and by optimisation of the geometric structure.

Quantum computing has a wide range of applications, currently spanning the fields of decryption and cryptography, quantum chemistry, quantum physics, optimisation problems and artificial intelligence. This covers almost all aspects of human society and will have a significant impact on human life once put into practice. However, the best quantum computers are not yet able to demonstrate the advantages of quantum computing. Although the number of qubits on a quantum computer has exceeded 50, the circuit depth required to run useful algorithms is far from sufficient. The main reason is that the error rate of qubits in the computation process is still very high, even though we can use quantum error correction and fault-tolerant quantum computation. Gradually improving that accuracy will greatly increase the difficulty of producing the hardware and the complexity of the algorithms. At present, the implementation of some well-known algorithms has only reached the level of conceptual demonstration, which is sufficient to show the feasibility of quantum computing, but practical application still has a long way to go.

But we should remain optimistic because, although general-purpose quantum computation still has to wait for improvements in quantum computer hardware, we can still find new algorithms and applications. Moreover, the development of hardware can also make great strides, just like the development of traditional computers in their early days. In line with this goal, many existing technological industries could be upgraded in the near future. Research is moving fast thanks also to significant public and private investment, and the first commercial applications will be seen in the short term.

Considering defence and intelligence issues, many governments are funding research in this area. The People's Republic of China and the United States of America have launched multi-year plans worth billions of yuan and dollars. The European Union has also established the Quantum Flagship Programme with an investment of one billion euros.

Read more here:
Elderly care? Bring in the robots! - Modern Diplomacy

The War of Narratives Over the Origin of COVID-19 and our Virology-Media-Complex – Modern Diplomacy

View post:
The War of Narratives Over the Origin of COVID-19 and our Virology-Media-Complex - Modern Diplomacy

A tale of two universities and two engines – Chess News

[Note that Jon Speelman also looks at the content of the article in video format, here embedded at the end of the article.]

Last Saturday, March 12th, I was at the RAC's clubhouse (Royal Automobile Club) in London's Pall Mall for the annual Varsity match between Oxford and Cambridge Universities.

First played in 1873, this is the world's oldest chess contest and was for years reported on in the pages of the famous Russian chess magazine 64. When I played for Oxford from 1975-7, Cambridge were in the ascendant and we lost all three matches: personally, I lost to Michael Stean and drew twice with Jonathan Mestel. These things swing over time, and at the moment it's very close. Cambridge started as the Elo favourites, but after an endgame save in the last game to finish, Oxford ran out the winners by the narrowest possible margin of 4½-3½, with the overall score now 60-58 to Cambridge with 22 draws.

The 1921 Oxford team | Find more info at BritBase, John Saunders' excellent games archive

The match has been at the RAC now for nearly half a century, with a dinner afterwards, and in recent years internet coverage and commentary on site. This year's commentator was Matthew Sadler, and for some of the afternoon I acted as sous-commentator, chatting with Matthew about the games.

At one stage I mentioned that I normally use Houdini as my analysis engine, but Matthew [pictured], who of course is immensely knowledgeable about computer chess and has written extensively on Alpha Zero, told me that the latest version of Stockfish is much stronger. I therefore decided to switch to it as my default analysis engine in ChessBase, but I'm now wondering (and of course this can be changed with the click of a mouse) whether I was right.

The question of course is how to use the analysis and assessments produced. Most computer engines (Alpha Zero and its daughter Leela are different) are giant bean counters which produce a maximin, maximizing the minimum score they get against the opponent's supposedly best play. Depending on the accuracy of the analysis and the size of the beans, the scores will vary, and while Houdini with its rating of, I dunno, 2700 or 2800 tends to bumble around with assessments quite close to zero, Stockfish thunders its pronouncements, giving assessments like +/- 2.5 in positions which look to my human eye to be fairly but not entirely clear, and going up/down to +/- 6 or more when even my human eye can see that it ought to be winning.
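
As a toy illustration of that bean counting, here is a generic minimax (maximin) sketch in Python, emphatically not how Stockfish or Houdini are actually implemented: each side picks the move that maximises its own worst-case score against the opponent's best replies.

```python
def minimax(node, maximizing):
    # Leaves carry an evaluation score; interior nodes are lists of children.
    if isinstance(node, (int, float)):
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Toy game tree: the side to move gets the best score it can guarantee
# against best play, which is exactly the maximin idea.
tree = [[3, -2], [1, 6], [-4, 0]]
print(minimax(tree, maximizing=True))   # -> 1
```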

The certainty is wondrous but rather unsettling. When I was a kid, I no doubt made the mistake of trying to play the best moves. Nowadays, of course, I know better, and while I will stop and indeed try to work out the best solution in an obviously utterly critical position, most of the time I poddle along choosing decent moves without worrying too much about whether there are better ones. To do this, I've created a story for myself that I can quickly select goodish moves in reasonable positions (of course it's much harder if you're under heavy pressure). But gazing into the face of God, I have to be careful not to be blinded and not to undermine this essential fiction.

So I'm still thinking about what to do. Perhaps, with enough time available, I should use both, analysing with both St Houdini and the deity Stockfish. Certainly when I'm streaming I try much of the time to use my own carbon-based resources and sometimes dip into a fairly hobbled version of Stockfish which isn't too scary. But occasionally, when I want to know the truth, I turn to My Lord Sesse (the Norwegian-based fusion of Stockfish and ridiculously powerful hardware).

One point I should make in general is not to take too much notice of computer assessments, even if they are right. They are extremely relevant to the world's top players when they are doing opening preparation, but for the rest of us they are just a tool. In particular, I've noticed that when people check their games after playing online, there are some engines which dish out ??s like confetti. Of course people do play some terrible moves, especially at blitz, but ?? should mean a move that loses a piece or maybe even a rook, or at a higher level makes a complete mess of the position. It shouldn't mean that the assessment has dropped drastically without, in human terms, affecting the result.
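
To illustrate what dishing out ??s amounts to in practice, a typical online checker simply compares the engine evaluation before and after a move and stamps an annotation past some centipawn threshold; the thresholds below are invented for illustration and are not any particular site's actual values.

```python
def annotate(eval_before_cp, eval_after_cp):
    """Label a move by how far the engine evaluation (for the side that
    moved) drops, in centipawns. Thresholds are purely illustrative."""
    drop = eval_before_cp - eval_after_cp
    if drop >= 300:
        return "??"   # blunder: roughly a piece or worse
    if drop >= 100:
        return "?"    # mistake
    if drop >= 50:
        return "?!"   # inaccuracy
    return ""

print(annotate(20, -310))   # "??" -- the evaluation fell by over three pawns
```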

One reason I go to the Varsity match is to help choose the Best Game and Brilliancy Prize, often with Ray Keene, in this case with Matthew. Both receive works by the artist Barry Martin and, in this case, since the Brilliancy Prize was shared, both players got prints.

Cambridge team: back, left to right: Miroslav Macko, Matthew Wadsworth, Imogen Camp, Harry Grieve. Front, left to right: Jan Petr, Declan Shafi (captain), Ognjen Stefanovic, Koby Kalavannan. | Photo: John Saunders

For the best game, we decided on the board 1 win by Oxford, and I've annotated it, out of interest, using both engines. I've given them a fairly short time to make an assessment, so they might have changed their minds had they worked for a longer period of time, but this experiment nonetheless gives an indication of the huge difference between them.

Continued here:
A tale of two universities and two engines - Chess News