Archive for the ‘Quantum Computing’ Category

Stock Price Prediction with Quantum Machine Learning in Python – DataDrivenInvestor

An overview of the challenges and opportunities

Today, we're diving into the intersection of quantum computing and machine learning: quantum machine learning. Our main goal is to compare the performance of a quantum neural network against a simple single-layer MLP for stock price time series forecasting.

To facilitate this project, we'll be using the historical data API endpoint offered by Financial Modeling Prep (FMP), since reliable and accurate data is critical for this kind of analysis. With that said, let's dive into the article.

Let's start by importing the necessary libraries for our analysis. These libraries will provide the basic tools required to explore and implement our project.
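Here's a minimal setup sketch; the exact package list is illustrative and depends on your environment and Qiskit version:

```python
# Core scientific stack
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# For calling the FMP historical-data endpoint
import requests

# Quantum computing toolkit (pip install qiskit)
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector
```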

We've set up our environment by installing the Qiskit library for working with quantum circuits, along with other essential libraries. To extract the data, we'll use the historical data API endpoint provided by Financial Modeling Prep.

FMP's historical data API offers a conveniently accessible endpoint, providing a diverse and extensive collection of historical stock data that proves invaluable at every step of our project. This resource enables us to access a wide range of financial information, enhancing the depth and accuracy of our analysis. Its user-friendly interface and comprehensive dataset contribute significantly to the success and efficiency of our research and implementation.

Now we are going to extract historical data as follows:
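A minimal sketch of that request, assuming FMP's standard historical-price-full endpoint (adjust the URL and parameters to whatever endpoint your plan exposes):

```python
import requests

API_KEY = "YOUR API KEY"   # obtained by creating an FMP account
SYMBOL = "AAPL"            # illustrative ticker

url = (
    "https://financialmodelingprep.com/api/v3/historical-price-full/"
    f"{SYMBOL}?apikey={API_KEY}"
)

data = requests.get(url).json()   # dict with a 'symbol' key and a 'historical' list
print(data["historical"][0])      # the most recent daily bar
```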

Replace YOUR API KEY with your secret API key, which you can obtain by creating an FMP account. The output is a JSON response which looks as follows:

In regular computers, we have tiny switches called digital gates. These switches control how information moves around. They work with basic units of data called bits, which can be either 0 or 1. The gates help computers do calculations and process information. Now, in quantum computers, we use something called qubits instead of bits. Qubits are special because they can be both 0 and 1 at the same time. It's like having a coin that's spinning and showing both heads and tails until you catch it, and then it picks one side.

When we say the wave function collapses, it's just a fancy way of saying the qubit decides to be either 0 or 1 when we check it. We make these qubits using different things like light particles (photons), tiny particles that make up matter (atoms), or even small electrical circuits (Josephson junctions). These are like the building blocks for our special qubits.

These quantum systems (particles or circuits) do some interesting things. They can be in different states at the same time (superposition), connect in a special way (entanglement), and even go through barriers they shouldn't (tunneling).

What's cool is that quantum computers, with their qubits and special behaviors, use certain algorithms to solve some problems faster than regular computers. It's like having a new tool that might help us solve tough puzzles more efficiently in the future.

In traditional computing, we perform operations using basic logic gates like AND, NOT, and OR. These gates work with 0s and 1s, and their rules are based on a simple mathematical system, Boolean algebra over GF(2), which essentially deals with counting modulo 2.

Now, imagine a quantum computer: it also has gates, but these are like supercharged versions. Instead of dealing with simple bits, quantum gates work with quantum bits, or qubits. The math behind these quantum gates involves complex numbers and matrix operations.

Let's take the quantum NOT gate, the X (Pauli-X) gate, as an example. Apply it to a qubit initially in the state |0⟩, and the operator flips it to |1⟩; apply it again, and it goes back to |0⟩. It's a bit like flipping a coin.

There's also the Hadamard gate (H) that does something really cool. Applying it to a qubit initially in the state |0⟩ puts it in a special mix of the |0⟩ and |1⟩ states at the same time. Mathematically, H operates on |0⟩ and converts it into the standard superposition of the basis states, shown below.
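This is the textbook action of the Hadamard gate:

$$
H\,|0\rangle \;=\; \frac{1}{\sqrt{2}}\bigl(|0\rangle + |1\rangle\bigr)
$$

so a subsequent measurement returns 0 or 1 with probability 0.5 each.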

It's like having a coin spinning in the air, showing both heads and tails until it lands.

Now, let's talk about the Controlled-NOT (CNOT) gate. This one works on two qubits. If the first qubit is 1, it flips the second qubit from 0 to 1 or vice versa. It's like a quantum switch that depends on the state of the first qubit.

In the quantum world, things get more interesting. If you have two qubits in a special state, the CNOT gate uniquely rearranges their combinations, creating what we call entanglement. This entanglement is like a special connection between the qubits, making them behave in a coordinated manner.

So, in a nutshell, while regular computers use basic rules with 0s and 1s, quantum computers have these fascinating gates that play with probabilities, mix states, and create connections between qubits, opening up a world of possibilities for solving complex problems more efficiently.

In our project, we place special emphasis on a category of gates known as parameterized gates. These gates exhibit behavior that depends on a specific input parameter, denoted by the symbol θ. Notably, we focus on the rotation gates R_x(θ), R_y(θ), and R_z(θ), each characterized by a unitary matrix as described in the figure below.
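For reference, these are the standard unitary matrices of the single-qubit rotations:

$$
R_x(\theta)=\begin{pmatrix}\cos\tfrac{\theta}{2} & -i\sin\tfrac{\theta}{2}\\ -i\sin\tfrac{\theta}{2} & \cos\tfrac{\theta}{2}\end{pmatrix},\qquad
R_y(\theta)=\begin{pmatrix}\cos\tfrac{\theta}{2} & -\sin\tfrac{\theta}{2}\\ \sin\tfrac{\theta}{2} & \cos\tfrac{\theta}{2}\end{pmatrix},\qquad
R_z(\theta)=\begin{pmatrix}e^{-i\theta/2} & 0\\ 0 & e^{i\theta/2}\end{pmatrix}
$$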

Let's delve a bit deeper into these rotation gates. Consider R_x(θ): envision it as a quantum gate resembling a rotating door that rotates a qubit about the x-axis by a specific angle θ. The R_y(θ) and R_z(θ) gates function similarly, introducing rotations around the y and z axes.

The significance of these gates lies in their parameterized nature. By adjusting the input parameter θ, we essentially introduce a customizable element into our quantum algorithms. These gates serve as the foundational components for constructing the quantum neural network integral to our project.

In essence, θ acts as a tuning parameter, akin to a knob, enabling us to finely adjust and tailor the behavior of our quantum algorithms within the framework of the quantum neural network. This flexibility becomes pivotal in optimizing and customizing the performance of our quantum algorithms for specific tasks.

Quantum algorithms can be thought of as a series of operations performed on a quantum state, represented by expressions like |ψ_out⟩ = U_k … U_2 U_1 |ψ_in⟩.

These algorithms are translated into quantum circuits, as illustrated in the figure below. In this depiction, the algorithm starts from the initial state |q_0 q_1⟩ = |00⟩ and concludes with a measurement resulting in either |00⟩ or |11⟩, each with probability 0.5, recorded into classical bits (line c).
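Here is a minimal Qiskit sketch of that exact circuit (a Hadamard on q_0 followed by a CNOT and a measurement); it is an illustration rather than the article's original code:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

bell = QuantumCircuit(2, 2)
bell.h(0)        # put q_0 into an equal superposition
bell.cx(0, 1)    # entangle q_1 with q_0 via a CNOT

# Exact pre-measurement probabilities: {'00': 0.5, '11': 0.5}
print(Statevector.from_instruction(bell).probabilities_dict())

bell.measure([0, 1], [0, 1])   # record the outcome into the classical bits (line c)
```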

In a quantum circuit, each horizontal line corresponds to a single qubit, and gates are applied sequentially until measurement. It's important to note that loops are not allowed in a quantum program. A specific type of quantum circuit is the Variational Quantum Circuit (VQC). Notably, a VQC incorporates parameterized gates like the aforementioned R_x(θ), R_y(θ), and R_z(θ).

In simpler terms, quantum algorithms are like step-by-step instructions for a quantum computer, and quantum circuits visually represent these steps. The Variational Quantum Circuit introduces a special kind of flexibility with parameterized gates, allowing for customization based on specific values, denoted by θ.

The primary objective of QML is to devise and deploy methods capable of running on quantum computers to address conventional supervised, unsupervised, and reinforcement learning tasks encountered in classical Machine Learning.

What makes QML distinct is its utilization of quantum operations, leveraging unique features like superposition, tunneling, entanglement, and quantum parallelism inherent to Quantum Computing (QC). In our study, we specifically concentrate on Quantum Neural Network (QNN) design. A QNN serves as the quantum counterpart of a classical neural network.

Breaking it down, each layer in a QNN is a Variational Quantum Circuit (VQC) comprising parameterized gates. These parameters act as the quantum equivalents of the weights in a classical neural network. Additionally, the QNN incorporates a mechanism to exchange information among existing qubits, resembling the connections between neurons in different layers of a classical network. Typically, this information exchange is achieved through entanglements, employing operators such as the CNOT gate.

Creating a Quantum Machine Learning (QML) model typically involves several steps, as illustrated in Figure above. First, we load and preprocess the dataset on a classical CPU. Next, we use a quantum embedding technique to encode this classical data into quantum states on a Quantum Processing Unit (QPU) or quantum hardware. Once the classic data is represented in quantum states, the core model, implemented in the ansatz, is executed, and the results are measured using classical bits. Finally, if needed, we post-process these results on the CPU to obtain the expected model output. In our study, we follow this overall process to investigate the application of a Quantum Neural Network for time series forecasting.

A Quantum Neural Network (QNN) typically consists of three main layers:

1. Input Layer: This layer transforms classical input data into a quantum state. It uses a parameterized variational circuit with rotation and controlled-rotation gates to prepare the desired quantum state for a given input. This step, known as quantum embedding, employs techniques like basis encoding, amplitude encoding, Hamiltonian encoding, or tensor product encoding.

2. Ansatz Layer: The heart of the QNN, this layer contains a Variational Quantum Circuit, repeated L times to emulate L classical network layers. It's responsible for processing and manipulating quantum information.

3. Output Layer: This layer performs measurements on qubits, providing the final expected outcome.

For the input layer, we use a tensor product encoding technique. It involves a simple X-rotation gate for each qubit, where the gate parameter is set by scaling the classical data to the range [-π, π]. Although it's a quick and straightforward encoding method (O(1) operations), it has a limitation: the number of qubits needed scales linearly with the input data. To address this, we introduce learnable parameters for scaling and bias in the input data, enhancing the flexibility of the quantum embedding. In Figure 3, you can see an example of the input layer for a network with 3 qubits, where the classical data features, the input scale parameters, and the bias parameters come into play.
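A rough Qiskit sketch of this tensor product encoding is shown below; the function and parameter names (scale, bias) are illustrative rather than taken from the original implementation, and the features are assumed to be pre-scaled to [-π, π]:

```python
import numpy as np
from qiskit import QuantumCircuit

def input_layer(features, scale, bias):
    """Encode one classical sample into n qubits, one RX rotation per qubit.

    features, scale, bias: 1-D arrays of length n.
    The rotation angle for qubit i is scale[i] * features[i] + bias[i].
    """
    n = len(features)
    qc = QuantumCircuit(n)
    for i in range(n):
        qc.rx(scale[i] * features[i] + bias[i], i)
    return qc

# Example: 3 features -> 3 qubits, with trivial (untrained) scale and bias
circuit = input_layer(np.array([0.3, -1.2, 2.0]), np.ones(3), np.zeros(3))
```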

Regarding the ansatz, it's interesting to note that, unlike classical neural networks, there isn't a fixed set of quantum layer structures commonly found in the literature (such as fully connected or recurrent layers). The realm of possible gates for quantum information transfer between qubits is extensive, and the optimal organization of these gates for effective data transfer is an area that hasn't been thoroughly explored yet.

In our project, we adopt the Real Amplitudes ansatz, a choice inspired by its success in various domains like policy estimation for quantum reinforcement learning and classification. This ansatz begins with full X/Y/Z parameterized rotation gates, akin to the quantum version of connection weights. It is then followed by a series of CNOT gates arranged in a ring structure to facilitate qubit information transfer. Figure 4 provides a visual representation of how this ansatz is implemented, serving as the quantum equivalent of a network layer for a 3-qubit network.

To break it down, a quantum network layer in our work involves a set of parameters totaling 3 times the number of qubits (3n), where n is the number of qubits in the quantum network.
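A hedged Qiskit sketch of one such layer follows: full RX/RY/RZ rotations on every qubit, then a ring of CNOTs. The parameter naming and exact gate ordering are illustrative, not necessarily the precise circuit shown in Figure 4:

```python
from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector

def ansatz_layer(n_qubits, layer_index=0):
    """One variational layer: 3*n rotation parameters plus a ring of CNOTs."""
    theta = ParameterVector(f"theta_{layer_index}", 3 * n_qubits)
    qc = QuantumCircuit(n_qubits)
    for q in range(n_qubits):
        qc.rx(theta[3 * q], q)        # quantum analogue of connection weights
        qc.ry(theta[3 * q + 1], q)
        qc.rz(theta[3 * q + 2], q)
    for q in range(n_qubits):
        qc.cx(q, (q + 1) % n_qubits)  # ring entanglement for information transfer
    return qc

layer = ansatz_layer(3)   # 9 trainable parameters for a 3-qubit network
```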

Now, let's talk about the output layer, which is a critical part of our quantum model. In quantum computing, when we want to extract information from our quantum state, we often perform measurements using a chosen observable. One such commonly used observable is the σ_z (Pauli-Z) operator over the computational basis. To understand this, think of it as a way to extract information from our quantum state.

The network output is determined by calculating the expectation of this observable over our quantum state. This is expressed as ⟨ψ|σ_z|ψ⟩, where ⟨ψ| denotes the conjugate transpose of |ψ⟩. The result falls within the range of [-1, 1].

No need to stress over those complex mathematical equations: our trusty library, Qiskit, has got it covered! Qiskit will handle all the intricate quantum calculations seamlessly, making the quantum computing process much more accessible for us. So, you can focus on exploring the quantum world without getting bogged down by the nitty-gritty math.
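As a small illustration of what that looks like in practice (a toy one-qubit circuit evaluated with Statevector; a full QNN would instead use a Qiskit primitive such as Estimator):

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp, Statevector

qc = QuantumCircuit(1)
qc.rx(0.7, 0)   # some parameterized state preparation

state = Statevector.from_instruction(qc)
z_expectation = state.expectation_value(SparsePauliOp("Z"))
print(z_expectation.real)   # a value in [-1, 1] (here, cos(0.7), roughly 0.76)
```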

Now, to make our network output less sensitive to biases and scales inherent in the dataset, we introduce a final scale parameter and bias to be learned. This step adds a layer of adaptability to our model, allowing it to fine-tune and adjust the output based on the specific characteristics of our data. The entire model architecture is visually represented in the figure below.

The training of our proposed Quantum Neural Network (QNN) happens on a regular CPU using classical algorithms like the Adam optimizer. The CPU handles the gradient computation through traditional propagation rules, while on the Quantum Processing Unit (QPU), we calculate the gradient using the parameter-shift rule. It's a bit like having a dual system where the CPU manages the main training, and the QPU comes into play for specific quantum computations.
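For rotation gates generated by a Pauli operator, the parameter-shift rule takes the standard form

$$
\frac{\partial \langle \hat{O}\rangle}{\partial \theta}
= \frac{1}{2}\Bigl(\langle \hat{O}\rangle_{\theta+\pi/2} - \langle \hat{O}\rangle_{\theta-\pi/2}\Bigr),
$$

i.e., the gradient is obtained by running the same circuit twice with the parameter shifted by ±π/2 and taking half the difference of the measured expectations.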

Visualize the training process pipeline in Figure 6, where one group of parameters holds the scale/bias of the input layer, another corresponds to the parameters of the layers containing the ansatz, and a third holds the scale/bias applied to the network outputs. This orchestration ensures a cohesive training approach, leveraging both classical and quantum computing resources.

As a Quantum Neural Network (QNN) operates as a feedforward model, our initial step involves defining a time horizon, denoted as T. To adapt the time series data for the QNN, we transform it into a tabular format. Here, the target is the time series value at time t, denoted as x(t), while the inputs encompass the values x(t-1), x(t-2), …, x(t-T). This restructuring facilitates the model's understanding of the temporal relationships in the data, allowing it to make predictions based on past values.
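A small pandas sketch of this restructuring (column names and the example horizon are illustrative):

```python
import pandas as pd

def make_supervised(series: pd.Series, horizon: int) -> pd.DataFrame:
    """Turn a univariate series into a table of lagged inputs and a target.

    Each row holds the inputs x(t-1), ..., x(t-T) and the target x(t).
    """
    frame = {f"x(t-{lag})": series.shift(lag) for lag in range(1, horizon + 1)}
    frame["x(t)"] = series
    return pd.DataFrame(frame).dropna()

# Example with T = 4 on a closing-price series:
# table = make_supervised(df["close"], horizon=4)
```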

First, we fetch the data using the historical Data API endpoint provided by Financial Modeling Prep as follows:
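A sketch of that step, again assuming the historical-price-full endpoint and flattening its 'historical' list into a DataFrame (not the article's exact code):

```python
import pandas as pd
import requests

API_KEY = "YOUR API KEY"
SYMBOL = "AAPL"   # illustrative ticker

url = (
    "https://financialmodelingprep.com/api/v3/historical-price-full/"
    f"{SYMBOL}?apikey={API_KEY}"
)
hist = requests.get(url).json()["historical"]

df = pd.DataFrame(hist)                 # one row per trading day
df["date"] = pd.to_datetime(df["date"])
df = df.sort_values("date").reset_index(drop=True)
print(df[["date", "close"]].head())
```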

The output is a Pandas dataframe which looks something like this (before that, make sure to replace YOUR API KEY with your secret API key):

Follow this link:
Stock Price Prediction with Quantum Machine Learning in Python - DataDrivenInvestor

The Revolutionary Tech Supercharging Gains In the Age of AI – InvestorPlace

Editor's note: The Revolutionary Tech Supercharging Gains In the Age of AI was previously published in November 2023. It has since been updated to include the most relevant information available.

Artificial intelligence (AI) is not just a buzzword; it is a reality that will transform every aspect of our daily lives in the coming years. It will revitalize industries from healthcare to education, from entertainment to cybersecurity, and offer new possibilities currently unheard of.

One possibility comes from an area hardly anyone is talking about right now…

Quantum computing (QC).

But to understand why its implications are so massive, we have to first understand what makes AI models run. At their core, AI models are like cars. They have an engine: the computer on which the models run. And they have fuel: the volume of data the model is trained on.

Obviously, the better the engine in a car and the more fuel it has, the better and farther that car will drive.

It's the same with AI.

The better the engine of an AI model (computing power) and the more fuel it has (data), the better that model will perform.

The top-secret tech I'm referring to is all about radically upgrading the computing power AI models have.

And Bank of America's head of global thematic investing, Haim Israel, has said this technology could create a revolution for humanity bigger than fire, bigger than the wheel.

That's because this tech will essentially drive everything in the emerging Age of AI.

I'll start by saying that the underlying physics of this breakthrough, quantum mechanics, is highly complex. It would likely require over 500 pages to fully understand.

But, alas, here's my best job at making a CliffsNotes version in 500 words instead.

For centuries, scientists have developed, tested, and validated the laws of the physical world, known as classical mechanics. These scientifically explain how and why things work, where they come from, so on and so forth.

But in 1897, J.J. Thomson discovered the electron. And he unveiled a new, subatomic world of super-small things that didn't obey the laws of classical mechanics at all. Instead, they obeyed their own set of rules, which have since become known as quantum mechanics.

The rules of quantum mechanics differ from those of classical mechanics in two very weird, almost-magical ways.

First, in classical mechanics, objects are in one place at one time. You are either at the store or at home, not both.

But in quantum mechanics, subatomic particles can theoretically exist in multiple places at once before they're observed. A single subatomic particle can exist at point A and point B at the same time until we observe it. And at that point, it only exists at either point A or point B.

So, the true location of a subatomic particle is some combination of all its possible positions.

This is called quantum superposition.

Second, in classical mechanics, objects can only work with things that are also real. You can't use an imaginary friend to help move the couch. You need a real friend instead.

But in quantum mechanics, all of those probabilistic states of subatomic particles are not independent. They're entangled. That is, if we know something about the probabilistic positioning of one subatomic particle, then we know something about the probabilistic positioning of another subatomic particle, meaning that these already super-complex particles can actually work together to create a super-complex ecosystem.

This is called quantum entanglement.

So in short, subatomic particles can theoretically have multiple probabilistic states at once, and all those probabilistic states can work together, all at once, to accomplish their task.

And that, in a nutshell, is the scientific breakthrough that stumped Einstein back in the early 1900s.

It goes against everything classical mechanics had taught us about the world. It goes against common sense. But it's true. It's real. And now, for the first time ever, we are learning how to harness this unique phenomenon to change everything about everything…

This is why the U.S. government is pushing forward on developing a National Quantum Internet in southwest Chicago. It understands that this tech could be more revolutionary than the discovery of fire or the invention of the wheel.

I couldn't agree more.

Mark my words. Everything will change over the next few years because of quantum mechanics, and some investors will make a lot of money.

The study of quantum theory has led to huge advancements over the past century. That's especially true over the past decade. Scientists at leading tech companies have started to figure out how to harness the power of quantum mechanics to make a new generation of super quantum computers. And they're infinitely faster and more powerful than even today's fastest supercomputers.

And in fact, Haim Israel, managing director of research at Bank of America, believes that: "By the end of this decade, the amount of calculations that we can make [on a quantum computer] will be more than the atoms in the visible universe."

Again, the physics behind quantum computers is highly complex, but here's my shortened version…

Today's computers are built on top of the laws of classical mechanics. That is, they store information on what are called bits, which can store data binarily as either 1 or 0.

But what if you could turn those classical bits into quantum bits (qubits) that leverage superposition to store both 1 and 0 at once?

Further, what if you could leverage entanglement and have all multi-state qubits work together to solve computationally taxing problems?

Theoretically, you'd create a machine with so much computational power that it would make today's most advanced supercomputers seem ancient.

That's exactly what's happening today.

Google has built a quantum computer that is about 158 million times faster than the world's fastest supercomputer.

That's not hyperbole. That's a real number.

Imagine the possibilities if we could broadly create a new set of quantum computers that are 158 million times faster than even today's fastest computers…

Imagine what AI could do.

Today, AI is already being used to discover and develop new drugs and automate manual labor tasks like cooking, cleaning, and packaging products. It is already being used to write legal briefs, craft ads, create movie scripts, and more.

And that's with AI built on top of classical computers.

But built upon quantum computers, which are 158 million times faster than classical computers, AI will be able to do nearly everything.

The economic opportunities at the convergence of AI and QC are truly endless.

Quantum computing is a game-changer that's flying under the radar.

It's not just another breakthrough; it's the seismic shift we've been waiting for, rivaling the impact of the internet and the discovery of fire itself.

And we think the top stocks at the convergence of AI and QC have a realistic opportunity to soar 1,000% over the next few years alone.

That's why we're laser-focused on finding the best stocks the industry has to offer.

Plus, considering the likelihood that early 2024's slump is nearly over, we think some great buying opportunities are fast approaching, too.

Find out which names we've got on our shopping list.

On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.

P.S. You can stay up to speed with Luke's latest market analysis by reading our Daily Notes! Check out the latest issue on your Innovation Investor or Early Stage Investor subscriber site.

Originally posted here:
The Revolutionary Tech Supercharging Gains In the Age of AI - InvestorPlace

Honeywell Dives into Quantum Computing with Investment in $5 Billion Company – Embedded Computing Design

By Ken Briodagh

Senior Technology Editor

Embedded Computing Design

January 19, 2024

News

Honeywell has joined a $300 million equity fundraise for Quantinuum, an integrated quantum computing company, at a pre-money valuation of $5 billion. The technology giant was joined by JPMorgan Chase, Mitsui & Co., and Amgen, though Honeywell remains the company's majority shareholder. This investment brings Quantinuum to about $625 million in investments, according to the release.

This was the first funding round for Quantinuum since Cambridge Quantum Computing and Honeywell Quantum Solutions merged in November 2021 to form the company. According to the announcement, the money will be used to pursue the company's goal of building the world's first universal fault-tolerant quantum computers.

JPMorgan Chase has been a supporter and advisor since the beginning and reportedly was one of the earliest experimental users of Quantinuum's H-Series quantum processor and one of the most active corporate partners using Quantinuum's SDK, TKET.

"Financial services has been identified as one of the first industries that will benefit from quantum technologies," said Lori Beer, Global Chief Information Officer, JPMorgan Chase. "We look forward to continuing to work together to positively impact our businesses, customers and the industry at large."

Quantinuum's technologies reportedly are in use at many companies, including Airbus, BMW Group, Honeywell, HSBC, JPMorgan Chase, Mitsui and Thales. These organizations are exploring how to engineer and scale quantum capabilities to help solve some of the world's most challenging problems, from designing and manufacturing hydrogen cell batteries for transportation to developing materials that sequester carbon safely from the atmosphere to support the world's energy transition. Quantinuum is also at the forefront of developing Quantum Natural Language Processing, which will help enable the next generation of AI to be scalable and fit for purpose.

"The successful completion of this investment round is a testament to Quantinuum's evolution and maturation in the quantum space," said Darius Adamczyk, Executive Chairman of Honeywell and Chairman of the Board of Quantinuum.

J.P. Morgan Securities LLC served as exclusive placement agent to Quantinuum in connection with the financing. Freshfields Bruckhaus Deringer US acted as external legal counsel.

"The confidence in our business demonstrated through this investment by our longstanding strategic partners and industry leaders is a clear indication of the value we will continue to create with the world's highest performing quantum computers, groundbreaking middleware to accelerate the developer ecosystem and innovative application software to revolutionize fields like cryptography, computational chemistry, and AI," said Rajeeb Hazra, CEO of Quantinuum.

Ken Briodagh is a writer and editor with two decades of experience under his belt. He is in love with technology and if he had his druthers, he would beta test everything from shoe phones to flying cars. In previous lives, he's been a short order cook, telemarketer, medical supply technician, mover of the bodies at a funeral home, pirate, poet, partial alliterist, parent, partner and pretender to various thrones. Most of his exploits are either exaggerated or blatantly false.


Read the original here:
Honeywell Dives into Quantum Computing with Investment in $5 Billion Company - Embedded Computing Design

Navigating the Future: NVIDIA and Quantum Computing Unraveled – Medium

NVIDIA's Quantum Leap into the Future

In the fast-paced world of tech, two game-changers, NVIDIA and quantum computing, are teaming up to redefine what computers can do. NVIDIA, known for its powerful graphics processing units (GPUs), has been a trailblazer in visual computing. Quantum computing, a revolutionary approach to processing information, is promising to take computational capabilities to unprecedented heights. This article delves into the exciting collaboration between NVIDIA and quantum computing, shedding light on how this partnership is set to reshape the landscape of computing.

NVIDIA's GPU Marvels:

NVIDIA has a rich history of pushing the boundaries of GPU technology. Originally known for graphics in gaming, their GPUs are now a versatile force in various fields like AI, scientific research, and data analytics. NVIDIA's CUDA platform has become a go-to for developers, giving them a powerful toolkit for GPU programming.

Quantum Computing: A New Way of Computing:

Quantum computing operates on entirely different principles than the computers we're used to. Instead of regular bits, which can be 0 or 1, quantum computers use qubits, which can exist in a superposition of both values at once. This unique feature enables quantum computers to tackle certain complex problems at speeds that classical computers can only dream of. Although quantum computing is still in its early days, its potential applications in cryptography, problem-solving, and simulations are groundbreaking.

Collaboration Unveiled:

Understanding the game-changing potential of quantum computing, NVIDIA is strategically partnering with quantum technology developers. The goal is to create hybrid solutions that blend the strengths of classical and quantum computing. One notable collaboration involves Rigetti Computing, a company focused on quantum hardware and software. By combining Rigetti's quantum processors with NVIDIA's GPUs, the partnership aims to provide a flexible platform for quantum experimentation and optimization.

Quantum Machine Learning on the Horizon:

The collaboration between NVIDIA and quantum computing extends into the realm of quantum machine learning (QML). By pairing NVIDIA's GPUs, already vital in advancing machine learning, with quantum computing workloads, the synergy could speed up the development of quantum machine learning algorithms, ushering in a new era of artificial intelligence.

Challenges and What Lies Ahead:

Despite the promise, challenges remain. Quantum computers are finicky and need extremely low temperatures to operate smoothly. Building error-resistant quantum gates is also proving to be a significant hurdle.

Looking to the future, the NVIDIA and quantum computing collaboration is poised to redefine what computers can achieve. As quantum technologies mature and become more accessible, the fusion of classical and quantum computing is likely to change how we approach complex problem-solving.

Conclusion:

NVIDIA's venture into quantum computing signifies a leap into the future. The partnership between NVIDIA and quantum computing pioneers showcases a commitment to pushing the boundaries of technology. As these two forces join hands, we are on the brink of a computing revolution that promises to reshape industries, solve problems we once thought unsolvable, and drive us into a new era of innovation.

See the original post:
Navigating the Future: NVIDIA and Quantum Computing Unraveled - Medium

Trust Stamp publishes a White Paper on the potential impact of Quantum Computing on legacy biometric systems – GlobeNewswire

Atlanta, Georgia, Jan. 05, 2024 (GLOBE NEWSWIRE) -- Trust Stamp (Nasdaq: IDAI), the Privacy-First Identity Company™ providing AI-powered trust and identity services, has published a White Paper discussing the potential vulnerabilities of legacy biometric systems given the development of Quantum Computing systems.

Dr. Niel Kempson, CB, FREng, Trust Stamp's Executive Advisor on Technical Capability, commented, Methods currently used to protect communications over the Internet will be secure until quantum computers become a practical reality - experts generally estimate that this is a decade away.

But, we should recognise that a harvest now, decrypt later (HNDL) approach could be executed by resource-rich adversaries, capturing data now that can subsequently be decrypted when quantum computers become available. This would make sense for data that would still have significant value a decade or more in the future. While this has traditionally been the preserve of nation-state actors, cyber-criminals - sometimes state-supported - are now powerful adversaries too.

Financial institutions and others with sensitive data really should question whether they could be a potential target and whether their current implementations would be vulnerable to an HNDL attack. This is especially relevant to biometric systems where biometric data needs protecting carefully for a lifetime - unlike a password, a face or fingerprint cannot easily be reset when compromised.

As an indication of the immediacy of the HNDL risk, in May 2022 the US Government issued a mandate to all US Federal Agencies maintaining sensitive data to deploy symmetric encryption systems to protect quantum vulnerable systems by the end of 2023.

Dr. Kempson went on to comment, "Trust Stamp's IT2 algorithm is quantum-proof by design. If an enterprise or NGO is implementing or reviewing a biometric system today, it should actively look into the HNDL risk. It makes no sense to implement or maintain technology that will probably be unusable within the next decade, implicitly gambling on future solutions with unknown complexity and cost."

In 2016, the US National Institute of Standards and Technology (NIST) initiated a process to solicit, evaluate, and standardize new public-key algorithms that will be secure against a quantum computer (also known as post-quantum or quantum-resistant algorithms). After six years, four algorithms were recommended for standardization, with a further four candidates proposed for further consideration. Unfortunately, one of those candidates was broken within a few weeks of its release, reinforcing the difficult nature of this process.

Copies of the White Paper can be requested by emailing Andrew Gowasack, President of Trust Stamp, at: agowasack@truststamp.net

Enquiries

Trust Stamp Email: Shareholders@truststamp.ai

Andrew Gowasack, President

About Trust Stamp

Trust Stamp, the Privacy-First Identity Company™, is a global provider of AI-powered identity services for use in multiple sectors, including banking and finance, regulatory compliance, government, real estate, communications, and humanitarian services. Its technology empowers organizations with advanced biometric identity solutions that reduce fraud, protect personal data privacy, increase operational efficiency, and reach a broader base of users worldwide through its unique data transformation and comparison capabilities.

Located across North America, Europe, Asia, and Africa, Trust Stamp trades on the Nasdaq Capital Market (Nasdaq: IDAI). The company was founded in 2016 by Gareth Genner and Andrew Gowasack.

Safe Harbor Statement: Caution Concerning Forward-Looking Remarks

All statements in this release that are not based on historical fact are forward-looking statements, including within the meaning of the Private Securities Litigation Reform Act of 1995 and the provisions of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. The information in this announcement may contain forward-looking statements and information related to, among other things, the company, its business plan and strategy, and its industry. These statements reflect management's current views with respect to future events based on information currently available and are subject to risks and uncertainties that could cause the company's actual results to differ materially from those contained in the forward-looking statements. Investors are cautioned not to place undue reliance on these forward-looking statements, which speak only as of the date on which they are made. The company does not undertake any obligation to revise or update these forward-looking statements to reflect events or circumstances after such date or to reflect the occurrence of unanticipated events.

Follow this link:
Trust Stamp publishes a White Paper on the potential impact of Quantum Computing on legacy biometric systems - GlobeNewswire