Archive for the ‘Quantum Computer’ Category

How to introduce quantum computers without slowing economic … – Nature.com

The race is on to develop commercial quantum computers. The breakthroughs they promise (new ways of simulating materials, optimizing processes and improving machine learning) could transform society, just as today's digital computers have done. But the route to delivering economic benefits is uncertain. The digital revolution took decades and required businesses to replace expensive equipment and completely rethink how they operate. The quantum computing revolution could be much more painful [1].

Quantum computers operate in a completely different way from digital computers, and can potentially store and analyse information more efficiently. Digital computers essentially use on/off switches and process binary bits of information (0s and 1s). Quantum computers encode information in the quantum state of atoms, electrons and photons, known as qubits. These qubits can represent many states at once and be combined or entangled, thereby speeding up calculations.

In the long run, businesses adopting quantum computing should have a competitive edge over others. Yet, in the short term, it's unclear to what extent the introduction of these machines will prove commercially valuable.

When digital computers started to gain popularity in the 1970s and 1980s, rather than delivering efficiencies, for 15 years they slowed growth in productivity (the value added relative to inputs such as labour) by 0.76 percentage points per annum. Such a dip is known as the productivity paradox. It arose because businesses had to invest in new equipment and learn how to program the devices, as well as work out what to use them for. At first, firms did not invest enough in other innovations that were needed to change core processes and business models [2,3]. Only after many sectors had adjusted in the 1990s did productivity growth rise again, sharply (see 'Productivity paradox').

['Productivity paradox' chart. Source: The Conference Board Total Economy Database, 2022]

For example, it took a decade of investment, throughout the 1980s, for large firms, such as the retail corporation Walmart, to routinely process data to coordinate planning, and to forecast and replenish their inventory along their supply chains. Walmart gave suppliers access to its sales and inventory data, helping to reduce costs from underproduction or overproduction. The company became able to handle its own distribution and achieve efficiency through economies of scale. All these changes took time and required coordination across many firms [2].

We think that the quantum computing revolution could lead to an even more severe and expensive learning curve, for three reasons: high integration costs and few short-term rewards; difficulty in translating quantum concepts for business managers and engineers; and the threat to cryptography posed by quantum computers. As a consequence, assuming that the productivity growth rate slows by 50% more than it did for simpler digital computers, we estimate that the introduction of commercial quantum computers could result in economic losses in gross domestic product (GDP) per capita of approximately US$13,000 over 15 years (based on 2022 levels), or $310 billion per annum in the United States alone.

Fortunately, there are ways to lighten the load and accelerate the benefits to society, three of which we outline here.

Firms might initially adopt quantum computers to solve existing business problems, for which improvements are likely to be incremental. But for more-ambitious uses, the extra costs and likelihood of potential failures might make firms risk-averse. For example, a company that collects vast amounts of data from sensors to inform disaster relief and recovery might look to quantum computers to process information more quickly, to help save lives. But the first such computers might be more prone to faults and errors than are digital ones, with potentially grave consequences for life-critical operations. Such companies might therefore be put off from using quantum computers, until they are more reliable.

These computers will also need to be networked with digital computers, and integrating two such different technologies will be difficult and expensive. Firms will still need digital computers to perform everyday tasks and computations; they will use quantum computers to solve more-complex and specialist problems. Yet, developing hybrid protocols and programs that can work in both situations is much harder than it was to program digital computers in the 1970s.

Hybrid systems will need to be fluent in both digital bits and quantum qubits, and able to encode classical data into quantum states and vice versa. They will need converters to translate digital and analogue signals to transfer information between the two types of processing unit [4]. Quantum computers are generally large and might need to be cryogenically cooled, making it unlikely that many companies will have a machine of their own. Many will buy services remotely in the cloud through the Internet, for example sourcing extra computing power for simulating materials. Some users, such as traders in financial markets, in which millisecond timing is crucial, might need to host both types of computer.

A chip for quantum computing is tested with a laser at a laboratory of the manufacturing company Q.ANT in Stuttgart, Germany. Credit: Thomas Kienzle/AFP via Getty

To bring firms on board quickly, the commercial advantages will need to be demonstrated in practice. For this, government funding will be needed to attract private investment. We suggest this could be framed as a mission to help companies apply quantum computing to industrial and societal grand challenges. For example, for weather forecasting, quantum systems could analyse huge amounts of data to keep up with rapidly changing conditions. The resilience of the financial system could be improved through better modelling of markets, as could the development of low-carbon technologies to address climate change, such as catalysts for carbon capture or electrolytes for batteries.

Economists will need to devise a framework for evaluating the financial benefits of quantum computing, to encourage firms to invest. Researchers should build proof-of-concept cases, starting by identifying areas in which quantum computers might outperform digital computers for societal grand challenges. Researchers should also set out what firms need to do to adopt quantum technologies, including how they might need to change their business models and practices, as well as working with others along their value chains.

Quantum technologies operate on principles that are often counterintuitive and outside the comfort zone of many engineers and business managers. For example, these technologies work probabilistically and don't seem to obey classical conceptions of cause and effect. According to some schools of thought, in the quantum world, human agency might influence outcomes [5], meaning the person operating the computer might need to be considered as part of the system.

And, at present, there's no shared language among scientists, engineers and business managers around quantum computing. Misunderstandings and confusion create delays and therefore further costs. Managers and engineers will need to know enough to be able to select the right class of problems for quantum computers, know what type of information is required to solve them, and prepare data in a quantum-ready format (see go.nature.com/3opfsap).

For example, a delivery logistics company might wish to reschedule its vehicle routes more rapidly to respond better to customer demand for pickups of goods that need returning. Quantum computation could be effective for such replanning, which involves solving a complex combinatorial problem in which one change has a knock-on effect on other areas of the business, such as inventory management and financing. But managers would need to be able to spot areas of advantage such as this and know what to do to implement quantum computing solutions.


A common semantic and syntactic language for quantum computers needs to be developed. It should be similar to the standardized Unified Modeling Language used for digital computer programming: a visual language that helps software developers and engineers to build models to track the steps and actions involved in business processes. Such a tool reduces the costs of software development by making the process intuitive for business managers. Quantum computers also require algorithms and data structures, yet quantum information is much richer than classical information and more challenging to store, transmit and receive [6].

A quantum unified modelling language that is similar to the classical one but can also work with quantum information will enable scientists, engineers and managers to stay on the same page while they discuss prototypes, test beds, road maps, simulation models and hybrid information-technology architectures [7]. Design toolkits that consist of reusable templates and guidelines, containing standard modules for hardware and software development, will allow users to innovate for themselves, shortening development times.

Some of this is beginning to happen. For example, modular workflows are emerging that enable computational chemists and algorithm developers to customize and control chemistry experiments using early versions of quantum computing platforms. A more concerted approach to standardize the language across application areas and hardware platforms is needed to foster commercialization.

Strategies for communicating about quantum computing with the public are also needed, to build trust in these new technologies and ensure that benefits accrue to all parts of society in a responsible manner. Scientists, policymakers and communications specialists should work together to create narratives around the usefulness of quantum technologies. They should focus on practical problems that can be solved rather than tales of weird quantum behaviour.

Although some such initiatives are being set up as part of national quantum programmes, more research is needed to better understand how cognitive biases and ways of learning might influence the adoption of quantum computing. For example, how were cognitive barriers overcome in adopting digital computers and nanotechnologies? Answers to questions such as this will help researchers to develop communication protocols and toolkits.

Quantum computing threatens to break a widely used protocol for encrypting information. Today, sensitive data are typically encrypted using digital keys derived from large prime numbers, and sent through fibre-optic cables and other channels as classical bits: streams of electrical and optical pulses representing 1s and 0s. The encryption relies on the inability of classical computers to find the prime factors of very large numbers in a reasonable time. However, quantum computers could in principle work out these factors much faster and therefore break the encryption.
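To make the factoring threat concrete, here is a minimal, purely illustrative Python sketch using tiny toy primes (real keys use primes hundreds of digits long). It shows that anyone able to factor the public modulus can immediately reconstruct the private key; the security rests entirely on factoring being infeasible for classical machines at realistic key sizes.

```python
import math

# Toy RSA-style key generation (illustrative only; real keys use ~2048-bit moduli).
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus
e = 17                        # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)

def factor(n):
    """Trial division: trivial for toy n, infeasible classically for large n."""
    for f in range(2, math.isqrt(n) + 1):
        if n % f == 0:
            return f, n // f
    raise ValueError("no factor found")

# An attacker who can factor n recovers an equivalent private key directly.
p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
print(pow(ciphertext, d2, n))  # 42 -- plaintext recovered without the private key
```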


Addressing this risk will bring further costs. To protect the security of data and communications, firms will need to invest in new mathematical approaches for encryption, or use quantum-based communications systems, such as quantum key distribution. Quantum key distribution uses qubits, sent either through fibre-optic cables or free space (through air, vacuum or outer space), to randomize the generation of keys between the sender and receiver using the probabilistic principles of quantum mechanics. Because of the fragile nature of qubits, if a hacker tries to observe them in transit, the quantum state is disturbed and the sender and receiver will know that the key was tampered with.
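A minimal classical simulation of a BB84-style exchange (an illustrative sketch of the general idea, not a protocol specification) shows why eavesdropping is detectable: measuring a qubit in the wrong basis randomizes it, so an interceptor corrupts roughly a quarter of the sifted key.

```python
import random

def bb84_sift(n_qubits=2000, eavesdrop=False, seed=1):
    """Toy BB84 simulation: returns the error rate Alice and Bob observe."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_qubits)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            e_basis = rng.randint(0, 1)
            # Measuring in the wrong basis randomizes the qubit state.
            bit = bit if e_basis == a_basis else rng.randint(0, 1)
            a_basis = e_basis  # the re-sent qubit is now prepared in Eve's basis
        # Bob: correct basis keeps the bit, wrong basis gives a random result.
        bob_bits.append(bit if b_basis == a_basis else rng.randint(0, 1))

    # Keep only positions where Alice's and Bob's bases matched (the sifted key).
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted)
    return errors / len(sifted)

print(f"error rate without eavesdropper: {bb84_sift():.3f}")                # ~0.00
print(f"error rate with eavesdropper:    {bb84_sift(eavesdrop=True):.3f}")  # ~0.25
```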

Such a threat to sensitive government data and communications [8] could also raise geopolitical issues and lead to export controls, such as those imposed by the United States and the Netherlands on microprocessors. The technology bottlenecks for quantum computing are unclear because there are several types of machine that rely on different components and therefore different supply chains. Such restrictions could stifle innovation, increase costs and disrupt the global nature of design, testing and manufacturing processes. Limited exchange of ideas and access to new prototypes would influence the eventual nature of commercial systems and supply chains, as they did for early video cassette recorders reliant on formats such as Betamax and VHS.

Integrating quantum computers and quantum communications technologies across a coordinated network to build a quantum internet [9] could overcome this security threat and spur growth across many industries, as the creation of the Internet did. The quantum internet is a network that connects remote quantum devices through a combination of quantum and classical links. This allows distributed quantum computing, in which many devices work together to solve problems, further speeding up computations.

Office workers using computers and telephone headsets in 1965. Credit: Authenticated News/Archive Photos/Getty

The quantum internet could also enable new business models. For example, distributed quantum computers and a process known as blind quantum computing [10], which allows fully private computation, could enhance machine learning while preserving proprietary data and guaranteeing that shared data are deleted after computation. Blind quantum computing would, for example, enable data or code from 3D-printing machines at a factory owned by one firm to be shared with machines at another firm's factory without either firm seeing the details of the other's processes. This would allow the creation and optimization of networks of factories owned by various firms to better cater for changes in product volume. Companies could offer unused 3D-printing production capacity to others, to increase efficiencies, localize production and add flexibility to supply chains.

Researchers need to determine the benefits to customers and firms of sharing data and information with faster computation, enhanced privacy and confidentiality. Would these benefits lead to more products and services that are better tailored to customer needs? What would the impacts be on the wider industrial landscape, and what new business models might emerge?

The promise of quantum computing is great if researchers can help to smooth the path for its implementation.


NREL & Atom Computing Link Quantum Computer to Grid With … – Executive Gov

The Department of Energys National Renewable Energy Laboratory and Atom Computing have developed an open-source application that can reportedly serve as an interface between quantum computers and power grid equipment and enable researchers to conduct quantum-in-the-loop experiments.

A team of researchers demonstrated the app using Atom Computing's quantum computing solution stack and real-time grid simulators from RTDS Technologies, and were able to integrate a quantum computing system with an electric grid research platform, NREL said Monday.

"To assess the security of next-generation communication protocols and validate current and future quantum algorithms, it is critical to establish a real-world emulation environment with actual hardware and high-speed communication," said Rob Hovsapian, a research adviser at NREL's Advanced Research on Integrated Energy Systems (ARIES).

"This is precisely what we have developed at ARIES with quantum-in-the-loop," added Hovsapian.

The interface works by simplifying the translation of optimization problems into quantum variables and facilitating communications between power system simulations and quantum computers.
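NREL's announcement does not spell out the interface's internals, so as a rough illustration of what "translating an optimization problem into quantum variables" usually means in practice, the sketch below (our own toy example, not NREL's or Atom Computing's code) encodes a simple generator-dispatch choice as a QUBO dictionary, the kind of binary formulation accepted by many annealing-style quantum and hybrid solvers.

```python
from itertools import product

# Illustrative only: a toy "which generators to dispatch" problem encoded as a QUBO.
# Binary variable x_i = 1 means generator i is switched on. All numbers are made up.
capacity = [30, 45, 25, 60]      # MW available from each generator
cost     = [2.0, 3.5, 1.5, 4.0]  # relative running cost per generator
demand   = 85                    # MW target
penalty  = 10.0                  # weight of the demand-matching penalty term

# Objective: sum_i cost_i*x_i + penalty*(sum_i capacity_i*x_i - demand)^2.
# Expanding the square gives linear (diagonal) and quadratic (off-diagonal) QUBO terms.
Q = {}
for i, (c_i, cap_i) in enumerate(zip(cost, capacity)):
    # x_i^2 == x_i for binaries, so squared terms fold into the diagonal.
    Q[(i, i)] = c_i + penalty * (cap_i**2 - 2 * demand * cap_i)
    for j in range(i + 1, len(capacity)):
        Q[(i, j)] = penalty * 2 * cap_i * capacity[j]

def qubo_energy(x):
    return sum(v * x[i] * x[j] for (i, j), v in Q.items())

# Brute-force check of the encoding (fine for 4 variables; a quantum or hybrid
# sampler would be used for realistically sized grid problems).
best = min(product([0, 1], repeat=len(capacity)), key=qubo_energy)
print("best on/off pattern:", best)  # (0, 0, 1, 1): generators 3 and 4 meet 85 MW exactly
```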

The research team expects the software interface to help scientists determine problems that could be addressed by quantum computers and assess them through live experiments.


The 3 Most Undervalued Quantum Computing Stocks to Buy Now … – InvestorPlace

Quantum computing emerges as a pivotal frontier as the tech landscape continually evolves. This article focuses on three undervalued quantum computing stocks, each with growth potential. Despite their undervaluation, these trailblazers are making significant strides in quantum technology, setting the stage for a potential surge in stock prices.

Moving forward, we'll delve deeper into these undervalued quantum computing stocks. Each company boasts a unique narrative within the quantum computing sector, armed with distinct strategies and offerings. We aim to provide a brief investment thesis for each, equipping you with the insights needed for informed decision-making. So, stay with us as we unravel the promising potential of these stocks in the thrilling world of quantum computing.


IonQ's (NYSE:IONQ) strong growth and partnerships with major tech companies like Amazon (NASDAQ:AMZN) make it an attractive investment. Despite recent volatility, the company's positive outlook and increasing bookings suggest a promising future.

The company announced the availability of its quantum computer, IonQ Aria, on Amazon Web Services (AWS) in May this year. This addition to AWS's quantum computing service, Amazon Braket, expands IonQ's existing presence on the platform following the debut of IonQ's Harmony system in 2020. IonQ Aria, with its 25 algorithmic qubits, allows users to run more complex quantum algorithms to tackle challenging problems.

The expansion of IonQ's ecosystem through this partnership makes it an undervalued quantum computing stock to consider. IONQ stock's performance also makes it a momentum play. It's up over 330% year-to-date, and its sales grew 115% quarter-over-quarter.


Microsoft (NASDAQ:MSFT) has been doing well, but I would still rank it as one of the undervalued quantum computing stocks. This is primarily due to its competitive positioning and how it harnesses quantum technology. Simply put, its application of topological qubits is seen as a high-risk, high-reward venture. At the same time, other companies like IonQ build less experimental but perhaps less effective quantum systems.

It's theorized by some that Microsoft's quantum approach will lead to lower error rates and, ultimately, a faster time to market for a commercial quantum computer. It should be noted this is firmly in the realm of speculation, as MSFT's approach is yet to be proven definitively. But it has made significant headway in its R&D efforts, which suggests it's firmly on the right track.

MSFT is also benefiting from the rise of generative AI with its subscription service for its Office suite of products. This led to the company reaching a new all-time high in July.


Quantum Computing Inc (NASDAQ:QUBT) offers a unique opportunity to invest in a smaller, niche player in the quantum computing field. The company's focus on hardware and software development could position it well for future growth in the quantum computing market.

QUBT is a penny stock with a market cap of $90 million. Still, it has big plans for the future. Its flagship product is the Reservoir Computer, a compact hardware device designed to make neuromorphic hardware accessible and affordable. Neuromorphic computing differs from quantum computing in that it attempts to mirror how neurons and synapses work in a human brain. The advantage is that it lets AI models learn in parallel, as opposed to sequentially in traditional computing, thus allowing them to perform better at tasks such as pattern recognition.
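Quantum Computing Inc's product internals aren't given here, so the following is a generic, classical echo-state-network toy (a common software stand-in for reservoir computing, not QUBT's Reservoir Computer): a fixed, randomly connected "reservoir" transforms the input signal, and only a simple linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: predict sin(t + 0.2) from sin(t) using a fixed random reservoir.
t = np.linspace(0, 20, 1000)
u = np.sin(t)                   # input signal
y_target = np.sin(t + 0.2)      # desired output

n_res = 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))   # fixed input weights (never trained)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))  # fixed recurrent weights (never trained)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Run the reservoir: the recurrent state update is fixed, not learned.
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for k, u_k in enumerate(u):
    x = np.tanh(W_in[:, 0] * u_k + W @ x)
    states[k] = x

# Train only the linear readout (ridge regression), skipping an initial washout.
washout = 100
S, Y = states[washout:], y_target[washout:]
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y)

pred = states @ W_out
print("readout RMSE:", np.sqrt(np.mean((pred[washout:] - Y) ** 2)))
```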

We may see an arms race between neuromorphic computing and quantum as the de facto standard. In a small way, this could be compared to the race between HD DVDs and Blu-ray discs. Both situations involve competing technologies vying to become the dominant standard in their respective fields. Although quantum and neuromorphic hardware technologies are not direct competitors, from an investor's point of view, it may be worth diversifying into both forms of tech as we don't know which will come out on top until later.

On the date of publication, Matthew Farley did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Matthew started writing coverage of the financial markets during the crypto boom of 2017 and was also a team member of several fintech startups. He then started writing about Australian and U.S. equities for various publications. His work has appeared in MarketBeat, FXStreet, Cryptoslate, Seeking Alpha, and the New Scientist magazine, among others.


Intel’s Tunnel Falls Quantum Chip Will Help Researchers Advance … – Sourceability

Intel just released its first quantum spin qubit chip, Tunnel Falls. For now, it's only available to researchers, who will use the hardware to jumpstart experiments and new studies to advance the quantum computing industry.

Quantum computing has come a long way in the past decade. Today, we are closer to practical quantum hardware than ever before. Yet this tremendous prospect remains out of reach, and experts don't know how long it will take to become a reality.

That hasnt stopped startups, tech giants, and chipmakers alike from throwing their hats into the quantum computing ring. Intel is the latest to take a step forward, recently announcing a new quantum chip dubbed Tunnel Falls.

The 12-qubit silicon chip uses spin qubits to perform complex computing functions. However, it won't be available commercially. Intel plans to ship Tunnel Falls chips to leading researchers and academic institutions in hopes the collaboration will advance quantum spin qubit research.

Given that quantum computing is still in its relative infancy, academic institutions don't have the fabrication equipment necessary to produce quantum chips at scale. Intel, as one of the world's largest semiconductor manufacturers, does.

Putting Tunnel Falls chips directly into the hands of researchers means they can start experimenting and working on new research projects immediately. Intel hopes this will open a wide range of experiments and lead to new discoveries about the fundamentals of qubits and quantum dots.

Intel's director of quantum hardware, Jim Clarke, said in a statement, "Tunnel Falls is Intel's most advanced silicon spin qubit chip to date and draws upon the company's decades of transistor design and manufacturing expertise. The release of the new chip is the next step in Intel's long-term strategy to build a full-stack commercial quantum computing system."

"While there are still fundamental questions and challenges that must be solved along the path to a fault-tolerant quantum computer, the academic community can now explore this technology and accelerate research development," he adds.

The first institutions to receive Tunnel Falls silicon include the University of Maryland's Laboratory for Physical Sciences (LPS), Sandia National Laboratories, the University of Rochester, and the University of Wisconsin-Madison. Intel is also collaborating with the Qubits for Computing Foundry (QCF) program through the U.S. Army Research Office.

With this initiative, Intel aims to democratize silicon spin qubits by enabling researchers to gain hands-on experience working with scaled arrays of these qubits.

Notably, information gathered from experiments and research at partner institutions will be shared publicly. So while sharing Tunnel Falls chips is an effort to help Intel advance its quantum silicon ambitions, it is also a source of learning for the wider quantum community.

Tunnel Falls is Intel's first spin qubit device being released to the research community. It comes after nearly a decade of research from Intel Labs. The chip is fabricated on 300-millimeter wafers in the company's D1 fabrication facility using Intel's advanced manufacturing capabilities.

But what is a spin qubit? Rather than encoding data in traditional binary 1s and 0s, spin qubits encode information in the spin (up/down) of a single electron. Intel likens each qubit device to a single electron transistor.
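As a rough, textbook-level illustration of the spin-up/spin-down encoding (not a description of Intel's device physics), a single spin qubit can be written as a two-component state vector whose squared amplitudes give the measurement probabilities:

```python
import numpy as np

# |psi> = alpha|up> + beta|down>, with |alpha|^2 + |beta|^2 = 1.
up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)

# Example superposition: equal amplitudes with a relative phase.
psi = (up + 1j * down) / np.sqrt(2)

p_up = abs(np.vdot(up, psi)) ** 2      # probability of measuring spin up
p_down = abs(np.vdot(down, psi)) ** 2  # probability of measuring spin down
print(p_up, p_down)                    # 0.5 0.5

# A classical bit, by contrast, is always exactly |up> (1) or |down> (0).
```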

Notably, this design has similar fabrication requirements as standard complementary metal oxide semiconductors (CMOS). This allows Intel to leverage innovative process control techniques to enable yield and performance. The Tunnel Falls chip has a 95% yield rate across the wafer, which provides over 24,000 quantum dot devices.

In a press release, the company says it believes "silicon may be the platform with the greatest potential to deliver scaled-up quantum computing."

Intel believes spin qubits are the superior form of qubit technology thanks to the synergy they have with traditional cutting-edge transistors. This approach also comes with a size advantage, making each Intel qubit about one million times smaller than other qubit designs. For commercial quantum computers, millions of qubits will be needed. But Intel believes spin qubits make this possible since they can be packed into chips that resemble a CPU.

Despite the important advances happening across the quantum field, no one really knows how this technology will pan out commercially. Even so, Intel is already working on its next-generation Tunnel Falls chip. The company plans to release it as soon as 2024.

In the meantime, it will work on integrating Tunnel Falls into its full quantum stack. The Intel Quantum Software Development Kit (SDK) will also play an important role. A functional tech stack to support its quantum hardware is an enticing way for Intel to sway prospective buyers in its direction.

Ultimately, however, the commercial application isn't the most exciting part about Tunnel Falls. Putting powerful quantum hardware into the hands of researchers will increase the pace of discoveries in the quantum field and lead to new advancements in the coming years. With the industry working together, quantum computing inches ever closer to practicality.


Hybrid approach for solving real-world bin packing problem … – Nature.com

In this section, we describe in detail the mathematical formulation of the 3dBPP variant tackled in this research. First, the input parameters and variables that compose the problem are shown in Table 1.

The 3dBPP can be solved as an optimization problem in which a suitable cost function to minimize must be defined. In our case, this cost function is the sum of three objectives. The strength given to each objective, i.e. the relevance assigned to each one, is left to the user's preference and is set by multiplying each objective by a suitable weight. Thus, the problem can be stated as \(\min \sum_{i=1}^{3}\omega_i o_i\), with \(\omega_i\) the weight of each objective \(o_i\). In our study, we do not consider this bias, i.e. \(\omega_i=1\ \forall i\).

The first and main objective minimizes the total number of bins used to locate the packages. This can be achieved by minimizing

$$o_1 = \sum_{j=1}^{n} v_j. \qquad (1)$$

Additionally, to ensure that items are packed from the floor to the top of the bin, avoiding solutions with floating packages, a second objective is defined by minimizing the average height of the items over all bins

$$o_2 = \frac{1}{mH}\sum_{i=1}^{m}\left( z_i + z'_i\right). \qquad (2)$$

Besides these two objectives, reformulated from the reference code [28], we further add a third optional objective \(o_3\) to take into account the load-balancing feature. This concern is particularly important when air cargo planes and sailings are the chosen conveyance [30,31], for example. In those situations, packages should be uniformly distributed around a given xy-coordinate inside the bin. We can tackle this by computing the so-called taxicab or Manhattan distance between the items and the desired center of mass of each bin. As a result, the gaps between items are also reduced. Accordingly, the third objective to be minimized is

$$o_3 = \frac{1}{m}\left( \frac{1}{L}\sum_{i=1}^{m} \tilde{x}_i + \frac{1}{W}\sum_{i=1}^{m} \tilde{y}_i\right), \qquad (3)$$

with

$$\tilde{x}_i := \left| \left( x_i + \frac{x_i'}{2}\right) \bmod L - \tilde{L} \right| \quad \text{and} \quad \tilde{y}_i := \left| y_i + \frac{y_i'}{2} - \tilde{W} \right| \quad \forall i\in I, \qquad (4)$$

where \(0\le x_i< nL\) (bins stacked horizontally) and \(0\le y_i< W\) \(\forall i\in I\). This objective term minimizes, for each item, the distance between the projection of its center of mass onto the xy-plane and the \((\tilde{L},\tilde{W})\) coordinate of each bin.
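To make the three objectives concrete, the following sketch (our own illustrative code, not the authors' implementation; the argument names are our assumption, not necessarily those of Table 1) evaluates \(o_1\), \(o_2\) and \(o_3\) for a candidate packing.

```python
def objectives(v, x, y, z, zp, xp, yp, L, W, H, L_t, W_t):
    """Evaluate o1, o2, o3 of Eqs. (1)-(3) for a candidate packing (illustrative)."""
    m = len(x)  # number of items

    o1 = sum(v)                                         # Eq. (1): bins in use
    o2 = sum(z[i] + zp[i] for i in range(m)) / (m * H)  # Eq. (2): average item height

    # Eqs. (3)-(4): Manhattan distance to the target point (L_t, W_t) of each bin,
    # with bins stacked along the x axis, hence the modulo L on the x coordinate.
    xt = [abs((x[i] + xp[i] / 2) % L - L_t) for i in range(m)]
    yt = [abs(y[i] + yp[i] / 2 - W_t) for i in range(m)]
    o3 = (sum(xt) / L + sum(yt) / W) / m

    return o1, o2, o3

# Two unit cubes on the floor of the first of two 10x10x10 bins.
print(objectives(v=[1, 0], x=[0, 1], y=[0, 0], z=[0, 0],
                 zp=[1, 1], xp=[1, 1], yp=[1, 1],
                 L=10, W=10, H=10, L_t=5, W_t=5))  # (1, 0.1, 0.85)
```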

The objectives defined above are subject to certain restrictions, which are essential to derive realistic solutions. The whole pool of constraints is separated into two categories: the ones intrinsic to the BPP definition (intrinsic restrictions), and the ones relevant from a real-world perspective (real-world BPP restrictions).

Item orientations: the requirement that, inside a bin, each item takes exactly one orientation is implemented using

$$\sum_{k\in K_i} r_{i,k}=1 \quad \forall i\in I. \qquad (5)$$

Set of possible orientations \(k\in K_i\) for a given item i of dimensions \((l_i,w_i,h_i)\). (a) \(k = 1\), (b) \(k = 2\), (c) \(k = 3\), (d) \(k = 4\), (e) \(k = 5\), (f) \(k = 6\). See Table 2.

Orientations give rise to the effective length, width, and height of the items along x, y and z axes

$$x'_i = l_i r_{i,1} + l_i r_{i,2} + w_i r_{i,3} + w_i r_{i,4} + h_i r_{i,5} + h_i r_{i,6} \quad \forall i\in I, \qquad (6)$$

$$y'_i = w_i r_{i,1} + h_i r_{i,2} + l_i r_{i,3} + h_i r_{i,4} + l_i r_{i,5} + w_i r_{i,6} \quad \forall i\in I, \qquad (7)$$

$$z'_i = h_i r_{i,1} + w_i r_{i,2} + h_i r_{i,3} + l_i r_{i,4} + w_i r_{i,5} + l_i r_{i,6} \quad \forall i\in I, \qquad (8)$$

and because of (5), only one term \(r_{i,k}\) is nonzero in each equation.

It should be noted that there could be items with geometrical symmetries, such as cubic ones, for which rotations do not apply. Redundant and non-redundant orientations are considered in the reference code [28]. In our formulation, we check in advance whether these symmetries exist in order to define \(K_i\) for each item. Thanks to this, (6)-(8) are simplified by filtering out redundant orientations, leading to a formulation that uses fewer variables (and thus qubits) to represent the same problem, where \(\kappa =\sum_{i=1}^{m}|K_i|\le 6m\) variables \(r_{i,k}\) are needed. For \(i\in I_\text{c}\) with \(I_\text{c} := \{i\in I \,|\, l_i=w_i=h_i\}\) (cubic items), we can set \(r_{i,1}=1\) and 0 otherwise, thus satisfying (5) in advance. In Table 2, we show the non-redundant orientation sets for an item i depending on its dimensions. This simple mechanism reduces the complexity of the problem, which is favourable for implementation on quantum hardware.
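A small sketch of the orientation bookkeeping described above (an illustration under our own naming, not the reference code): it builds the non-redundant orientation set \(K_i\) from the item dimensions and returns the effective dimensions \((x'_i, y'_i, z'_i)\) of Eqs. (6)-(8) for each orientation.

```python
# The six orientations of Eqs. (6)-(8): k -> (x', y', z') as a permutation of (l, w, h).
ORIENTATIONS = {
    1: lambda l, w, h: (l, w, h),
    2: lambda l, w, h: (l, h, w),
    3: lambda l, w, h: (w, l, h),
    4: lambda l, w, h: (w, h, l),
    5: lambda l, w, h: (h, l, w),
    6: lambda l, w, h: (h, w, l),
}

def non_redundant_orientations(l, w, h):
    """Keep one k per distinct effective-dimension triple (the set K_i of the text)."""
    seen, K = set(), []
    for k in sorted(ORIENTATIONS):
        dims = ORIENTATIONS[k](l, w, h)
        if dims not in seen:
            seen.add(dims)
            K.append(k)
    return K

print(non_redundant_orientations(2, 3, 4))  # [1, 2, 3, 4, 5, 6]: all six are distinct
print(non_redundant_orientations(2, 2, 4))  # [1, 2, 5]: two square faces -> 3 orientations
print(non_redundant_orientations(2, 2, 2))  # [1]: cubic item, only k = 1 is kept
```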

Non-overlapping restrictions: since we are considering rigid packages, i.e. they cannot overlap, a set of restrictions needs to be defined to forbid such configurations. For this purpose, at least one of the following situations must occur (see Fig. 2):

Item i is at the left of item k \((q=1)\):

$$-(2 - u_{i,j}u_{k,j}-b_{i,k,1})\,nL + x_i + x'_i - x_k \le 0 \quad \forall i,k\in I,\ \forall j\in J, \qquad (9)$$

Item i is behind item k \((q=2)\):

$$-(2 - u_{i,j}u_{k,j}-b_{i,k,2})\,W + y_i + y'_i - y_k \le 0 \quad \forall i,k\in I,\ \forall j\in J, \qquad (10)$$

Item i is below item k \((q=3)\):

$$-(2 - u_{i,j}u_{k,j}-b_{i,k,3})\,H + z_i + z'_i - z_k \le 0 \quad \forall i,k\in I,\ \forall j\in J, \qquad (11)$$

Item i is at the right of item k \((q=4)\):

$$-(2 - u_{i,j}u_{k,j}-b_{i,k,4})\,nL + x_k + x'_k - x_i \le 0 \quad \forall i,k\in I,\ \forall j\in J, \qquad (12)$$

Item i is in front of item k \((q=5)\):

$$-(2 - u_{i,j}u_{k,j}-b_{i,k,5})\,W + y_k + y'_k - y_i \le 0 \quad \forall i,k\in I,\ \forall j\in J, \qquad (13)$$

Item i is above item k \((q=6)\):

$$-(2 - u_{i,j}u_{k,j}-b_{i,k,6})\,H + z_k + z'_k - z_i \le 0 \quad \forall i,k\in I,\ \forall j\in J. \qquad (14)$$

As discussed for the orientation variable \(r_{i,k}\) in (5), the relative position between items i and k must be unique, so

$$\sum_{q\in Q} b_{i,k,q}=1 \quad \forall i,k\in I. \qquad (15)$$

Representation of \(b_{i,k,q}\) activated for all relative positions \(q\in Q\) between items i and k. See (9)-(14). The items are drawn in contact, but contact is not mandatory. (a) \(b_{i,k,1}=1\), (b) \(b_{i,k,2}=1\), (c) \(b_{i,k,3}=1\), (d) \(b_{i,k,4}=1\), (e) \(b_{i,k,5}=1\), (f) \(b_{i,k,6}=1\).
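As a plain geometric reading of constraints (9)-(15) (a hedged sketch under our own naming), two items placed in the same bin are non-overlapping exactly when at least one of the six relative positions can hold:

```python
def relative_positions(item_i, item_k):
    """Return the set of relative positions q in {1..6} (Eqs. (9)-(14)) that hold
    between two placed items, each given as (x, y, z, x', y', z')."""
    xi, yi, zi, xpi, ypi, zpi = item_i
    xk, yk, zk, xpk, ypk, zpk = item_k
    q = set()
    if xi + xpi <= xk: q.add(1)   # i left of k
    if yi + ypi <= yk: q.add(2)   # i behind k
    if zi + zpi <= zk: q.add(3)   # i below k
    if xk + xpk <= xi: q.add(4)   # i right of k
    if yk + ypk <= yi: q.add(5)   # i in front of k
    if zk + zpk <= zi: q.add(6)   # i above k
    return q

def overlap(item_i, item_k):
    """Items in the same bin overlap iff no relative position can be activated."""
    return not relative_positions(item_i, item_k)

a = (0, 0, 0, 2, 2, 2)   # 2x2x2 box at the origin
b = (2, 0, 0, 2, 2, 2)   # touching box to its right
c = (1, 1, 1, 2, 2, 2)   # box intersecting a
print(relative_positions(a, b), overlap(a, b))  # {1} False
print(overlap(a, c))                            # True
```

When several positions hold (e.g. a diagonal placement), constraint (15) simply forces the solver to activate exactly one of them.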

Item and container allocation restrictions: the following set of restrictions guarantees an appropriate behaviour during item and bin assignment. In order to avoid packing duplicates of the same item, each item must go to exactly one bin, where

$$\sum_{j=1}^{n} u_{i,j}=1 \quad \forall i\in I. \qquad (16)$$

The following formula verifies if items are being packed inside bins that are already in use

$$\sum_{i=1}^{m}(1-v_j)\,u_{i,j}\le 0 \quad \forall j\in J, \qquad (17)$$

so it activates \(v_j\) if needed during packing. Bins can be activated sequentially to avoid duplicated solutions, ensuring that

$$v_j \ge v_{j+1} \quad \forall j\in J \ |\ j\ne n. \qquad (18)$$

Bin boundary constraints: in order to respect the bin boundaries, the following set of restrictions must be met

$$x_i + x'_i - jL \le (1-u_{i,j})\,nL \quad \forall i\in I,\ \forall j\in J, \qquad (19)$$

$$x_i - (j-1)L\,u_{i,j} \ge 0 \quad \forall i\in I,\ \forall j\in J \ |\ j>1, \qquad (20)$$

$$y_i + y'_i - W \le (1-u_{i,j})\,W \quad \forall i\in I,\ \forall j\in J, \qquad (21)$$

$$z_i + z'_i - H \le (1-u_{i,j})\,H \quad \forall i\in I,\ \forall j\in J, \qquad (22)$$

where (19) guarantees that an item i placed inside bin j is not outside of the last bin (the n-th bin) along the x axis, (20) ensures that item i is located inside its corresponding bin j along the x axis (activated if \(n>1\)), (21) confirms that item i placed inside bin j is not outside along the y axis, while (22) ensures that item i allocated inside bin j is not outside along the z axis.
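The same constraints can be read as a simple feasibility check. The sketch below (illustrative only, assuming the notation above with bins stacked along the x axis) verifies Eqs. (19)-(22) for one item assigned to bin j, i.e. with \(u_{i,j}=1\):

```python
def inside_assigned_bin(x, y, z, xp, yp, zp, j, L, W, H):
    """Check Eqs. (19)-(22) for an item at (x, y, z) with effective dimensions
    (xp, yp, zp), assigned to bin j (u_{i,j} = 1), bins stacked along x."""
    ok_x_upper = x + xp <= j * L      # Eq. (19): not past the right wall of bin j
    ok_x_lower = x >= (j - 1) * L     # Eq. (20): not before the left wall of bin j
    ok_y = y + yp <= W                # Eq. (21): within the bin depth
    ok_z = z + zp <= H                # Eq. (22): within the bin height
    return ok_x_upper and ok_x_lower and ok_y and ok_z

# A 4x4x4 item placed at x = 11 must belong to bin 2 when L = W = H = 10.
print(inside_assigned_bin(11, 0, 0, 4, 4, 4, j=2, L=10, W=10, H=10))  # True
print(inside_assigned_bin(11, 0, 0, 4, 4, 4, j=1, L=10, W=10, H=10))  # False
```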

In this subsection we introduce the restrictions related to the operational perspective of the problem, i.e. the ones that consider real-world industrial situations. All of the following constraints are optional in our formulation.

Overweight restriction: the weight of each package and the maximum capacity of the containers are common contextual data used to avoid exceeding the maximum weight capacity of the bins, i.e. to avoid overloaded containers. We can introduce this restriction as

$$\sum_{i=1}^{m}\mu_i u_{i,j}\le M \quad \forall j\in J. \qquad (23)$$

This restriction is activated if the maximum capacity M is given.

Affinities among package categories: there are commonly preferences for separating some packages into different bins (negative affinities or incompatibilities) or, on the contrary, gathering them into the same container (positive affinities). Let us consider \(I_\alpha := \{i\in I\ |\ \texttt{id} \text{ of } i \text{ is equal to } \alpha\}\), i.e. \(I_\alpha \subset I\) is the subset of all items labelled with id equal to \(\alpha\). Given a set of p negative affinities \(A^\text{neg} := \{(\alpha_1,\beta_1),\dots,(\alpha_p,\beta_p)\}\), the restriction will be

$$\sum_{(\alpha,\beta)\in A^\text{neg}}\;\sum_{(i_\alpha,i_\beta)\in I_\alpha\times I_\beta}\;\sum_{j=1}^{n} u_{i_\alpha,j}\,u_{i_\beta,j}=0. \qquad (24)$$

To activate this restriction, a set of incompatibilities must be given. Moreover, we can satisfy in advance \(\nu := 6n\sum_{(\alpha,\beta)\in A^\text{neg}}|I_\alpha||I_\beta|\) non-overlapping constraints (see (9)-(14)), leading to a simpler formulation. Conversely, given a set of positive affinities \(A^\text{pos}\), defined analogously to \(A^\text{neg}\), the restriction will be posed such that

$$\sum_{(\alpha,\beta)\in A^\text{pos}}\;\sum_{(i_\alpha,i_\beta)\in I_\alpha\times I_\beta}\;\sum_{j=1}^{n}\left( 1-u_{i_\alpha,j}\,u_{i_\beta,j}\right)=0. \qquad (25)$$

This restriction is activated if a set of positive affinities is given. If both \(A^\text{pos}\) and \(A^\text{neg}\) are given, then the two restrictions can be introduced using just one formula by adding (24) and (25).

Preferences in relative positioning: relative positioning of items demands that some of them must be placed in a specific position with respect to other items. This preference allows introducing an ordering of a set of packages according to their positions along the axes. Thus, it assists in ordering for many real cases, such as parcel delivery (an item i that has to be delivered before item k will preferably be placed closer to the trunk door) or load bearing (no heavy package should rest on flimsy packages), among others.

Regarding this preference, we can define two different perspectives to treat relative positioning:

Positioning to avoid \((P_q^{-})\): the listed item pairs (i,k) should not be in the specified relative position \(q\in Q\). So, \(b_{i,k,q}=0\) is expected, favouring configurations where the solver selects \(q'\in Q\) with \(q'\ne q\) for the relative positioning of items (i,k).

Positioning to favour \((P_q^{+})\): the listed item pairs (i,k) should be in a certain relative position q. When this preference is activated, \(b_{i,k,q}=1\) ought to hold and, consequently, \(b_{i,k,q'}=0\ \forall q'\ne q\).

Formally, these preferences are written as

$$P_q^{-} := \{(i,k)\in I^2\ |\ i \qquad (26)$$

These preferences could also be treated as compulsory pre-selections. In such a case, the number of variables needed would be reduced, and so would the search space. If we let \(p^{-}=\sum_{q\in Q}|P_q^{-}|\) and \(p^{+}=\sum_{q\in Q}|P_q^{+}|\) with \(P^{-}_q\cap P^{+}_{q'}=\varnothing\), based on (15), the number of variables removed would be given by \(p^{-}+6p^{+}\). Moreover, \(n(p^{-}+5p^{+})\) non-overlapping constraints (see (9)-(14)) are satisfied directly and can be ignored, thus simplifying the problem. In this paper, for the sake of clarity, these preferences have been applied for load-bearing purposes as hard constraints (HC), as explained in the upcoming Experimental results.

Load balancing: to activate this restriction, a target center of mass must be given. Global positions with respect to the bin as a whole (as described in objective \(o_3\) in (3)) are fixed using the following constraints

$$\pm \frac{1}{n}\sum_{j=1}^{n}\left[ x_i+\frac{x_i'}{2} - n(j-1)u_{i,j}L - \tilde{L}\right] \le \tilde{x}_i \quad \text{and}\quad \pm\left( y_i+\frac{y_i'}{2} - \tilde{W}\right) \le \tilde{y}_i \quad \forall i\in I. \qquad (27)$$

This feature is represented in Fig. 3 for \((\tilde{L},\tilde{W})=(L/2,W/2)\), whose red line shows the available \(\tilde{x}_i\) and \(\tilde{y}_i\) values (see (4)).

Representation of the available \(\tilde{x}_i\) and \(\tilde{y}_i\) values ensured by the constraints given in (27) for \((\tilde{L},\tilde{W}) = (L/2,W/2)\).

Regarding the complexity of the 3dBPP proposed in this research, the total number of variables needed to tackle an arbitrary instance is given in Table 3, where our formulation scales as \(\mathscr{O}[m^2+nm]\) in terms of variables. Additionally, the total number of constraints required is provided in Table 4, and this quantity also grows quadratically as \(\mathscr{O}[m^2+nm]\).
