Archive for June, 2023

The 5 Most Promising AI Hardware Technologies – MUO – MakeUseOf

Artificial Intelligence (AI) has made remarkable advancements since the end of 2022. Increasingly sophisticated AI-based software applications are revolutionizing various sectors by providing inventive solutions. From seamless customer service chatbots to stunning visual generators, AI is enhancing our daily experiences. However, behind the scenes, AI hardware is pivotal in fueling these intelligent systems.

AI hardware refers to specialized computer hardware designed to perform AI-related tasks efficiently. This includes specific chips and integrated circuits that offer faster processing and energy-saving capabilities. In addition, they provide the necessary infrastructure to execute AI algorithms and models effectively.

The role of AI hardware in machine learning is crucial as it aids in the execution of complex programs for deep learning models. Furthermore, compared to conventional computer hardware like central processing units (CPUs), AI hardware can accelerate numerous processes, significantly reducing the time and cost required for algorithm training and execution.

Furthermore, with the growing popularity of AI and machine learning models, demand for acceleration solutions has increased. As a result, companies like Nvidia, the world's leading GPU manufacturer, have witnessed substantial growth. In June 2023, The Washington Post reported that Nvidia's market value had surpassed $1 trillion, exceeding that of Tesla and Meta. Nvidia's success highlights the significance of AI hardware in today's technology landscape.

If you're familiar with what edge computing is, you likely have some understanding of edge computing chips. These specialized processors are designed specifically to run AI models at the network's edge. With edge computing chips, users can process data and perform crucial analytical operations directly at the source of the data, eliminating the need for data transmission to centralized systems.

The applications for edge computing chips are diverse and extensive. They find utility in self-driving cars, facial recognition systems, smart cameras, drones, portable medical devices, and other real-time decision-making scenarios.

The advantages of edge computing chips are significant. Firstly, they greatly reduce latency by processing data near its source, enhancing the overall performance of AI ecosystems. Additionally, edge computing enhances security by minimizing the amount of data that needs to be transmitted to the cloud.
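The latency point can be made concrete with a small sketch. The snippet below is purely illustrative (plain Python with a stand-in for a real on-device model, not an actual edge SDK): inference runs where the data is produced, and only small event records leave the device instead of every raw frame.

```python
# Illustrative sketch (not a real edge SDK): an edge chip runs inference
# locally and transmits only small event records, instead of streaming
# every raw frame to a centralized system.

def fake_inference(frame):
    # stand-in for an on-device AI model: flag frames whose mean exceeds 0.5
    return sum(frame) / len(frame) > 0.5

def edge_pipeline(frames):
    events = []
    for i, frame in enumerate(frames):
        if fake_inference(frame):   # decision made at the data source
            events.append(i)        # only the event index leaves the device
    return events

frames = [[0.1, 0.2], [0.9, 0.8], [0.4, 0.3], [0.7, 0.9]]
print(edge_pipeline(frames))  # -> [1, 3]
```

Because only the event indices are transmitted, both the round trip to a data center and the exposure of raw data are avoided, which is the source of the latency and security gains described above.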


Some might wonder, "What is quantum computing, and is it even real?" Quantum computing is indeed a real and advanced computing system that operates based on the principles of quantum mechanics. While classical computers use bits, quantum computing utilizes quantum bits (qubits) to perform computations. These qubits enable quantum computing systems to process large datasets more efficiently, making them highly suitable for AI, machine learning, and deep learning models.

The applications of quantum hardware have the potential to revolutionize AI algorithms. For example, in drug discovery, quantum hardware can simulate the behavior of molecules, aiding researchers in accurately identifying new drugs. Similarly, in material science, it can contribute to climate change predictions. The financial sector can benefit from quantum hardware by developing price prediction tools.


Application Specific Integrated Circuits (ASICs) are designed for targeted tasks like image processing and speech recognition (though you may have heard about ASICs through cryptocurrency mining). Their purpose is to accelerate AI procedures to meet the specific needs of your business, providing an efficient infrastructure that enhances overall speed within the ecosystem.

ASICs are more cost-effective than traditional central processing units (CPUs) or graphics processing units (GPUs), owing to their power efficiency and superior performance on the tasks they are built for. As a result, ASICs facilitate AI algorithms across various applications.

These integrated circuits can handle substantial volumes of data, making them instrumental in training artificial intelligence models. Their applications extend to diverse fields, including natural language processing of texts and speech data. Furthermore, they simplify the deployment of complex machine-learning mechanisms.

Neuromorphic hardware represents a significant advancement in computer hardware technology, aiming to mimic the functioning of the human brain. This innovative hardware emulates the human nervous system and adopts a neural network infrastructure, operating with a bottom-up approach. The network comprises interconnected processors, referred to as neurons.

In contrast to traditional computing hardware that processes data sequentially, neuromorphic hardware excels at parallel processing. This parallel processing capability enables the network to simultaneously execute multiple tasks, resulting in improved speed and energy efficiency.

Furthermore, neuromorphic hardware offers several other compelling advantages. It can be trained with extensive datasets, making it suitable for a wide range of applications, including image detection, speech recognition, and natural language processing. Additionally, the accuracy of neuromorphic hardware is remarkable, as it rapidly learns from vast amounts of data.


A Field Programmable Gate Array (FPGA) is an advanced integrated circuit that offers valuable benefits for implementing AI software. These specialized chips can be customized and programmed to meet the specific requirements of the AI ecosystem, earning them the name "field-programmable."

FPGAs consist of configurable logic blocks (CLBs) that are interconnected and programmable. This inherent flexibility allows for a wide range of applications in the field of AI. In addition, these chips can be programmed to handle operations of varying complexity levels, adapting to the system's specific needs.

Operating like a read-only memory chip but with a higher gate capacity, FPGAs offer the advantage of re-programmability. This means they can be programmed multiple times, allowing for adjustments and scalability per the evolving requirements. Furthermore, FPGAs are more efficient than traditional computing hardware, offering a robust and cost-effective architecture for AI applications.
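The re-programmability described above can be sketched with a toy lookup-table (LUT) model. Real FPGAs are configured with hardware description languages and vendor toolchains, so the Python below is purely illustrative: the "program" is just a truth table held in configuration memory, and rewriting it turns the same hardware into a different logic gate.

```python
# Toy model of a 2-input FPGA lookup table (LUT). The "program" is just the
# truth table held in configuration memory, so the same circuit can be
# reconfigured in the field. Illustrative only, not vendor tooling.

class LUT2:
    def __init__(self, truth_table):
        self.table = truth_table       # 4 output bits, indexed by inputs (a, b)

    def reprogram(self, truth_table):  # "field-programmable": rewrite the config
        self.table = truth_table

    def __call__(self, a, b):
        return self.table[(a << 1) | b]

lut = LUT2([0, 0, 0, 1])     # configured as an AND gate
print(lut(1, 1))             # -> 1
lut.reprogram([0, 1, 1, 0])  # same "hardware", now an XOR gate
print(lut(1, 1))             # -> 0
```

An FPGA's configurable logic blocks are, in essence, grids of such LUTs plus programmable interconnect, which is why the same chip can be reprogrammed multiple times as requirements evolve.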

In addition to their customization and performance advantages, FPGAs provide enhanced security, making them reliable for secure AI implementations.

AI hardware is on the cusp of transformative advancements. Evolving AI applications demand specialized systems to meet computational needs. Innovations in processors, accelerators, and neuromorphic chips prioritize efficiency, speed, energy savings, and parallel computing. Integrating AI hardware into edge and IoT devices enables on-device processing, reduced latency, and enhanced privacy. Convergence with quantum computing and neuromorphic engineering unlocks the potential for exponential power and human-like learning.

The future of AI hardware holds the promise of powerful, efficient, and specialized computing systems that will revolutionize industries and reshape our interactions with intelligent technologies.

Originally posted here:
The 5 Most Promising AI Hardware Technologies - MUO - MakeUseOf

Quantum computing: The five biggest breakthroughs – Engineers Ireland

Quantum computing is a revolutionary technology already making waves in many industries, such as drug discovery, cryptography, finance, and logistics. It works by exploiting quantum mechanical phenomena to perform complex computations in a fraction of the time classical computers require. Two main quantum mechanical phenomena drive quantum computers' speed and computational prowess: superposition and entanglement.

Unlike classical computers, which operate on binary bits (0 and 1), quantum computers operate on quantum bits or qubits. Qubits can exist in a state of superposition. This means that any qubit has some probability of existing simultaneously in the 0 and 1 states, exponentially increasing the computational power of quantum computers.

Another unique property that qubits have is their ability to become entangled. This means that two qubits, no matter how physically far, are correlated so that knowing the state of one particle automatically tells us something about its companion, even when they are far apart. This correlation can be harnessed for processing vast amounts of data and solving complex problems that classical computers cannot.
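These two ideas can be seen concretely in a small state-vector sketch. The numpy snippet below is illustrative only (real work would use a framework such as Qiskit or Cirq): the qubit's amplitudes give the measurement probabilities, and the Bell state shows why the two qubits' outcomes are perfectly correlated.

```python
import numpy as np

# State-vector sketch of superposition and entanglement (illustrative only).

# Superposition: a qubit holds amplitudes, not a definite 0 or 1.
qubit = np.array([1, 1]) / np.sqrt(2)       # (|0> + |1>) / sqrt(2)
print(np.abs(qubit) ** 2)                   # -> [0.5 0.5], equal odds of 0 or 1

# Entanglement: the Bell state (|00> + |11>) / sqrt(2) over two qubits.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # basis order: 00, 01, 10, 11
print(np.abs(bell) ** 2)                    # -> [0.5 0.  0.  0.5]
# Outcomes 01 and 10 never occur, so reading one qubit immediately
# determines what the other will read, however far apart the two are.
```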

Classical computers only have the power to simulate phenomena based on classical physics, making it more difficult or slower to solve problems that rely on quantum phenomena. This is where the true importance of quantum computers lies.

Since quantum computers are based on qubits, they can solve problems that are challenging for classical computers and revolutionise many industries. For example, quantum computers can rapidly simulate molecules and chemical reactions, helping researchers discover new drugs and materials with exceptional properties.

Although significant breakthroughs have been made in quantum computing, we are still in the nascent stages of its development.

The objective of quantum supremacy is to demonstrate that a quantum computer can solve a problem that no classical computer can solve in any reasonable length of time, regardless of the problem's practical usefulness. Achieving this goal demonstrates the power of a quantum computer over a classical computer in complex problem-solving.

In October 2019, Google confirmed that it had achieved quantum supremacy using its fully programmable 54-qubit processor, called Sycamore. It solved a sampling problem in 200 seconds that would take a supercomputer nearly 10,000 years. This marked a significant achievement in the development of quantum computing.

Richard Feynman first theorised the idea of using quantum mechanics to perform calculations impossible for classical computers. Image: Unknown/Wikimedia Commons

Since then, many researchers have demonstrated quantum supremacy by solving various sampling problems. The impact of achieving quantum supremacy cannot be overstated. It validates the potential of quantum computing to solve problems beyond the capabilities of classical computers, as first theorised by Richard Feynman in the 1980s.

Apart from sampling problems, other applications have been proposed for demonstrating quantum supremacy, such as Shor's algorithm for factoring integers, which is extremely important in encryption. However, implementing Shor's algorithm for large numbers is not feasible with existing technology, so sampling problems remain the preferred route for demonstrating supremacy.

The most pressing concern with quantum computers is their sensitivity to errors induced by environmental noise and imperfect control. This hinders their practical usability, as data stored on a quantum computer can become corrupted.

Classical error correction relies on redundancy, i.e., repetition. However, quantum information cannot be cloned or copied due to the no-cloning theorem (which states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state). Therefore, a new error correction method is required for quantum computing systems.

QEC for a single qubit. Image: Self/Wikimedia Commons

Quantum error correction (QEC) is a way to mitigate these errors and ensure that the data stored on a quantum computer is error-free, thus improving the reliability and accuracy of quantum computers.

The principle of QEC is to encode the data stored on a quantum computer such that the errors can be detected and corrected without disrupting the computation being performed on it.

This is done using quantum error-correction codes (QECCs). QECCs work by encoding the information onto a larger state space. They further correct the error without measuring the quantum state, thereby preventing the collapse of the quantum state.
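The encoding idea can be illustrated with the classical skeleton of the 3-qubit bit-flip code, the textbook QECC. The sketch below is a simplification: it tracks only classical bit values, whereas a real quantum code also protects superpositions and extracts the syndrome with ancilla qubits. The key point survives, though: the parity checks locate the error without ever reading the encoded value itself.

```python
# Classical skeleton of the 3-qubit bit-flip code. Parity checks (the
# "syndrome") locate a single flipped bit without reading the encoded value,
# mirroring how QECCs correct errors without collapsing the quantum state.

def encode(bit):
    return [bit, bit, bit]               # logical 0 -> 000, logical 1 -> 111

def syndrome(block):
    # two parity checks: compare neighbours, never the data value itself
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block):
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(block))
    if flip is not None:                 # syndrome (0, 0) means no error
        block[flip] ^= 1
    return block

block = encode(1)      # [1, 1, 1]
block[0] ^= 1          # noise flips the first bit -> [0, 1, 1]
print(correct(block))  # -> [1, 1, 1], the error is located and undone
```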

The first experimental demonstration of QEC was done in 1998 with nuclear magnetic resonance qubits. Since then, several experiments demonstrating QEC have been performed using linear optics and trapped ions, among other platforms.

A significant breakthrough came in 2016, when researchers extended the lifespan of a quantum bit using QEC. Their research showed the advantage of using hardware-efficient qubit encoding over traditional QEC methods for improving the lifetime of a qubit.

The detection and elimination of errors is critical to developing realistic quantum computers. QEC handles errors in the stored quantum information, but what about the errors after performing operations? Is there a way to correct those errors and ensure that the computations are not useless?

Fault-tolerant quantum computing is a method to ensure that these errors are detected and corrected using a combination of QECCs and fault-tolerant gates. This ensures that errors arising during the computations don't accumulate and render them worthless.

Quantum computing features. Image: Akash Sain/iStock

The biggest challenge in achieving fault-tolerant quantum computing is the need for many qubits. QECCs themselves require a lot of qubits to detect and correct errors.

Additionally, fault-tolerant gates also require a large number of qubits. However, two independent theoretical studies, published in 1998 and 2008, proved that fault-tolerant quantum computers can be built. This has come to be known as the threshold theorem, which states that if the physical error rates of a quantum computer are below a certain threshold, the logical error rate can be suppressed to arbitrarily low values.

No experiment has yet demonstrated fully fault-tolerant quantum computing, owing to the high number of qubits needed. The closest we've come to an experimental realisation is a 2022 study published in Nature, demonstrating fault-tolerant universal quantum gate operations.

We have seen teleportation one too many times in science fiction movies and TV shows. But are any researchers close to making it a reality? Well, yes and no. Quantum teleportation allows for transferring one quantum state from one physical location to another without physically moving the quantum state itself. It has a wide range of applications, from secure quantum communication to distributed quantum computing.

Quantum teleportation was first investigated in 1993 by scientists who were using it as a way to send and receive quantum information. It was experimentally realised only four years later, in 1997, by two independent research groups. The basic principle behind quantum teleportation is entanglement (when two particles remain connected even when separated by vast distances).

Since 1997, many research groups have demonstrated the quantum teleportation of photons, atoms, and other quantum particles. It is the only real form of teleportation that exists.
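The protocol itself fits in a small state-vector simulation. The numpy sketch below is illustrative only, not a framework implementation; for brevity it post-selects the measurement outcome 00 on the sender's two qubits, the branch of the protocol that needs no corrective gate on the receiver's side.

```python
import numpy as np

# Minimal state-vector sketch of the teleportation protocol (illustrative).
# Qubit 0 holds the state to teleport; qubits 1 and 2 are the shared Bell
# pair. Amplitude index order: q0 q1 q2.

alpha, beta = 0.6, 0.8                       # arbitrary state a|0> + b|1>
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>) / sqrt(2)
psi = np.kron(np.array([alpha, beta]), bell)

# CNOT, control q0, target q1: flip q1 wherever q0 = 1
psi[[4, 5, 6, 7]] = psi[[6, 7, 4, 5]]

# Hadamard on q0
top, bot = psi[:4].copy(), psi[4:].copy()
psi[:4], psi[4:] = (top + bot) / np.sqrt(2), (top - bot) / np.sqrt(2)

# Measure q0 and q1, post-selecting outcome 00 (no correction needed):
# keep the amplitudes with q0 = q1 = 0 and renormalize.
teleported = psi[:2] / np.linalg.norm(psi[:2])
print(teleported)                            # -> [0.6 0.8], q2 holds the state
```

Note what moved: only two classical measurement bits travel from sender to receiver, yet the receiver's qubit ends up in the original state, which is never physically transported.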

In fact, the 2022 Nobel Prize in Physics was awarded to three scientists, Alain Aspect, John Clauser, and Anton Zeilinger, for their experiments with entangled photons. Their work demonstrated quantum entanglement and showed it could be used to teleport quantum information from one photon to another.

Quantum teleportation is the cornerstone for building a quantum internet. This is because it enables the distribution of entanglement over long distances.

Another important application of quantum teleportation is enabling remote quantum operations, meaning that a quantum computation can be performed on a distant processor without transmitting the qubits. This could be useful for secure communication and for performing quantum computations in inaccessible or hostile environments.

Topology is a branch of mathematics concerned with studying the properties of shapes and spaces preserved when deformed. But what does it have to do with quantum computing?

In essence, topological quantum computing is a theoretical model that uses quasiparticles called anyons (quasiparticles in two-dimensional space) for encoding and manipulating qubits.

The method is founded on the topological properties of matter, and in the case of anyons, the world lines (the path that an object traces in four-dimensional spacetime) of these particles form braids. These braids then make up the logic gates which are the building blocks of computers.

No experimental studies demonstrate topological quantum computing. Image: FMNLab/Wikimedia Commons

Topological qubits are protected against local perturbations and can be manipulated with high precision, making them less susceptible to decoherence. Additionally, topological quantum computing is more resistant to errors due to its inherent redundancy and topological protection, making it a promising candidate for fault-tolerant quantum computing.

Most topological quantum computing research is theoretical; currently, no studies provide substantial experimental support for it. But developments in this area of research are vital for building practical and scalable quantum computers.

With a mix of theoretical and experimental demonstrations, quantum computing is still in the early stages of research and development. These developments can potentially revolutionise several industries and academic disciplines, including financial services, materials science, cryptography, and artificial intelligence.

Although more research remains to be done, the outlook for quantum computing is promising. We can anticipate further developments and innovations in the years to come.

Continued here:
Quantum computing: The five biggest breakthroughs - Engineers Ireland

Accelerating the Accelerator: Scientist Speeds CERN’s HPC With … – Nvidia

Editor's note: This is part of a series profiling researchers advancing science with high performance computing.

Maria Girone is expanding the world's largest network of scientific computers with accelerated computing and AI.

Since 2002, the Ph.D. in particle physics has worked on a grid of systems across 170 sites in more than 40 countries that support CERN's Large Hadron Collider (LHC), itself poised for a major upgrade.

A high-luminosity version of the giant accelerator (HL-LHC) will produce 10x more proton collisions, spawning exabytes of data a year. That's an order of magnitude more than it generated in 2012, when two of its experiments uncovered the Higgs boson, a subatomic particle that validated scientists' understanding of the universe.

Girone loved science from her earliest days growing up in Southern Italy.

"In college, I wanted to learn about the fundamental forces that govern the universe, so I focused on physics," she said. "I was drawn to CERN because it's where people from different parts of the world work together with a common passion for science."

Tucked between Lake Geneva and the Jura mountains, the European Organization for Nuclear Research is a nexus for more than 12,000 physicists.

Its 27-kilometer ring is sometimes called the world's fastest racetrack because protons careen around it at 99.9999991% of the speed of light. Its superconducting magnets operate near absolute zero, creating collisions that are briefly millions of times hotter than the sun.

In 2016, Girone was named CTO of CERN openlab, a group that gathers academic and industry researchers to accelerate innovation and tackle future computing challenges. It works closely with NVIDIA through its collaboration with E4 Computer Engineering, a specialist in HPC and AI based in Italy.

In one of her initial acts, Girone organized CERN openlab's first workshop on AI.

Industry participation was strong and enthusiastic about the technology. In their presentations, physicists explained the challenges ahead.

"By the end of the day, we realized we were from two different worlds, but people were listening to each other and enthusiastically coming up with proposals for what to do next," she said.

Today, the number of publications on applying AI across the whole data processing chain in high-energy physics is rising, Girone reports. The work attracts young researchers who see opportunities to solve complex problems with AI, she said.

Meanwhile, researchers are also porting physics software to GPU accelerators and using existing AI programs that run on GPUs.

"This wouldn't have happened so quickly without the support of NVIDIA working with our researchers to solve problems, answer questions and write articles," she said. "It's been extremely important to have people at NVIDIA who appreciate how science needs to evolve in tandem with technology, and how we can make use of acceleration with GPUs."

Energy efficiency is another priority for Girone's team.

"We're working on a number of projects, like porting to lower-power architectures, and we look forward to evaluating the next generation of lower-power processors," she said.

To prepare for the HL-LHC, Girone, named head of CERN openlab in March, seeks new ways to accelerate science with machine learning and accelerated computing. Other tools are on the near and far horizons, too.

The group recently won funding to prototype an engine for building digital twins. It will provide services for physicists, as well as researchers in fields from astronomy to environmental science.

CERN also launched a collaboration among academic and industry researchers in quantum computing. The technology could advance science and lead to better quantum systems, too.

In another act of community-making, Girone was among four co-founders of a Swiss chapter of the Women in HPC group. It will help define specific actions to support women in every phase of their careers.

"I'm passionate about creating diverse teams where everyone feels they contribute and belong. It's not just a checkbox about numbers; you want to realize a feeling of belonging," she said.

Girone was among thousands of physicists who captured some of that spirit the day CERN announced the Higgs boson discovery.

She recalls getting up at 4 a.m. to queue for a seat in the main auditorium. It couldn't hold all the researchers and guests who arrived that day, but the joy of accomplishment followed her and others watching the event from a nearby hall.

"I knew the contribution I made," she said. "I was proud being among the many authors of the paper, and my parents and my kids felt proud, too."


More:
Accelerating the Accelerator: Scientist Speeds CERN's HPC With ... - Nvidia

Scott Wilson Joins Big Rig Media as VP of Digital Marketing … – RV Business

LA QUINTA, Calif., June 5, 2023 – Big Rig Media, the recognized leader in providing high-growth modern marketing solutions to the outdoor hospitality industry, announced today the recent hire of Scott Wilson as Vice President of Digital Marketing.

Scott Wilson

Wilson brings 20 years of marketing experience to the team, with extensive leadership in strategic planning, management, operations and sales, specializing in everything digital, including search engine optimization, Google Ads, content development, messaging, social media, analytics, media buying and production. In addition to the digital space, he has managed large-scale print, theatrical, television and other traditional marketing projects, both domestic and international.

As Vice President of Digital Marketing, Wilson will lead digital marketing strategy, account services and project management for Big Rig Media's client portfolio.

Wilson most recently served as Senior Vice President of Internet Marketing with Scorpion, a technology marketing company serving over 14,000 clients, for over ten years. His responsibilities included providing leadership and strategic development, while implementing best practices in digital marketing platforms.

Prior to his tenure at Scorpion, Wilson served as a Senior Account Manager for Deluxe Media Management where he coordinated and executed marketing efforts across all media platforms, while managing day-to-day fulfillment operations for large studio clients, as well as serving in other capacities. He oversaw clients in the motion picture and television industries both domestic and international.

"We are excited to announce the addition of Scott to our organization to head our digital initiatives," said Big Rig Media Founder and CEO Jeff Beyer. "His vast experience and knowledge in this industry make him an instrumental member of our leadership team, and we look forward to him complementing our client services."

Boasting only five-star reviews and remaining on the forefront of the latest integrated technologies, Big Rig Media is committed to delivering a rapid digital transformation for enhanced sales efficiency. The firm proudly acts as an extension of a client's team, providing unparalleled personalized service with a staff possessing incomparable expertise in this market.

For additional information and client case studies, visit www.bigrigmedia.com, or call 866.524.4744.

See the original post:
Scott Wilson Joins Big Rig Media as VP of Digital Marketing ... - RV Business

US Surgeon General Sounds the Alarm on Harmful Social Media … – Boston University

Talking with US Senator Edward Markey (Hon.'04), Vivek Murthy says smartphones and social media apps exacerbate young people's mental health challenges

Nearly one in three high school girls in the United States seriously contemplated suicide in 2021, and nearly three in five teen girls felt persistently sad or hopeless, the highest level reported in nearly a decade, according to recent data from the Centers for Disease Control and Prevention, US Surgeon General Vivek Murthy noted during an appearance at the School of Public Health on June 5.

"One in three adolescent girls who consider taking their life: that is an extraordinary number that we should never allow ourselves to get used to or numb to," Murthy said.

He joined US Senator Ed Markey (Hon.'04) for a Public Health Conversation at the school to discuss solutions to the urgent mental health crisis that is plaguing the nation's youth at a level unlike any previous generation of young people. More than 1,000 people attended or tuned in to the event, which was held in person and online.

In addition to gun violence and climate change, excessive social media use and social isolation are contributing to the worsening mental health among today's children and teens, said Murthy, who along with Markey has prioritized improving youth mental health. In extraordinary moves last month, the Surgeon General's Office issued separate public health warnings about the harms of social media, which drives insecurities, depression, anxiety, and low self-esteem, as well as the nation's growing epidemic of loneliness.

In speaking with youth across the nation, Murthy said, teens have told him that using social media platforms such as Instagram and TikTok makes them feel worse about themselves and their friendships, and that they can't seem to control the time they spend on these sites.

"These platforms are often designed to maximize the amount of time that our kids are spending on these platforms," he said. "One in three adolescents are now saying that they stay up to midnight or later on weeknights on their screens, and that's predominantly time using social media. What I care about as a parent and as a doctor is maximizing the health and well-being of my kids and all of our kids, and these platforms need to be designed for that outcome."

Markey noted that he has secured $15 million in funding to support research by the National Institutes of Health that will address the impact of technology and media on children and teens, but said much more needs to be done. He recently reintroduced the Children and Teens Online Privacy Protection Act (COPPA 2.0), which would prohibit internet companies from collecting personal data of users aged 13 to 16 without consent, ban targeted advertising to children and teens, and establish a Digital Marketing Bill of Rights for teens and a youth marketing and privacy division at the Federal Trade Commission.

"Big tech is a big problem," Markey said. "Big-tech CEOs leverage data about kids and teens and use it against them, serving up an endless stream of toxic content that grabs their attention and keeps them scrolling. We need a definitive statement by the federal government of the impact that social media is having upon the children in our country."

Both officials said the onus should not be placed on parents to address these issues on their own.

"Parents everywhere are seeing kids in crisis. We cannot put the entire burden of managing social media on the shoulders of parents," Murthy said. "When a child is ready to drive, we don't tell a parent, 'Why don't you go out and inspect the brakes by yourself?' because that's not a reasonable expectation."

The two speakers urged a number of legislative solutions to increase mental health resources and improve access to care, including training more mental health providers, expanding insurance coverage for this care, reducing stigma around mental health, and training kids to maintain healthy relationships.

"A lot of our kids don't get training or the skills to handle conflict or to understand emotions," Murthy said, adding that it takes an average of 11 years from when a child exhibits mental health symptoms to when they receive treatment in the United States. "I actually think those skills are just as important as learning to write and do math in terms of your success in life and your overall health and well-being. And we've got to make it easier for people to recognize that there's no shame in admitting that you need help."

Originally posted here:
US Surgeon General Sounds the Alarm on Harmful Social Media ... - Boston University