Archive for the ‘Artificial Intelligence’ Category

Covid-19 driven advances in automation and artificial intelligence risk exacerbating economic inequality – The BMJ

Anton Korinek and Joseph E Stiglitz make the case for a deliberate effort to steer technological advances in a direction that enhances the role of human workers

The covid-19 pandemic has necessitated interventions that reduce physical contact among people, with dire effects on our economy. By some estimates, a quarter of all jobs in the economy require physical interaction and are thus directly affected by the pandemic. This is highly visible in the medical sector, where workers and patients often come into close contact with each other and risk transmitting disease. In several countries medical workers have experienced some of the highest incidences of covid-19. Moreover, as patients were advised to postpone non-essential visits and procedures, medical providers in many countries have also experienced tremendous income losses.1

In economic language, covid-19 has added a shadow cost on labour that requires proximity. This shadow cost reflects the dollar equivalent of all the costs associated with the increased risk of disease transmission, including the costs of the adaptations required for covid-19. It consists of losses of both quality adjusted life days from increased morbidity and quality adjusted life years from increased mortality, as well as the cost of measures to reduce these risks, such as extra protective equipment and distancing measures for workers. Some sectors will incur increased costs from changing the physical arrangements in which production and other interactions occur so that there can be social distancing. It is, of course, understandable that we take these measures to reduce the spread of the disease: by some estimates, the social cost of one additional case of covid-19 over the course of the pandemic is $56 000 (£40 000; €46 000) to $111 000.2

This shadow cost on labour is also accelerating the development and adoption of new technologies to automate human work. One example is the increasing use of telemedicine. Telemedicine is currently provided in a way that changes the format of delivery of care but leaves the role of doctors largely unchanged. However, it reduces the need for workers who provide ancillary services and who typically have lower wages than doctors (for example, front office or cleaning staff), thus increasing inequality. Moreover, going forward, it may also make it possible to provide medical services from other countries, which has hitherto been difficult, and hence reduce demand for doctors in high income countries.3

Complementary investments, for example internet connected devices such as thermometers, fingertip pulse oximeters, blood pressure cuffs, digital stethoscopes, and electrocardiography devices could further revolutionise the delivery of medical care and may also reduce demand for nurses.45 Such technologies have already made it possible to establish virtual wards for patients with covid-19.6 But even once covid-19 is controlled, medical providers will take into account the risk of future pandemics when choosing which technologies to invest in. Looking further ahead, technologies powered by artificial intelligence (AI), such as Babylon Health's chatbot, foreshadow a possible future in which medical functions traditionally done by doctors may also be automated. This would reduce labour demand and generate a whole new set of potential problems.7

In the past, cybersecurity risks such as computer viruses have held back automation, especially in the medical sector, in which privacy and security are of particular concern. It is ironic that a human virus is now levelling the playing field and forcing automation because it has lessened the appetite for employing humans.

These developments have the potential to reduce labour demand and wages across the economy, including in healthcare. However, making labour redundant is not inevitable. Technological progress in AI and related fields can be steered so that the benefits of advances in technology are widely shared.

The fear of job losses has accompanied technological progress since the Industrial Revolution.8 The history of progress has been one of relentless churning in the labour market, whereby progress made old jobs redundant and created new ones. This churning has always been painful for displaced workers, but economists used to believe that the new jobs created by progress would pay better than the ones that became redundant, so that progress would make workers better off on balance, once they had gone through the adjustment.9

The most useful way to analyse the effects of a new technology on labour markets is not to look at whether it destroys jobs in the short term: many technologies have done so, even though they turned out to be beneficial for workers in the long run. Instead, it is most useful to categorise the effects of technological progress according to whether they are labour using or labour saving, that is, whether they increase or decrease overall demand for labour at given wages and prices. For example, automating many of the processes involved in medical consultations, as in the example of telemedicine, is likely to be labour saving, whereas new medical treatments to improve patients' health are likely to be labour using if they are performed by humans.10 In the long run, as markets adjust, changes in labour demand are mainly reflected in wages, not in the number of jobs created or lost.

Overall, technological progress since the Industrial Revolution has been labour using: it increased labour demand by leaps and bounds, leading to a massive increase in average wages and material wealth in advanced countries. The reason was that innovation has increased the productivity of workers (making them able to produce more per hour) rather than replacing labour with robots.

However, more recently, the economic picture has been less benign: a substantial proportion of workers in the US (for example, production and non-supervisory workers) earn lower wages now (when adjusted for inflation) than in the 1970s.11 Moreover, although it is not clear whether this finding holds in the rest of the world, the share of economic output in the US going to workers rather than the owners of capital has declined from 65% to less than 60% over the past half century.1213 Lower skilled workers have been the most affected. Many recent automation technologies have displaced human workers from their jobs in a way that reduced overall demand for human labour.14

Advances in AI may contribute to more shared prosperity,6 but there is also a risk that they accelerate the trend of the past four decades. The defining attribute of AI is to automate the last domain in which human workers had a comparative advantage over machines: our thinking and learning.15 And if the covid-19 pandemic adds extra incentives for labour saving innovation, the economic effects would be even more painful than in past episodes of technological progress. When the economy is expanding and progress is biased against labour, workers may still experience modest increases in their incomes even though the relative share of output that they may earn is declining. However, at a time when economic output across the globe is falling because of the effects of covid-19, a decline in the relative share of output earned by workers implies that their incomes are falling at faster rates than the rest of the economy. And unskilled manual workers who are at the lower rungs of the earnings distribution are likely to be most severely affected.

An additional aspect of digital technologies such as AI is that they generate what is often called a superstar phenomenon, which may lead to further increases in inequality. Digital technologies can be deployed at almost negligible cost once they have been developed.16 They therefore give rise to natural monopolies, leading to dominant market positions whereby superstar firms serve a large fraction of the market, either because they are better than any competitors or because no one even attempts to duplicate their efforts and compete. These superstar effects are well known from entertainment industries. In the music industry, for example, the superstars have hundreds of millions of fans and reap proportionate rewards, but the incomes of musicians further down the list decline quickly. Most of the rewards flow to the top. And empirical work documents that these superstar effects have played an important role in the rise in inequality in recent decades.17

A similar mechanism may soon apply in medicine, accelerated by the covid-19 pandemic. A commonly cited example is radiology. If one of the world's top medical imaging companies develops an AI system that can read and robustly interpret mammograms better than humans, it would become the superstar in the sector and would displace the task of reading mammograms for thousands of radiologists. Since the cost of processing an additional set of images is close to zero, any earnings after the initial investment in the system has been recouped would earn high profit margins, and the company is likely to reap substantial economic benefits, at least as long as its intellectual property is protected by patents or trade secrets. (The design of the intellectual property regime is an important determinant of the extent of the inequality generated by the economic transformations discussed here.) The more widespread such diagnostic and decision making tools become, the more the medical sector will turn into a superstar industry.
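
The economics of that superstar dynamic come down to simple arithmetic: once the fixed development cost is sunk, near-zero marginal cost means profit margins rise steeply with volume. A minimal illustration with entirely made-up numbers (none of these figures come from the article):

```python
# Illustrative only: hypothetical costs for an AI mammography reader.
fixed_cost = 50_000_000   # assumed up-front development cost of the system
marginal_cost = 0.10      # assumed cost of processing one additional image set
price = 5.00              # assumed price charged per reading

margins = []
for volume in (10_000_000, 100_000_000, 1_000_000_000):
    total_cost = fixed_cost + marginal_cost * volume
    revenue = price * volume
    margins.append(round((revenue - total_cost) / revenue, 2))

# Margins climb steeply once the fixed cost is recouped.
print(margins)
```

At low volume the firm loses money; at scale its margin approaches 100%, which is why the first firm to reach a working system can capture most of the sector's rewards.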

Economic forces are continuing to drive rapid advances in AI, and covid-19 is adding strong tailwinds to these forces. The task now is to shape the forms that these advances will take to ensure that their effect on both patients and medical workers is desirable. The stakes are high since the choices that we make now will have long lasting effects.

We have a good sense of what happens at one extreme: if the direction of progress is determined purely by market forces without regard for shared human wellbeing, our technological future will be shaped by the shortcomings and failures of the market.1518

Markets may provide a force towards efficiency but are blind to distributional concerns, such as the deleterious consequences of labour saving progress or the superstar phenomenon. Responsible decision makers should pursue technologies that maintain an active role for humans and preserve a role for medical workers of all educational levels. For example, medical AI systems can be designed to be human centred tools that provide decision support or they can be designed to automate away human tasks.19 They should also focus on providing high quality care and value to patients with limited financial means rather than just serving patients according to their ability to pay.

Market failures are pervasive in both innovation and healthcare, and even more so at the intersection of the two. Markets encourage incremental advances that may not provide much value to society. They do not adequately provide incentives for larger scale breakthroughs that are most socially beneficial. And as the covid-19 pandemic has shown, they undervalue the benefits of preventive actions, including preventive actions against small probability but existential risks.

Market failures are sometimes exacerbated by government policies, which increase the cost of labour relative to capital, disadvantaging humans relative to machines. Examples include the low taxes on capital (especially capital gains) relative to labour and the artificially low interest rates that have prevailed since the 2008 financial crisis (although low interest rates are also boosting aggregate demand, which is beneficial for workers).

Our institutions and norms interact in important ways with market incentives for technological progress. Most visibly, our system of intellectual property rights, by providing temporary monopoly power to inventors, is meant to facilitate innovation. But often it has the opposite effect: inhibiting access to existing knowledge and making the production of new ideas more difficult. Moreover, by inhibiting competition, both innovation and access to the benefits of the advances that occur are reduced. These are arguments for keeping the scope and length of intellectual property rights limited.

Finally, markets are inherently bad at delivering the human element that is so important in medical care. Markets do not adequately reward the empathy and compassion that medical workers provide to their patients and, in fact, provide incentives to scrimp on them. If our technological choices are driven solely by the market, they will reflect the same bias and patient care is likely to be affected. It is essential that decision makers act to ensure that our technological choices reflect our human values.20

The covid-19 pandemic has increased the risk and raised the cost of direct physical contact between humans, as is particularly visible in healthcare

This has accelerated advances in AI and other forms of automation to decrease physical contact and mitigate the risk of disease transmission

These technological advances benefit technologists but could reduce labour demand more broadly and slow wage growth, increasing inequality between workers and the owners of technology

These forces can be counteracted by intentionally steering technological progress in AI to complement labour, increasing its productivity

Contributors and sources: AK and JES wrote this article jointly by invitation from Sheng Wu at WHO. The two have collaborated on a series of papers investigating the effects of advances in AI on economic inequality, on which this analysis is based. All authors edited the manuscript before approving the final version. AK is guarantor.

Competing interests: We have read and understood BMJ policy on declaration of interests and have the following interests to declare: AK and JES are supported by a grant from the Institute for New Economic Thinking. AK serves as a senior adviser to the Partnership on AI's shared prosperity initiative working on related topics. JES is chief economist and senior fellow at the Roosevelt Institute working on a related theme.

Provenance and peer review: Commissioned; externally peer reviewed.

This collection of articles was proposed by the WHO Department of Digital Health and Innovation and commissioned by The BMJ. The BMJ retained full editorial control over external peer review, editing, and publication of these articles. Open access fees were funded by WHO.

This is an Open Access article distributed under the terms of the Creative Commons Attribution IGO License (https://creativecommons.org/licenses/by-nc/3.0/igo/), which permits use, distribution, and reproduction for non-commercial purposes in any medium, provided the original work is properly cited.

Korinek A. Labor in the age of automation and AI. Policy brief. Economists for Inclusive Prosperity, 2019.

Korinek A, Ng DX. Digitization and the macro-economics of superstars. Working paper. University of Virginia, 2019.

Korinek A, Stiglitz JE. Steering technological progress. Working paper. University of Virginia, 2021.


Artificial Intelligence (AI) In Retail Market Value Expected To Reach US$ 6,762.3 Million By 2027: Acumen Research And Consulting – GlobeNewswire

Acumen Research and Consulting, a global provider of market research studies, recently published a report titled "Artificial Intelligence (AI) in Retail Market - Global Industry Analysis, Market Size, Opportunities and Forecast, 2020-2027".

LOS ANGELES, March 10, 2021 (GLOBE NEWSWIRE) -- The Global Artificial Intelligence (AI) In Retail Market is expected to grow at a CAGR of around 34.9% from 2020 to 2027 and reach the market value of over US$ 6,762.3 Mn by 2027.

By region, North America dominates the AI in retail market owing to surging investment in retail development activities and the presence of a large pool of tech startups. Within North America, the US is considered one of the early adopters of innovative technologies involving AI, augmented reality, virtual reality, and robotics, and the presence of prominent players is one of the key factors behind the region's growth. Europe holds the second largest share of the artificial intelligence (AI) in retail market; the region's growth is driven by a large pool of retail supply chains, particularly in the apparel industry, that are applying artificial intelligence research, which in turn supports the global growth of AI in retail.


Furthermore, APAC is projected to record the highest CAGR in the forthcoming years, as retail companies in the emerging Asia-Pacific region take up AI-oriented initiatives. In addition, Latin America and the Middle East & Africa are projected to demonstrate significant growth over the forecast period, with the development of customer engagement and advanced transition platforms driving healthy adoption of AI-driven retail solutions across Latin America.

Market Drivers

Advancements in virtual fitting rooms augment the overall growth of the artificial intelligence (AI) in retail market

Integrating artificial intelligence with digital mirrors enables buyers to try on a variety of dresses, eyewear, accessories, and other products without actually wearing them. Combined with advanced technologies such as augmented and virtual reality, AI provides real-time simulation for virtual dressing solutions. Such technology enhances customer experience and engagement in both online and offline channels.

Restraints

Well established retailers are trying every possible way to improve customer engagement; however, certain factors are limiting the growth rate of the AI in retail market. Prominent suppliers and global retailers such as Wal-Mart have already deployed AI-based systems in their shops and online portals. Small and medium sized retailers (SMEs), however, remain far from the technology owing to a lack of infrastructure and an absence of skilled expertise. The high implementation cost associated with retail AI solutions is a major barrier that limits adoption among small retailers and restrains the worldwide growth of the artificial intelligence (AI) in retail market.


Market Trend

Product optimization

Product optimization and planning will grow at a faster pace in the AI in retail market globally. Advancements in big data analytics drive the growing adoption of AI-enabled devices and services across different industrial domains and verticals. According to the Consumer Technology Association, AI in retail offers benefits including cost savings, increased productivity, faster resolution of business problems, faster delivery of new products and services, and rising innovation. These factors have a positive influence and help improve customer analytics and behavioural insight, raising the significance of product optimization.

Segmental Outlook

The artificial intelligence (AI) in retail market is segmented based on technology, deployment model, and application. By technology, the market is segmented into machine learning, natural language processing, and others. By deployment model, it is bifurcated into cloud and on-premise. Furthermore, by application, the market is segmented into predictive merchandising, programmatic advertising, in-store visual monitoring and surveillance, location-based marketing, and others.


Competitive Landscape

The prominent players of the AI in retail industry include International Business Machines Corporation (IBM), Microsoft Corporation, Amazon Web Services, Oracle Corporation, SAP SE, Intel Corporation, NVIDIA Corporation, Google LLC, and Sentient Technologies, among others.




Can artificial intelligence predict whether someone will die from COVID? – The Jerusalem Post

Artificial intelligence can predict with up to 90% accuracy whether someone will die from the novel coronavirus before they are even infected, a group of scientists from the University of Copenhagen Faculty of Science have found in a study published in the journal Nature.

Machine learning, artificial-intelligence-based computer algorithms that improve automatically through experience with collected data, was developed during the study and was found to be able to predict the risks at different stages of illness. The researchers studied 3,944 positive cases in Denmark, used positive cases from the UK Biobank for "external validation", and took common risk factors such as age, BMI, and hypertension into account to formulate the algorithm. The AI model predicted risk of death at different stages: at diagnosis, at hospital admission, and at intensive care unit (ICU) admission.

Of the 3,944 patients tracked for the study, 324 died of covid-19. The men who died were all between 73 and 87 years old, with clear signs of high blood pressure and BMI impacting the results. This group of men proved to have the highest risk of mortality, and so the AI program would predict that men in that age range with high blood pressure and high BMI are at higher risk. Surprisingly, some of the top risk features "shifted towards markers of shock and organ dysfunction in ICU patients" rather than the aforementioned common risk factors.
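
The kind of model the study describes (common risk factors in, a mortality risk out) can be sketched with a toy classifier. This is a hedged illustration on synthetic data: the feature effects, coefficients, and the choice of logistic regression are assumptions for the sketch, not the Copenhagen group's actual algorithm.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic cohort: risk rises with age, BMI, and hypertension by construction.
rng = np.random.default_rng(0)
n = 4000
age = rng.integers(20, 95, n).astype(float)
bmi = rng.normal(27, 5, n)
hypertension = rng.integers(0, 2, n).astype(float)
logit = 0.08 * (age - 60) + 0.05 * (bmi - 27) + 0.8 * hypertension - 2.0
died = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, bmi, hypertension])
model = LogisticRegression(max_iter=1000).fit(X, died)

# Predicted mortality risk: an 80-year-old with hypertension and high BMI
# versus a 30-year-old with neither risk factor.
high = model.predict_proba([[80.0, 32.0, 1.0]])[0, 1]
low = model.predict_proba([[30.0, 22.0, 0.0]])[0, 1]
print(high > low)
```

A real clinical model would be validated on an external cohort (as the study does with UK Biobank) rather than trusted on training data alone.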


The study developed an algorithm which managed to predict the risk of death and the findings were further backed by the results in the external validation cohort.

Such technology could help hospitals and medical care facilities throughout the world take extra preventive measures and may help prioritize some patients over others, thereby preventing unnecessarily high mortality rates.

This is not the first study to present the potential use of machine learning in taking preventative measures amid the coronavirus pandemic. The Copenhagen study, however, points out that these studies focused on patients already admitted to the hospital while it is unclear "whether the classification ability transfers to other healthcare systems." Another concern was that they were not entirely accurate machine learning algorithms because they did not take milder cases into account.

In addition, the previous studies, according to the researchers, were based on Chinese models, which are vulnerable to bias.


IBM uses artificial intelligence to develop potential breakthroughs in antibiotics – WRAL Tech Wire

RESEARCH TRIANGLE PARK - IBM scientists have used artificial intelligence to help speed up the development of molecules for potential use in novel antibiotics, which are needed as antibiotic resistance spreads and the need for new drugs grows.

In a blog post and a paper published in Nature Biomedical Engineering, the IBM team said the system would help pave the way to accelerated discovery.

"[O]ur IBM Research team has developed an AI system that can help speed up the design of molecules for novel antibiotics. And it works," wrote Aleksandra Mojsilovic and Payel Das in the blog.

Noting the rise of resistance to antibiotics, the two said the threat is no joke: "It's a huge threat to human health, even more so during the raging pandemic. We need new antibiotics, and we need them fast."

AI could help provide part of a better solution.

The paper is titled Accelerating Antimicrobial Discovery with Controllable Deep Generative Models and Molecular Dynamics.

"[W]e outline how we used it to create two new non-toxic antimicrobial peptides (AMPs) with strong broad-spectrum potency. Peptides are small molecules: they are short strings of amino acids, the building blocks of proteins. Our approach outperforms other leading de novo AMP design methods by nearly 10 percent," the two scientists wrote.

The IBM scientists warned that very few new antibiotics are being developed to replace those that no longer work. That's because drug design is an extremely difficult and lengthy process: there are more possible chemical combinations of a new molecule than there are atoms in the Universe.

We want to help, they wrote.

In the paper's abstract, the research team notes progress was made in less than seven weeks:

The de novo [from the beginning] design of antimicrobial therapeutics involves the exploration of a vast chemical repertoire to find compounds with broad-spectrum potency and low toxicity. Here, we report an efficient computational method for the generation of antimicrobials with desired attributes. The method leverages guidance from classifiers trained on an informative latent space of molecules modelled using a deep generative autoencoder, and screens the generated molecules using deep-learning classifiers as well as physicochemical features derived from high-throughput molecular dynamics simulations. Within 48 days, we identified, synthesized and experimentally tested 20 candidate antimicrobial peptides, of which two displayed high potency
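
The generate-then-screen pipeline the abstract describes can be caricatured in a few lines. Everything below is a toy stand-in: the random sequence sampler stands in for the deep generative autoencoder, and the crude scoring functions stand in for the deep-learning classifiers and the molecular-dynamics-derived physicochemical filters.

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
random.seed(42)

def generate_candidate(length=12):
    # Stand-in for sampling the latent space of a generative autoencoder.
    return "".join(random.choice(AMINO_ACIDS) for _ in range(length))

def activity_score(peptide):
    # Stand-in for an activity classifier: fraction of cationic residues
    # (K, R), a crude proxy for antimicrobial peptide behaviour.
    return sum(peptide.count(aa) for aa in "KR") / len(peptide)

def passes_physchem_filter(peptide):
    # Stand-in for physicochemical screening: cap the hydrophobic fraction.
    hydrophobic = sum(peptide.count(aa) for aa in "AILMFWV") / len(peptide)
    return hydrophobic < 0.5

candidates = [generate_candidate() for _ in range(1000)]
screened = [p for p in candidates if passes_physchem_filter(p)]
# Keep the 20 highest-scoring survivors, mirroring the 20 candidates tested.
top = sorted(screened, key=activity_score, reverse=True)[:20]
print(len(top))
```

The value of the real system lies in the quality of the generator and classifiers; the loop structure itself (generate, filter, rank, synthesize the survivors) is the part sketched here.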


Cloud Computing trends: Artificial Intelligence leads Twitter mentions in February 2021 – Verdict

Verdict lists the top five terms tweeted on cloud computing in February 2021, based on data from GlobalData's Influencer Platform. The top tweeted terms are the trending industry discussions happening on Twitter by key individuals (influencers) as tracked by the platform.

The top strategic technology trends for 2021, and the role of AI-powered cloud-based solutions in supporting the Covid-19 vaccination drive and businesses, were the most popular topics discussed on Twitter in February 2021. Antonio Grasso, the founder and CEO of Digital Business Innovation Srl, an information technology company, tweeted an article on nine interdependent strategic technology trends that will support businesses in overcoming the world's existing socioeconomic challenges.

AI engineering is one such technology listed in the article; it will facilitate the execution, scalability, interpretability, and trustworthiness of AI models while delivering the full value of AI investments. The distributed cloud, which allows organisations to benefit from the public cloud without managing their own expensive and complex private cloud, is another technology trend. Privacy-enhancing computation, hyper-automation, cybersecurity mesh, the Internet of Behaviour (IoB), and total experience strategy were some of the other technology trends discussed in the article.

Ronald Van Loon, CEO of the Intelligent World, an analyst and influencer network, discussed the term, noting how AI can remove the IT challenges faced by healthcare organisations during the Covid-19 vaccination programme. Managing patient and consumer data through the ordering, shipping, and tracking of medicines and vaccines is very complicated owing to the lack of coordination among local, state, and federal government entities.

Potential threats to data security and regulatory compliance are also a major cause of concern. Therefore, various healthcare providers in the US are utilising data analytics and AI to determine who should receive the initial Covid-19 vaccines, according to the specified protocols. Cloud-based programming optimisation solutions are also being utilised to help improve the rate of vaccination, simulate demand and supply disparities, and facilitate decision-making on locations for vaccine administration.

AI was also discussed by Craig Milroy, a data capitalist, in relation to low-code and no-code platforms for the rapid design and launch of applications. These platforms are built on extended cloud-based platform-as-a-service environments to provide rapid and comprehensive solutions to business problems. Salesforce, an all-business platform; OutSystems, an AI-enhanced development platform; and the Appian Low-code Automation Platform are a few leading vendors with low- and no-code offerings.

A new cyberattack technique called dependency confusion, and ways of destroying the cybercrime economy, were the most popular topics discussed in February 2021. Ratan Jyoti, the chief information security officer at Ujjivan Small Finance Bank, shared an article published by Microsoft about dependency confusion, a new type of cyberattack technique that can compromise the application development process in a corporate setting.

A team of security researchers demonstrated that attackers discover the names of private libraries used in a company's app-building process and upload public libraries of the same name containing malicious code. During development, the package manager confuses the internal library with the malicious library hosted on the public repository and prioritises the latter. Microsoft has recommended referring to one private feed, protecting private packages using controlled scopes on public package repositories, and utilising client-side verification features to prevent such cyberattacks.
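
The precondition for the attack is a name collision: a dependency meant to resolve from a private feed also exists on a public registry. A minimal audit along those lines can be sketched as follows; all package names here are hypothetical.

```python
def confusable(deps, internal, public):
    # A dependency is at risk when it is supposed to be internal but a
    # same-named package exists publicly and the resolver may prefer the
    # public feed -- the core of dependency confusion.
    return sorted(d for d in deps if d in internal and d in public)

internal_packages = {"acme-auth", "acme-billing", "acme-logging"}
public_registry = {"requests", "numpy", "acme-billing"}  # attacker squatted "acme-billing"
project_deps = {"requests", "acme-billing", "acme-auth"}

at_risk = confusable(project_deps, internal_packages, public_registry)
print(at_risk)
```

Microsoft's recommended mitigations map directly onto this check: a single private feed removes the resolver's ambiguity, and controlled scopes (or pre-registering internal names publicly) empty the intersection.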

Malware was also discussed by Marc Wilczek, COO at the IT security provider Link11, through an article on ways to destroy the cybercrime economy. Distributed denial-of-service (DDoS) attacks have become a more common and effective method of online extortion. Companies need to protect their IT infrastructure with cloud-based services, which can ward off large-scale attacks by using AI and machine learning to spot anomalies before an attack takes hold.
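
Spotting traffic anomalies before an attack takes hold can, in its simplest form, be reduced to flagging request rates far outside a learned baseline. A deliberately tiny sketch follows; a real DDoS defence is far more sophisticated, and the traffic numbers are invented.

```python
import statistics

def is_anomalous(rate, history, threshold=3.0):
    # Flag a request rate more than `threshold` standard deviations
    # above the baseline mean.
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return (rate - mean) / stdev > threshold

baseline = [100, 110, 95, 105, 98, 102, 97, 103]  # requests/sec, normal traffic

print(is_anomalous(104, baseline), is_anomalous(5000, baseline))  # → False True
```

Cloud defences replace the fixed z-score with learned models and act on the flag automatically (rate limiting, scrubbing), but the shape of the decision is the same.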

Data security management amid rising Big Data and connected devices, Amazon Web Services' (AWS) partnership with space companies to streamline data, and the availability of the Databricks data platform across all major clouds were the most discussed topics in February. Robin Kiera, the founder of Digitalscouting, a fintech services company, shared an article on how the growth of Big Data and connected devices poses a security risk to collected data.

Cloud-based services, however, can protect this data without the need for organisations to invest in IT infrastructure. Organisations can set up extensive cloud-based security controls, hosted by a cloud provider, to secure the data collected from multiple Internet of Things (IoT) enabled devices for analysis, the article noted.

Kirk Borne, a principal data scientist at Booz Allen Hamilton, a technology and management consulting services provider, discussed Big Data with respect to the partnership between Amazon's AWS and space companies to streamline the collection, distribution and analysis of data for businesses. Companies such as Capella Space, a satellite operator, are utilising AWS cloud computing services to reduce the time required to deliver data to their customers and to serve them better.

Big Data was also discussed by Andrew Brust, the founder of Blue Badge Insights, a computer software company, in relation to Databricks, the Apache Spark-based cloud data platform, which is now available across the three major cloud platforms: AWS, Microsoft Azure and Google Cloud. Databricks can ingest large datasets from various sources and execute scalable data transformations. It simplifies Big Data by preparing data, performing analytics, and building data products.
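The prepare-analyse-publish pattern that such a platform executes at scale can be sketched in plain Python. The records, field names and cleaning rule below are illustrative assumptions, not the Databricks or Spark API; the point is the shape of the pipeline, which Databricks distributes across a cluster:

```python
# Plain-Python sketch of the prepare -> analyse -> publish pipeline pattern
# that a platform like Databricks runs at scale on Spark. Records and field
# names are illustrative assumptions, not a real API.

raw_events = [
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": "3.0"},
    {"user": "a", "amount": None},   # dirty record to be dropped
    {"user": "b", "amount": "7.0"},
]

# Prepare: clean and type the raw input.
clean = [
    {"user": r["user"], "amount": float(r["amount"])}
    for r in raw_events
    if r["amount"] is not None
]

# Analyse: aggregate spend per user.
totals = {}
for r in clean:
    totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]

# Publish: the aggregated "data product" downstream consumers read.
print(totals)  # {'a': 10.5, 'b': 10.0}
```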

The utility of IoT in safer policing, the need for a new type of IoT service provider, and IoT-driven cloud decentralisation were the popular discussion topics during the month. Giuliano Liguori, the founder of Digital Leaders, a digital transformation services company, tweeted about how police departments can be digitally transformed using IoT devices. A graphical representation by BCX, an information and communications company, shared by Liguori, highlights smarter digital solutions for safer policing.

These solutions, including mobile applications, biometrics, cloud technology, e-courts, smart corrections, and connected officers, can empower police departments and improve the safety of citizens in the long term. Cloud technology, for example, can offer a secure online portal that connects law enforcement officials, optimises processes and offers sustainable data storage, while reducing paperwork and costs.

Shane Brighton, a customer solutions architect at KCOM, an integration services provider, shared an article on how the rise in IoT deployments has changed enterprise connectivity requirements towards a more secure and resilient network designed for specific IoT applications. The article highlighted a survey conducted by Pod Group, an enterprise network operator, among technology decision-makers in UK organisations. The survey revealed that enterprises need a new type of IoT connectivity service provider to address issues related to technical support and troubleshooting, sensitive data security, and the implementation of new technologies.

IoT was also discussed by Shelly Kramer, principal analyst at Futurum Research, a digital technology advisory firm, in the context of how IoT and 5G are accelerating cloud decentralisation and contributing to the growth of distributed edge computing. IoT and Big Data are the major drivers of organisations' high-performance computing requirements.

These requirements are driving exponential growth in cloud computing and posing unprecedented challenges to data centres' technical infrastructure, including power, cooling and connectivity. Regionalised hybrid edge clouds are becoming increasingly popular in response. They bridge centralised and decentralised IT by combining private cloud with public cloud services to create an integrated, well-managed computing environment.
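One way to picture how such a hybrid arrangement bridges the two tiers is as a placement decision: latency-sensitive IoT workloads land on a nearby regional edge site, and everything else falls back to the central cloud. The site names, latencies and selection rule below are illustrative assumptions for the sketch, not any vendor's scheduler:

```python
# Toy placement rule for a regionalised hybrid edge cloud: prefer the
# nearest edge site that meets the latency budget, otherwise fall back to
# the central cloud. Sites and latencies are illustrative assumptions.

EDGE_SITES = {"eu-edge-1": 8, "us-edge-1": 95}  # round-trip ms from the device
CENTRAL_CLOUD_MS = 140

def place_workload(latency_budget_ms):
    """Pick the lowest-latency tier that still meets the latency budget."""
    viable_edges = {s: ms for s, ms in EDGE_SITES.items() if ms <= latency_budget_ms}
    if viable_edges:
        return min(viable_edges, key=viable_edges.get)
    if CENTRAL_CLOUD_MS <= latency_budget_ms:
        return "central-cloud"
    return None  # the budget cannot be met anywhere

print(place_workload(10))   # tight budget -> nearest edge site
print(place_workload(5))    # impossible budget -> None
```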

The digital transformation of healthcare using cloud computing, data storage advancements supporting the digital transformation of businesses, and how cloud computing can help overcome digital transformation challenges were the most talked about topics in February 2021. Yves Mulkers, a data strategist, shared an article on how cloud computing has influenced the digital transformation of the healthcare sector. Cloud computing has helped healthcare providers deliver high-quality, customised patient care at lower operational expense.

The on-demand availability of computing resources via the cloud reduces the need for healthcare facilities to spend on hardware and servers. Cloud computing makes patient data instantaneously available to physicians and patients from multiple sources, irrespective of location. It also applies high-powered analytics to improve medical research, while promoting interoperability with sectors such as pharmaceuticals and insurance.

Dana Gardner, the principal analyst at Interarbor Solutions, a digital market research company, discussed the term by sharing an article on IBM's introduction of new flash storage, cloud storage and hybrid-cloud storage systems. IBM's FlashSystem storage platform supports hybrid cloud and container deployment and provides high-end features to customers. It also includes IBM Storage Insights, which improves visibility into complex storage environments, and IBM Spectrum Virtualize, which virtualises and brings together multi-vendor storage technologies.

Another discussion surrounding digital transformation was shared by Craig Milroy, on overcoming the challenges associated with digital transformation using the cloud. The adoption of the work-from-anywhere (WFA) model across organisations during the pandemic accelerated the use of cloud technologies and improved collaboration and information sharing across data platforms and digital ecosystems. Cloud technologies will help with business application integration and innovation, data modernisation, and social impact driven by AI technologies, the article highlighted.

Continued here:
Cloud Computing trends: Artificial Intelligence leads Twitter mentions in February 2021 - Verdict