Archive for the ‘Machine Learning’ Category

Applications of Machine Learning in the Life Sciences Industry – GlobeNewswire

Dublin, Jan. 10, 2020 (GLOBE NEWSWIRE) -- The "Machine Learning in the Life Sciences" report has been added to ResearchAndMarkets.com's offering.

Artificial intelligence (AI) is a term used to identify a scientific field that covers the creation of machines (e.g., robots) as well as computer hardware and software aimed at reproducing wholly or in part the intelligent behavior of human beings. AI is considered a branch of cognitive computing, a term that refers to systems able to learn, reason, and interact with humans. Cognitive computing is a combination of computer science and cognitive science.

Artificial intelligence covers various aspects of human behavior including creativity, planning and scheduling, reasoning, imaging, writing, learning, auditing, and natural language processing. The concept of artificial intelligence, however, is in continuous evolution. In fact, once the use of machines with specific smart features becomes widespread, new systems with even more advanced capabilities are developed. By enhancing equipment functionality and productivity, AI is revolutionizing virtually every sector, from research and development to manufacturing and services.

The Report Includes:

Key Topics Covered:

Technology Highlights and Market Outlook

List of Tables
Table 1: Applications of Machine Learning in the Life Sciences, by Field
Table 2: Global Market for the Applications of Machine Learning in the Quantum Computing, by Country/Region, Through 2024
Table 3: Current and Emerging Trends in the Applications of Machine Learning in the Life Sciences, by Field
Table 4: Global Market for the Applications of Machine Learning in the Life Sciences, by Country/Region, Through 2024

List of Figures
Figure 1: Global Market Shares for the Applications of Machine Learning in the Quantum Computing, by Country/Region, 2024
Figure 2: Global Market Shares for the Applications of Machine Learning in the Life Sciences, by Country/Region, 2024

For more information about this report visit https://www.researchandmarkets.com/r/wd6nbg

About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Originally posted here:
Applications of Machine Learning in the Life Sciences Industry - GlobeNewswire

The 4 Hottest Trends in Data Science for 2020 – Machine Learning Times – machine learning & data science news – The Predictive Analytics Times

Originally published in Towards Data Science, January 8, 2020

2019 was a big year for all of Data Science.

Companies all over the world across a wide variety of industries have been going through what people are calling a digital transformation. That is, businesses are taking traditional business processes such as hiring, marketing, pricing, and strategy, and using digital technologies to make them 10 times better.

Data Science has become an integral part of those transformations. With Data Science, organizations no longer have to make their important decisions based on hunches, best guesses, or small surveys. Instead, they're analyzing large amounts of real data to base their decisions on real, data-driven facts. That's really what Data Science is all about: creating value through data.

This trend of integrating data into core business processes has grown significantly, with interest increasing more than fourfold over the past five years according to Google Search Trends. Data is giving companies a sharp advantage over their competitors. With more data and better Data Scientists to use it, companies can acquire information about the market that their competitors might not even know exists. It's become a game of data or perish.

Google search popularity of Data Science over the past 5 years. Generated by Google Trends.

In today's ever-evolving digital world, staying ahead of the competition requires constant innovation. Patents have gone out of style, while Agile methodology and catching new trends quickly are very much in.

Organizations can no longer rely on their rock-solid methods of old. If a new trend like Data Science, Artificial Intelligence, or Blockchain comes along, it needs to be anticipated and adapted to quickly.

The following are the 4 hottest Data Science trends for the year 2020. These are trends which have gathered increasing interest this year and will continue to grow in 2020.

(1) Automated Data Science

Even in today's digital age, Data Science still requires a lot of manual work: storing data, cleaning data, visualizing and exploring data, and finally, modeling data to get some actual results. That manual work is just begging for automation; hence the rise of automated Data Science and Machine Learning.

Nearly every step of the Data Science pipeline has been or is in the process of becoming automated.

Auto-Data Cleaning has been heavily researched over the past few years. Cleaning big data often takes up most of a Data Scientist's expensive time. Both startups and large companies such as IBM offer automation and tooling for data cleaning.
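To make the idea concrete, here is a minimal sketch (not tied to any particular vendor's tooling) of the kind of repetitive cleaning steps these products automate; the column handling and thresholds are hypothetical choices:

```python
import numpy as np
import pandas as pd

def auto_clean(df: pd.DataFrame) -> pd.DataFrame:
    """A toy 'auto-cleaning' pass: the repetitive work commercial
    data-prep tools try to take off the Data Scientist's plate."""
    df = df.drop_duplicates()                           # remove exact duplicate rows
    df = df.dropna(axis=1, thresh=int(0.5 * len(df)))   # drop columns that are mostly empty
    for col in df.select_dtypes(include=np.number).columns:
        df[col] = df[col].fillna(df[col].median())      # impute numeric gaps with the median
    for col in df.select_dtypes(include="object").columns:
        df[col] = df[col].str.strip().str.lower()       # normalize text casing and whitespace
    return df

# Hypothetical usage:
# cleaned = auto_clean(pd.read_csv("transactions.csv"))
```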

Another large part of Data Science known as feature engineering has undergone significant disruption. Featuretools offers a solution for automatic feature engineering. On top of that, modern Deep Learning techniques such as Convolutional and Recurrent Neural Networks learn their own features without the need for manual feature design.
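To show what automated feature engineering produces, the sketch below derives per-customer aggregate features from a made-up transactions table by hand with pandas. This is only an illustration of the idea, not the Featuretools API; tools like Featuretools generate this kind of feature set automatically from the relational structure of the data:

```python
import pandas as pd

# Hypothetical raw transactions: one row per purchase.
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2],
    "amount":      [20.0, 35.5, 5.0, 12.0, 7.5],
    "timestamp":   pd.to_datetime(
        ["2019-01-03", "2019-02-10", "2019-01-20", "2019-03-01", "2019-03-15"]),
})

# Per-entity aggregates of the sort automated feature engineering derives
# without any manual feature design:
features = transactions.groupby("customer_id").agg(
    total_spend=("amount", "sum"),
    mean_spend=("amount", "mean"),
    num_purchases=("amount", "count"),
    days_active=("timestamp", lambda s: (s.max() - s.min()).days),
)
print(features)
```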

Perhaps the most significant automation is occurring in the Machine Learning space. Both DataRobot and H2O have established themselves in the industry by offering end-to-end Machine Learning platforms, giving Data Scientists a very easy handle on data management and model building. AutoML, a method for automatic model design and training, has also boomed over 2019 as these automated models surpass the state-of-the-art. Google, in particular, is investing heavily in Cloud AutoML.
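As a rough sketch of what an end-to-end AutoML run looks like in practice, here is the general shape of a job using H2O's open-source AutoML. The file name and column names are hypothetical, and available options vary by version:

```python
import h2o
from h2o.automl import H2OAutoML

h2o.init()  # start (or connect to) a local H2O cluster

# Hypothetical training data with a binary target column named "churned".
train = h2o.import_file("customers_train.csv")
predictors = [c for c in train.columns if c != "churned"]
train["churned"] = train["churned"].asfactor()  # treat the target as categorical

# AutoML searches over model families and hyperparameters automatically.
aml = H2OAutoML(max_models=10, seed=42)
aml.train(x=predictors, y="churned", training_frame=train)

print(aml.leaderboard.head())  # candidate models ranked by cross-validated performance
print(aml.leader)              # the best model found
```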

In general, companies are investing heavily in building and buying tools and services for automated Data Science. Anything to make the process cheaper and easier. At the same time, this automation also caters to smaller and less technical organizations who can leverage these tools and services to have access to Data Science without building out their own team.

(2) Data Privacy and Security

Privacy and security are always sensitive topics in technology. All companies want to move fast and innovate, but losing the trust of their customers over privacy or security issues can be fatal. So they're forced to make it a priority, at least to a bare minimum of not leaking private data.

Data privacy and security has become an incredibly hot topic over the past year as the issues are magnified by enormous public hacks. Just recently, on November 22, 2019, an exposed server with no security was discovered on Google Cloud. The server contained the personal information of 1.2 billion unique people, including names, email addresses, phone numbers, and LinkedIn and Facebook profile information. Even the FBI came in to investigate. It's one of the largest data exposures of all time.


Read the rest here:
The 4 Hottest Trends in Data Science for 2020 - Machine Learning Times - machine learning & data science news - The Predictive Analytics Times

Technology Trends to Keep an Eye on in 2020 – Built In Chicago

Artificial intelligence and machine learning, with an eye toward task automation.

For Senior Data Scientist James Buban at iHerb, those are just a couple of the tech trends he'll be watching in 2020.

As companies enter a new decade, it's important for their leaders to anticipate how the latest tech trends will evolve in order to determine how they can benefit their businesses and their customers. 20spokes CEO Ryan Fischer said his company uses machine learning "to provide a better user experience for our clients' customers" by leveraging data on individual user behavior.

We asked Buban, Fischer and other local tech execs which trends they're watching this year and how they'll be utilizing them to enhance their businesses. From natural language processing to computer vision, these are the trends that will be shaping tech in 2020.

As a development agency, 20spokes specializes in helping startups plan, build and scale innovative products. CEO Ryan Fischer said he is looking to AI and machine learning to design better chatbots and wrangle large data sets.

What are the top tech trends you're watching in 2020? What impact do you think these trends will have on your industry in particular?

In 2020, we expect AI to play an even bigger role for our clients. When we talk about AI, we are really discussing machine learning: using data to train a model to recognize patterns and draw inferences.

Working with machine learning continues to get easier with many large providers working on simpler implementations, and we expect the barrier to entry to continue to lower in 2020. We also have more user data which allows us to use machine learning to design more tailored and intelligent experiences for users.

We are using machine learning to improve chatbots to create more dynamic dialogue.

How are you applying these trends in your work in the year ahead?

At 20spokes, we use machine learning to provide a better user experience for our clients' customers by leveraging data on individual user behavior to make more accurate recommendations and suggestions. We're continuing to look at how we can apply it to different sets of data, from providing better insights of reports for large data sets to sending us real-time updates based on trained patterns. We are also using machine learning to improve chatbots to create more dynamic dialogue.
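As a toy illustration of behavior-based recommendations (not 20spokes' actual approach; the interaction matrix and the similarity method are purely hypothetical), an item-based collaborative filter can be sketched in a few lines:

```python
import numpy as np

# Hypothetical implicit-feedback matrix: rows = users, columns = items,
# 1 means the user interacted with the item (click, purchase, etc.).
interactions = np.array([
    [1, 0, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

# Item-item cosine similarity computed from co-occurring user behavior.
norms = np.linalg.norm(interactions, axis=0, keepdims=True)
norms[norms == 0] = 1.0
item_sim = (interactions.T @ interactions) / (norms.T @ norms)

def recommend(user_idx, top_n=2):
    """Score unseen items by their similarity to the items this user already used."""
    seen = interactions[user_idx]
    scores = item_sim @ seen          # aggregate similarity to the user's history
    scores[seen > 0] = -np.inf        # never re-recommend items already seen
    return np.argsort(scores)[::-1][:top_n].tolist()

print(recommend(user_idx=0))
```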

In order to deliver trusted insights on consumer packaged goods, Label Insight Senior Data Scientist James Buban said they first have to process large amounts of data. Using machine learning and automation, data collection processes can be finished more quickly and accurately for customers.

What are the top tech trends you're watching in 2020?

The top tech trends that we'll be watching in 2020 are artificial intelligence and machine learning, with an eye toward task automation. In particular, we are interested in advancements in computer vision, such as object detection and recognition. We are also interested in natural language processing, such as entity tagging and text classification. In general, we believe that machine learning automation will play a big role in both the data collection industry and in e-commerce, particularly in the relatively new addition of the food industry to the retail space.

We plan to use computer vision and natural language processing to automate tasks throughout 2020.

How are you applying these trends in your work in the year ahead?

At Label Insight, we are building up a large database of attributes for consumables based on package information. To do so, we first need to collect all package data, which has traditionally been accomplished through a team of dedicated data entry clerks. Due to the huge volume of products that need to be added to our database, this data entry process is expensive, tedious and time-consuming.

Therefore, we plan to use computer vision and natural language processing to begin automating these tasks throughout 2020. We are also planning to use this technology to make our e-commerce solutions more scalable.
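As a rough illustration of the entity-tagging piece (an off-the-shelf model and made-up package text, not Label Insight's actual pipeline), a general-purpose NLP library can pull structured entities out of free text on a label:

```python
import spacy

# Assumes the small English model has been installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# Hypothetical text scraped from a product package.
package_text = (
    "Contains 500 mg of vitamin C per serving. "
    "Manufactured by Acme Foods in Chicago."
)

doc = nlp(package_text)
for ent in doc.ents:
    # Entity tagging: the span of text and its predicted label (ORG, GPE, QUANTITY, ...)
    print(ent.text, ent.label_)
```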

Continue reading here:
Technology Trends to Keep an Eye on in 2020 - Built In Chicago

Going Beyond Machine Learning To Machine Reasoning – Forbes

From Machine Learning to Machine Reasoning

The conversation around Artificial Intelligence usually revolves around technology-focused topics: machine learning, conversational interfaces, autonomous agents, and other aspects of data science, math, and implementation. However, the history and evolution of AI is more than just a technology story. The story of AI is also inextricably linked with waves of innovation and research breakthroughs that run headfirst into economic and technology roadblocks. There seems to be a continuous pattern of discovery, innovation, interest, investment, cautious optimism, boundless enthusiasm, realization of limitations, technological roadblocks, withdrawal of interest, and retreat of AI research back to academic settings. These waves of advance and retreat seem to be as consistent as the back and forth of sea waves on the shore.

This pattern of interest, investment, hype, then decline, and rinse-and-repeat is particularly vexing to technologists and investors because it doesn't follow the usual technology adoption lifecycle. Popularized by Geoffrey Moore in his book "Crossing the Chasm", technology adoption usually follows a well-defined path. Technology is developed and finds early interest by innovators, and then early adopters, and if the technology can make the leap across the "chasm", it gets adopted by the early majority market and then it's off to the races with demand by the late majority and finally technology laggards. If the technology can't cross the chasm, then it ends up in the dustbin of history. However, what makes AI distinct is that it doesn't fit the technology adoption lifecycle pattern.

But AI isn't a discrete technology. Rather, it's a series of technologies, concepts, and approaches all aligning towards the quest for the intelligent machine. This quest inspires academicians and researchers to come up with theories of how the brain and intelligence work, and their concepts of how to mimic these aspects with technology. AI is a generator of technologies, which individually go through the technology lifecycle. Investors aren't investing in "AI"; rather, they're investing in the output of AI research and technologies that can help achieve the goals of AI. As researchers discover new insights that help them surmount previous challenges, or as technology infrastructure finally catches up with concepts that were previously infeasible, new technology implementations are spawned and the cycle of investment renews.

The Need for Understanding

It's clear that intelligence is like an onion (or a parfait): many layers. Once we understand one layer, we find that it only explains a limited amount of what intelligence is about. We discover there's another layer that's not quite understood, and back to our research institutions we go to figure out how it works. In Cognilytica's exploration of the intelligence of voice assistants, the benchmark aims to tease at one of those next layers: understanding. That is, knowing what something is (recognizing an image among a category of trained concepts, converting audio waveforms into words, identifying patterns among a collection of data, or even playing games at advanced levels) is different from actually understanding what those things are. This lack of understanding is why users get hilarious responses to voice assistant questions, and is also why we can't truly get autonomous machine capabilities in a wide range of situations. Without understanding, there's no common sense. Without common sense and understanding, machine learning is just a bunch of learned patterns that can't adapt to the constantly evolving changes of the real world.

One of the visual concepts that's helpful for understanding these layers of increasing value is the "DIKUW Pyramid":

DIKUW Pyramid

While the Wikipedia entry above conveniently skips the Understanding step, we believe that understanding is the next logical threshold of AI capability. And like all previous layers of this AI onion, tackling this layer will require new research breakthroughs, dramatic increases in compute capabilities, and volumes of data. What? Don't we have almost limitless data and boundless computing power? Not quite. Read on.

The Quest for Common Sense: Machine Reasoning

Early in the development of artificial intelligence, researchers realized that for machines to successfully navigate the real world, they would have to gain an understanding of how the world works and how various different things are related to each other. In 1984, the world's longest-lived AI project started. The Cyc project is focused on generating a comprehensive "ontology" and knowledge base of common sense, basic concepts and "rules of thumb" about how the world works. The Cyc ontology uses a knowledge graph to structure how different concepts are related to each other, and an inference engine that allows systems to reason about facts.
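To make the pairing of a knowledge graph with an inference engine concrete, here is a toy sketch, in no way representative of Cyc's actual ontology or reasoning machinery. It stores facts as triples and forward-chains a single hypothetical rule until no new facts can be derived:

```python
# Toy knowledge base: (subject, relation, object) triples.
facts = {
    ("rain", "is_a", "weather"),
    ("weather", "affects", "outdoor_events"),
    ("thirst", "is_a", "bodily_state"),
    ("drinking_water", "relieves", "thirst"),
}

def infer(facts):
    """Tiny forward-chaining inference engine: if X is_a Y and Y affects Z,
    conclude X affects Z. Repeat until no new facts appear."""
    known = set(facts)
    while True:
        new = {
            (x, "affects", z)
            for (x, r1, y1) in known if r1 == "is_a"
            for (y2, r2, z) in known if r2 == "affects" and y1 == y2
        }
        if new <= known:
            return known
        known |= new

inferred = infer(facts)
# True: this fact was never stated, only derived from the graph.
print(("rain", "affects", "outdoor_events") in inferred)
```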

The main idea behind Cyc and other understanding-building knowledge encodings is the realization that systems can't be truly intelligent if they don't understand what the underlying things they are recognizing or classifying are. This means we have to dig deeper than machine learning for intelligence. We need to peel this onion one level deeper, scoop out another tasty parfait layer. We need more than machine learning - we need machine reasoning.

Machine reasoning is the concept of giving machines the power to make connections between facts, observations, and all the magical things that we can train machines to do with machine learning. Machine learning has enabled a wide range of capabilities and functionality and opened up a world of possibility that was not possible without the ability to train machines to identify and recognize patterns in data. However, this power is crippled by the fact that these systems are not really able to functionally use that information for higher ends, or apply learning from one domain to another without human involvement. Even transfer learning is limited in application.

Indeed, we're rapidly facing the reality that we're going to soon hit the wall on the current edge of capabilities with machine learning-focused AI. To get to that next level we need to break through this wall and shift from machine learning-centric AI to machine reasoning-centric AI. However, that's going to require some breakthroughs in research that we haven't realized yet.

The fact that the Cyc project has the distinction of being the longest-lived AI project is a bit of a back-handed compliment. The Cyc project is long-lived because, after all these decades, the quest for common sense knowledge is proving elusive. Codifying common sense into a machine-processable form is a tremendous challenge. Not only do you need to encode the entities themselves in a way that a machine knows what you're talking about, but also all the inter-relationships between those entities. There are millions, if not billions, of "things" that a machine needs to know. Some of these things are tangible, like "rain", but others are intangible, such as "thirst". The work of encoding these relationships is being partially automated, but it still requires humans to verify the accuracy of the connections... because after all, if machines could do this, we would have solved the machine recognition challenge. It's a bit of a chicken-and-egg problem: you can't solve machine recognition without having some way to codify the relationships between information, but you can't scalably codify all the relationships that machines would need to know without some form of automation.

Are we still limited by data and compute power?

Machine learning has proven to be very data-hungry and compute-intensive. Over the past decade, many iterative enhancements have lessened compute load and helped to make data use more efficient. GPUs, TPUs, and emerging FPGAs are helping to provide the raw compute horsepower needed. Yet, despite these advancements, complicated machine learning models with lots of dimensions and parameters still require intense amounts of compute and data. Machine reasoning is easily one order or more of complexity beyond machine learning. Accomplishing the task of reasoning out the complicated relationships between things and truly understanding these things might be beyond today's compute and data resources.

The current wave of interest and investment in AI doesn't show any signs of slowing or stopping any time soon, but it's inevitable it will slow at some point for one simple reason: we still don't understand intelligence and how it works. Despite the amazing work of researchers and technologists, we're still guessing in the dark about the mysterious nature of cognition, intelligence, and consciousness. At some point we will be faced with the limitations of our assumptions and implementations and we'll work to peel the onion one more layer and tackle the next set of challenges. Machine reasoning is quickly approaching as the next challenge we must surmount on the quest for artificial intelligence. If we can apply our research and investment talent to tackling this next layer, we can keep the momentum going with AI research and investment. If not, the pattern of AI will repeat itself, and the current wave will crest. It might not be now or even within the next few years, but the ebb and flow of AI is as inevitable as the waves upon the shore.

See the original post:
Going Beyond Machine Learning To Machine Reasoning - Forbes

AI and machine learning trends to look toward in 2020 – Healthcare IT News

Artificial intelligence and machine learning will play an even bigger role in healthcare in 2020 than they did in 2019, helping medical professionals with everything from oncology screenings to note-taking.

On top of actual deployments, increased investment activity is also expected this year, and with deeper deployments of AI and ML technology, a broader base of test cases will be available to collect valuable best practices information.

"As AI is implemented more widely in real-world clinical practice, there will be more academic reports on the clinical benefits that have arisen from the real-world use," said Pete Durlach, senior vice president for healthcare strategy and new business development at Nuance.

"With healthy clinical evidence, we'll see AI become more mainstream in various clinical settings, creating a positive feedback loop of more evidence-based research and use in the field," he explained. "Soon, it will be hard to imagine a doctor's visit, or a hospital stay that doesn't incorporate AI in numerous ways."

In addition, AI and ambient sensing technology will help re-humanize medicine by allowing doctors to focus less on paperwork and administrative functions, and more on patient care.

"As AI becomes more commonplace in the exam room, everything will be voice enabled, people will get used to talking to everything, and doctors will be able to spend 100% of their time focused on the patient, rather than entering data into machines," Durlach predicted. "We will see the exam room of the future where clinical documentation writes itself."

The adoption of AI for robotic process automation ("RPA") for common and high-value administrative functions such as the revenue cycle, supply chain and patient scheduling also has the potential to increase rapidly, as AI helps automate or partially automate components of these functions, driving significantly enhanced financial outcomes for provider organizations.

Durlach also noted that the fear that AI will replace doctors and clinicians has dissipated, and the goal now is to figure out how to incorporate AI as another tool to help physicians make the best care decisions possible, effectively augmenting the intelligence of the clinician.

"However, we will still need to protect against phenomena like alert fatigue, which occurs when users who are faced with many low-level alerts ignore alerts of all levels, thereby missing crucial ones that can affect the health and safety of patients," he cautioned.

In the next few years, he predicts the market will see technology that strikes a balance between staying unobtrusive and supporting doctors in making the best decisions for their patients, as they learn to trust AI-powered suggestions and recommendations.

"So many technologies claim they have an AI component, but often there's a blurred line in which the term AI is used in a broad sense, when the technology that's being described is actually basic analytics or machine learning," Kuldeep Singh Rajput, CEO and founder of Boston-based Biofourmis, told Healthcare IT News. "Health system leaders looking to make investments in AI should ask for real-world examples of how the technology is creating ROI for other organizations."

For example, he pointed to a study of Brigham & Women's Home Hospital program, recently published in Annals of Internal Medicine, which employed AI-driven continuous monitoring combined with advanced physiology analytics and related clinical care as a substitute for usual hospital care.

The study found that the program--which included an investment in AI-driven predictive analytics as a key component--reduced costs, decreased healthcare use, and lowered readmissions while increasing physical activity compared with usual hospital care.

"Those types of outcomes could be replicated by other healthcare organizations, which makes a strong clinical and financial case to invest in that type of AI," Rajput said.

Nathan Eddy is a healthcare and technology freelancer based in Berlin.
Email the writer: nathaneddy@gmail.com
Twitter: @dropdeaded209

Go here to read the rest:
AI and machine learning trends to look toward in 2020 - Healthcare IT News