Archive for the ‘Machine Learning’ Category

Putting AI and Machine Learning to Work in Cloud-Based BI and Analytics – AiThority

Machine learning (ML) in the cloud is powering a whole new generation of intelligent, predictive cloud analytics solutions like Azure Databricks and Azure Synapse. The benefits of cloud economics, tooling and flexibility, along with next-level insights that drive real-time business decisions, are the primary drivers behind the growing trend of migrating on-premises data lakes to the cloud.

Cloud analytics services like Synapse are designed to collect and analyze current, actionable data, delivering insights into processes and workflows that can impact business operations. But what if you need those insights immediately, in the hands of employees and experts working simultaneously across the globe, in real time and always accurate and up to date? IT stakeholders are turning to the cloud for faster, more accurate and timelier business insights, especially in the face of Covid-19, when companies are looking to operate as economically as possible and millions are forced into remote working.

Even before the pandemic, a 2019 survey by TechTarget found that 27% of respondents planned to deploy cloud analytics in 2020. The same study pointed to increased adoption of cloud technology as the number-two activity companies are pursuing to improve employee experience and productivity, and noted that 38% of companies planned to bolster their cloud technology in 2020. In speaking with the experts at AWS and Azure, that number is higher today. Hindsight is also 20/20!

There are multiple reasons that organizations are moving their data lakes and analytics capabilities to the cloud. First among them is cost: the move streamlines the workforce, so even though there are start-up costs involved in the migration, the long-term cost-benefit analysis plays out in their favor. Companies can also run faster and lighter with cloud analytics, with no need to run dedicated client-side applications and with IT teams freed from coordinating upgrades across an entire infrastructure. In our experience across our customer base at WANdisco, and in working with CSPs like Azure and AWS, we have found that, on average, the total cost of ownership to manage a 1PB Hadoop data lake on premises over a three-year period is $2M. Managing that same 1PB in AWS S3 or Azure ADLS Gen 2 storage costs $900,000 over three years.
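Spelled out as arithmetic, the quoted three-year figures imply the following savings (a rough illustration of the numbers above, ignoring migration and egress costs):

```python
# The three-year TCO figures quoted above, expressed as arithmetic:
# $2M on-premises vs $900K in cloud object storage for the same 1PB.

ONPREM_3YR_USD = 2_000_000
CLOUD_3YR_USD = 900_000

savings = ONPREM_3YR_USD - CLOUD_3YR_USD
savings_pct = 100 * savings / ONPREM_3YR_USD

print(savings)             # 1100000 dollars saved over three years
print(round(savings_pct))  # 55 percent cheaper in the cloud
```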

The question is how to migrate that 1PB data lake as rapidly as possible (time to value), with zero downtime, while ensuring the data stays consistent on premises and in the cloud during migration, since business-critical data is always changing. Architects and data teams have two choices.

They can use various flavors of open-source DistCp tools and scripts, the manual approach to a data lake migration. Don't be fooled by fancy names from the Hadoop or cloud vendors: it's all DistCp under the covers. What's wrong with this approach? It's an IT project. And like most IT projects, 61% of them either fail or suffer cost and SLA overruns. Here's what you have to do in this scenario:

How long can this take? We have seen teams struggle for months and even years, depending on data volume and business requirements around acceptable application downtime, data availability and data consistency. We've seen companies put 8-10 people on a project, fail after 6 months, then pay $1M to a systems integrator and fail again after another 9 months. OUCH.

There is a better way. And forward-looking companies like AMD, Daimler, and many others have figured it out. How?

By leveraging modern technology to automate data lake migration and replication to the cloud with WANdisco LiveData Cloud Services through its patented Distributed Coordination Engine platform.

This innovation is founded on fundamental IP based around forming consensus in a distributed network. This is an extremely hard problem to solve, and to this day some people believe it cannot be solved. So what is the problem, at a high level? If you have a network of nodes distributed across the world, with little to no knowledge of the distance and bandwidth between them, how can you get the nodes to coordinate with each other without worrying about failure scenarios?

The solution is the application of a consensus algorithm, and the gold standard in consensus is an algorithm called Paxos. Our Chief Scientist, Dr. Yeturu Aahlad, an expert in distributed systems, devised the first, and still the only, commercialised version of Paxos. In doing so, he solved a problem that had puzzled computer scientists for years.
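At a high level, the coordination problem Paxos solves can be sketched in a few dozen lines. The toy single-decree round below (one proposer, three in-process acceptors, no real network and no failures simulated) illustrates the algorithm's two phases; it is a teaching sketch, not WANdisco's implementation:

```python
# Toy single-decree Paxos: prepare/promise (phase 1) then accept (phase 2).
# All "nodes" live in one process; names and structure are illustrative.

class Acceptor:
    def __init__(self):
        self.promised = -1      # highest proposal number promised
        self.accepted_n = -1    # proposal number of any accepted value
        self.accepted_v = None  # accepted value, if any

    def prepare(self, n):
        # Phase 1b: promise not to accept proposals numbered below n
        if n > self.promised:
            self.promised = n
            return True, self.accepted_n, self.accepted_v
        return False, None, None

    def accept(self, n, v):
        # Phase 2b: accept unless a higher-numbered promise was made
        if n >= self.promised:
            self.promised, self.accepted_n, self.accepted_v = n, n, v
            return True
        return False


def propose(acceptors, n, value):
    # Phase 1a: send prepare(n) to all acceptors; need a majority of promises
    promises = [a.prepare(n) for a in acceptors]
    granted = [(an, av) for ok, an, av in promises if ok]
    if len(granted) <= len(acceptors) // 2:
        return None  # no quorum; a real proposer retries with a higher n
    # If any acceptor already accepted a value, we must re-propose that value
    prev_n, prev_v = max(granted, key=lambda p: p[0])
    chosen = prev_v if prev_n >= 0 else value
    # Phase 2a: ask the acceptors to accept (n, chosen)
    acks = sum(a.accept(n, chosen) for a in acceptors)
    return chosen if acks > len(acceptors) // 2 else None


acceptors = [Acceptor() for _ in range(3)]
print(propose(acceptors, 1, "migrate-batch-42"))  # consensus on this value
print(propose(acceptors, 2, "something-else"))    # earlier value survives
```

The key safety property is visible in the second call: once a value has been chosen, later proposers are forced to re-propose it rather than overwrite it, which is what keeps replicas consistent.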

WANdisco's LiveData Cloud Services are built on this core IP, including our products focused on analytical data and the challenge of migrating that data to the cloud while keeping it consistent across multiple locations.

As businesses demand data in increasingly decentralized environments, the old mechanisms for providing and managing data are no longer sufficient. Moreover, the amount of data is rising exponentially, which leads to a phenomenon called data gravity: the greater the volume of data, the harder it is to provide it in a distributed environment, allow changes in any location, and ensure it remains consistent everywhere. Regulation and compliance requirements make it even more challenging for data managers to fulfil businesses' needs.

As enterprises look to leverage the scale and economics of the cloud, WANdisco offers a fundamentally different approach to managing these large volumes of data, accelerating enterprises' ability to undergo digital transformation.

Here's what Merv Adrian, Research VP of Data and Analytics at Gartner, had to say: "WANdisco's ability to move petabytes of data without interrupting production and without risk of losing the data mid-flight is something no other vendor does and, until now, has been virtually impossible to accomplish."

The Bottom Line

Cloud computing has completely transformed entire industries, computing paradigms and enterprises, and has become the ideal choice for storing and accessing big data sets. The Covid-19 pandemic has only accelerated this move, given the need to operate as economically as possible with more employees working remotely. Cloud computing saves both money and time, which makes it immediately attractive to businesses, while also increasing access for global companies, providing a synergistic platform for coordination and cooperation between far-flung employees. 85% of the Fortune 500 have moved to the cloud and continue to do so. The migration of static data has been easy. The challenge now is how to quickly migrate and replicate large on-premises data lakes and applications to the cloud when the data is business critical and application downtime, data loss and inconsistencies cannot be tolerated. The good news is that there is now a better way: automated migration and replication that delivers 10X faster time to value and is 100% safer, while ensuring zero downtime during migration.


Read the original here:
Putting AI and Machine Learning to Work in Cloud-Based BI and Analytics - AiThority

IDTechEx Report Suggests Machine Learning will be Accessible across Chemical and Materials Companies in the Future – CIO Applications

Material Informatics (MI) is a data-centric approach applicable to materials science and chemistry R&D. Without a doubt, it will become a standard method in the research scientist's toolkit.

FREMONT, CA: Machine learning has rapidly become an essential part of every industry. Material scientists and chemists will all have access to machine learning tools to enhance their Research & Development in the future. Seamlessly integrating these underlying operations will not happen quickly, but overlooking the developments in materials informatics will lead to a loss of competitive advantage.

Rather than just grabbing headlines, some form of MI will come to be assumed in all developments. The key to MI is the integration, implementation, and manipulation of data infrastructures, together with machine learning approaches designed for chemical and materials datasets.

There is significant evidence to support this, but the best indication is how the industries are responding to the technology. Recent years have seen a large amount of activity, including partnerships, investments, and announcements from some of the most notable chemical and materials companies.

Machine learning by itself can be used in many kinds of projects, from finding new structure-property relationships and proposing new candidates or process conditions to reducing the number of expensive and time-consuming computer simulations. Machine learning approaches take numerous forms, spanning supervised and unsupervised methods. Generative methods can be effective at screening for optimized outputs across organic compounds, while even simple modified random forest models can be useful for proposing follow-on reactions that meet a desired set of criteria.
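As a schematic of how such models reduce expensive simulations, the sketch below ranks invented candidates with a stand-in linear surrogate so that only the top few proceed to a costly simulation or synthesis step. The candidate names, descriptors, and weights are illustrative assumptions, not data from the report:

```python
# Schematic of ML-guided candidate screening for materials R&D: a cheap
# surrogate model ranks candidates so only the best-scoring few are sent
# on to expensive simulation. All values below are invented.

# Each candidate: (name, descriptor vector), e.g. simple composition features
candidates = [
    ("cand-A", [0.2, 1.5, 0.9]),
    ("cand-B", [0.8, 0.3, 1.1]),
    ("cand-C", [0.5, 1.2, 0.4]),
    ("cand-D", [0.9, 1.4, 1.0]),
]

# A trained surrogate (e.g. a random forest regressor) would supply these;
# here they are fixed to keep the sketch self-contained
weights = [1.0, 2.0, -0.5]

def surrogate_score(descriptors):
    # Linear stand-in for a fitted property-prediction model
    return sum(w * d for w, d in zip(weights, descriptors))

# Rank all candidates cheaply, then "simulate" only the top 2
ranked = sorted(candidates, key=lambda c: surrogate_score(c[1]), reverse=True)
shortlist = [name for name, _ in ranked[:2]]
print(shortlist)  # ['cand-D', 'cand-A']
```

The design point is the funnel: the surrogate is cheap enough to score thousands of candidates, so the expensive step runs only on a shortlist.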

However, the field is still at an early stage and requires much more development. There is a lot to be leveraged from existing developments in AI, but doing so will first require integrating specialist domain knowledge and coping with the unique challenges of materials datasets. The application space is broad, and studies have shown success with organometallics, thermoelectrics, nanomaterials, ceramics, and many more.

Original post:
IDTechEx Report Suggests Machine Learning will be Accessible across Chemical and Materials Companies in the Future - CIO Applications

inPowered Selected by ANA as Winner of ‘Best Use of AI/Machine Learning’ Category at 2020 B2 Awards – Yahoo Finance

Content Marketing Has an ROI Problem & AI Can Fix That

SAN FRANCISCO, July 28, 2020 /PRNewswire/ -- inPowered, the AI platform delivering business outcomes with content marketing, was awarded top honors in the "Best Use of AI/Machine Learning" category at the 2020 Association of National Advertisers' B2 Awards. This marks the first time inPowered has received this accolade from the ANA, one of the most highly regarded organizations in the advertising and marketing space.

The entry, titled "Content Marketing Has an ROI Problem & AI Can Solve That," discussed the pain points around measurement and ROI that continue to frustrate marketers. inPowered has challenged the industry standard of evaluating success based on CPC or CPM by inventing a new content economy for measuring KPIs, one that concentrates on consumer engagement rather than clicks and impressions. Powered by an artificial intelligence (AI) engine, inPowered's proprietary technology optimizes not for clicks but for interactions lasting a minimum of 15 seconds with each piece of content. This focus on authentically engaged users allows the data the technology collects to guide consumers toward post-click engagement and next-action business outcomes, resulting in a digital funnel optimized for achieving real results and establishing concrete key performance indicators at the lowest cost per engagement.
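As a rough illustration of the engagement-based KPI described above, the sketch below counts only visits lasting at least 15 seconds and divides spend by that count instead of by raw clicks. The visit data, spend figure, and field names are invented, not inPowered's actual measurement pipeline:

```python
# Toy engagement-based KPI: only visits dwelling >= 15 seconds count as
# engagements; spend is divided by engagements rather than by clicks.

ENGAGEMENT_THRESHOLD_S = 15

visits = [
    {"dwell_s": 3},    # bounce: counted by CPC, ignored here
    {"dwell_s": 42},
    {"dwell_s": 17},
    {"dwell_s": 9},    # another bounce
    {"dwell_s": 120},
]
spend = 12.0  # dollars spent driving these five clicks

engagements = sum(1 for v in visits if v["dwell_s"] >= ENGAGEMENT_THRESHOLD_S)
cost_per_click = spend / len(visits)
cost_per_engagement = spend / engagements if engagements else float("inf")

print(engagements)          # 3
print(cost_per_click)       # 2.4
print(cost_per_engagement)  # 4.0
```

The point of the metric shift is visible in the gap between the two numbers: a campaign can look cheap per click while being expensive per genuinely engaged visitor.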

"Since inception our mission has been to deliver real business outcomes with content marketing, as opposed to the vanity metrics like clicks and impressions that come from display advertising," said Peyman Nilforoush, CEO and Co-Founder at inPowered."This award from the ANA highlights the enormous opportunity for brands to achieve real ROI with content marketing by utilizing AI-powered content distribution, instead of DSP's or ad-network buys that result in expensive costs per visit, low times on-site and high bounce rates from un-engaged users."

The Association of National Advertisers had its biggest year yet for 2020 B2 Awards submissions, receiving hundreds of entries across more than three dozen categories. As the largest and oldest marketing organization in the United States, the ANA's mission is to drive growth for marketing professionals, brands and businesses, and for the industry as a whole. "B2B marketing is a cornerstone of our industry, and these awards honor the best and the brightest in the business," said Bob Liodice, Chief Executive Officer at the ANA.

ABOUT INPOWERED:

inPowered is the AI platform built to deliver business outcomes with content marketing. Using inPowered's artificial intelligence-powered technology, brands are able to increase the ROI of their content marketing initiatives by optimizing advertising spend toward the lowest cost across channels, and by placing calls to action at optimized times to convert already-engaged audiences into tangible business outcomes. The company was founded in 2014 by Peyman Nilforoush and Pirouz Nilforoush after they sold their previous company to Ziff Davis. http://www.inpwrd.com

MEDIA CONTACT:

Chelsea Waite, Director of Communications | (415) 968-9859 | chelsea.waite@inpwrd.com

Related Images


View original content to download multimedia:http://www.prnewswire.com/news-releases/inpowered-selected-by-ana-as-winner-of-best-use-of-aimachine-learning-category-at-2020-b2-awards-301101420.html

SOURCE inPowered

Follow this link:
inPowered Selected by ANA as Winner of 'Best Use of AI/Machine Learning' Category at 2020 B2 Awards - Yahoo Finance

Machine Learning & Cloud Technologies can make you a valuable resource today: Here's how you can succeed – Times of India

A few years or even months ago, if we had asked businesses about the importance of cloud technology and the ability to remotely access data in a secure manner, few would have shown interest. In recent times, however, cloud technologies have proven to be the backbone of running a business. As remote working becomes the norm, the focus has quickly shifted to companies' IT infrastructure, and Machine Learning & Cloud Computing have finally been recognised for the key role they play in any business. And so the question of whether you are up to date with the latest changes and revolutions in the industry comes to the forefront.

Many have analysed this trend and recognised the power that a keen understanding of ML & Cloud can have in their career. Cloud technologies not only empower the IT team to provision new application servers and infrastructure on the go but also give businesses the power to commission and decommission IT infrastructure at a much faster pace. What would once have taken hours or even days can now be achieved in just a few minutes, thanks to cloud technology. upGrad has understood this fast-paced growth of the industry. IIT Madras, in association with upGrad, has designed an online program that can equip you with the skill set and knowledge required to set foot in this industry.

The Importance of ML & Cloud

Anyone in the world of information technology and management knows that machine learning and the cloud are the future of every industry. Big Data already plays a key role in every decision-making process, and focusing on ML & Cloud today can help steer your career in an impressive and interesting direction. The Advanced Certification in Machine Learning and Cloud from IIT Madras, in association with upGrad, offers just that, with utmost ease and comfort.

What the Advanced Certification in Machine Learning and Cloud Program Offers

The 12-month program, which offers Advanced Certification from IIT Madras, is a brilliant introduction to machine learning and also serves as the perfect tool to gain practical knowledge in the field. It has been designed particularly for ML enthusiasts keen on accelerating in this field, giving them a key understanding of machine learning models using the cloud.

Who is the program designed for?

The 12-month program requires 12-15 hours of your undivided attention per week, making it a perfect choice not only for freshers but also for senior professionals looking to update their skills with new developments in technology. The Advanced Certification in Machine Learning and Cloud is priced at a nominal Rs 2,00,000, and you can also avail a no-cost EMI option that makes the program all the more accessible.

Why upGrad?

upGrad has already made a name for itself in the ed-tech segment. Not only does it provide reliable, articulately designed courses that help amplify your career graph, but it also has an array of accolades to its name. For the Advanced Certification in Machine Learning and Cloud, upGrad has partnered with more than 300 hiring partners as well as industry experts from leading companies like Flipkart and Gramener, among others.

"This program puts you from a beginner level to a person who can understand and provide a Machine Learning solution to any given problem provided one has the passion to learn new techniques in a rigorous manner,said Vignesh Ram, who has benefited from upGrads programs that have steered his career in the right direction.

Here is the original post:
Machine Learning & Cloud Technologies can make you a valuable resource today: Heres how you can succeed - Times of India

Deep Learning is Coming to Hockey – Last Word on Hockey

Analytics have been transforming how we watch hockey, and the revolution is just beginning. Statisticians and quantitative experts have led the way; their impact has changed how we discuss and watch the game. Analytics have been influential. Deep learning will be disruptive.

Advances in computing and understanding of complex relationships will massively alter the sporting landscape. Hockey will not be immune.

Every decision point is potentially affected. This will lead to impacts on and off the ice. Whoever gets there first will have an enormous competitive advantage. Think Moneyball, but with a team that maybe doesn't lose in the playoffs.

Our technology is getting smarter. Deep learning (a branch of machine learning) is coming to many aspects of life. The basic idea is using a computer to analyze complex interactions and come to conclusions. We have seen the concept applied to medicine with great results. The world's greatest Go player has left the game after realizing the robots can't be beaten. Team sports will be conquered next.

High-end computers can do mathematical calculations we humans can only dream of. This is the basis of how it can work.

Machine learning is an application of artificial intelligence (AI). The focus is providing data to computers, which then learn and improve with experience. These machines aren't programmed in the traditional sense; rather, they are developed by allowing computers to access data and learn from it themselves.
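A minimal illustration of that idea, learning from examples rather than explicit rules, is a perceptron that picks up the logical AND function purely from labeled data (a classic textbook sketch, not anything hockey-specific):

```python
# A perceptron learns AND from labeled examples: no rule for AND is ever
# written down; the weights are adjusted from prediction errors instead.

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            # The update step is the "improving with experience" in the text
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

Deep learning stacks many such units into layers so the system can learn far more complex relationships, but the learn-from-error loop is the same in spirit.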

As in the outside world, the impacts for sports are numerous, and there are many potential applications for deep learning. A look at the call for papers for the 2020 Machine Learning and Data Mining for Sports Analytics conference shows what this world is working on. Expected topics include items such as:

A quick glance at the topics demonstrates the field is getting into increasingly complex issues. This has the potential to reshape coaching, management, and player development.

There is good data and bad data. As in the larger debate about analytics, the availability and value of information is a concern. The sheer number of variables in the chaotic environment on the ice makes analysis complex. Stop-and-go sports like baseball and football are easier to analyze, as their statistics tend to be more clear-cut.

All numbers aren't created equal. The issue of inconsistent stat keepers will slow progress. A shot or a hit in one arena may not be the same in the next. Stats also become less reliable away from the professional leagues, so a close look at the numbers going in is needed to produce accuracy. Quantitative analysis is wonderful, but critical analysis is needed to ensure accuracy. In science speak, you need to operationalize things properly.
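One simple way to operationalize the arena-bias problem is to rescale each rink's raw counts toward the league-wide average before feeding them to a model. The counts below are invented, and real adjustments are more sophisticated; this is only a sketch of the idea:

```python
# Toy rink adjustment: a generous scorer's arena inflates shot counts and
# a stingy scorer's deflates them, so each arena gets a correction factor.

raw_shots = {            # average shots recorded per game, by arena (invented)
    "Arena 1": 36.0,     # generous scorer
    "Arena 2": 28.0,     # stingy scorer
    "Arena 3": 32.0,
}

league_avg = sum(raw_shots.values()) / len(raw_shots)  # 32.0

# Factor > 1 inflates stingy rinks' counts, < 1 deflates generous ones
factors = {arena: league_avg / avg for arena, avg in raw_shots.items()}

def adjust(arena, count):
    # Rescale one game's raw count recorded in the given arena
    return count * factors[arena]

print(round(adjust("Arena 1", 36), 1))  # 32.0
print(round(adjust("Arena 2", 28), 1))  # 32.0
```

After adjustment, an "average" night in either arena maps to the same number, which is what lets data from different rinks be compared at all.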

The complexity of hockey will make adopting deep learning difficult. It will be one of the last sports to truly be able to take advantage of it. There are many ways it will affect the game for fans, players, and teams. The complexity problem will be overcome.

Whos going to win? Can statistics help us understand the answer? Apparently, yes.

Predicting results has been a primary focus of deep learning applied to sports, and the first tests have focused on exactly that. The potential of figuring out who's going to win, and how to bet efficiently, would be lucrative for outsiders. As in other sports, this is the first area where deep learning is likely to arrive in hockey.

It has been a long road, but the expert pundits are falling. In the early days of deep learning, the experts at prediction on TV were better. That is changing. Back in 2003, early attempts showed computers were unable to beat expert pundits at prediction. More recently, a deep learning system (75% accuracy) was able to beat the ESPN team's 63% accuracy over the same period. This is just the first step.
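To make that accuracy comparison concrete, here is a toy way to score a naive predictor (always pick the team with the better goal differential) against a set of results. Every number below is invented for illustration, and a real deep learning predictor would use far richer inputs:

```python
# Scoring a naive win predictor against invented results: accuracy is
# just the fraction of games whose winner the rule picked correctly.

games = [
    # (team_a_goal_diff, team_b_goal_diff, actual_winner)
    (+25, -10, "A"),
    (+5,  +30, "B"),
    (-8,  +12, "B"),
    (+40, +2,  "A"),
    (-15, -3,  "A"),   # upsets happen; the baseline gets this one wrong
]

def predict(a_diff, b_diff):
    # Naive rule: the team with the better season goal differential wins
    return "A" if a_diff >= b_diff else "B"

correct = sum(predict(a, b) == winner for a, b, winner in games)
accuracy = correct / len(games)
print(accuracy)  # 0.8
```

The quoted 75%-vs-63% comparison is exactly this kind of score computed over many real games, with the deep learning model standing in for the naive rule.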

Football experts were the first to fall, and machine learning will change the game well beyond that. Football teams have the resources to be early adopters in the field. Particularly because the NFL has so much money, it is likely to remain the league to watch for the effects of deep learning.

That said, the technology is spreading. It has been applied to the English Premier League and many other sports. When it arrives in the hockey world, it will change how teams make decisions at all levels: who to sign as a free agent, who to trade for, even night-to-night lineup decisions. The applications are limited only by the availability of the data.

While hockey is chaotic and its numbers are inconsistent, this problem can be lessened. Stathletes seems likely to be the company that does it. Hockey is already well aware of the name Chayka. Meghan is the one to watch in this case: she was one of three co-founders of the company, along with brother John and Neil Lane.

What they do:

Using proprietary video tracking software, Stathletes pulls together thousands of performance metrics per game and compiles analytics related to each player and team. These analytics can provide baseline benchmarking, player comparisons, line matching, and player and team performance trends. Stathletes currently tracks data in 22 leagues worldwide and sells data to a wide variety of clients, including the National Hockey League (NHL). Via FedDev

Whether they are using machine learning is not clear; if not, it seems inevitable that they will. Meghan Chayka currently works with an expert in machine learning at the TD Management Data and Analytics Lab at Rotman, the business school at the University of Toronto. It seems likely they can benefit each other, and they would know this. (This may be part of the reason Arizona seems peeved at Chayka currently; the team may have just become a data have-not.)

Stathletes and other groups are gaining knowledge and information, and they will improve as they go. The NHL is open to this; it's coming.

Machine learning has arrived, and as the ability to obtain information improves, so will what it can deliver. If you are able to follow along, Neil Lane (current Stathletes CEO) is to speak at the University of Waterloo on what sports managers can learn from analytics. This should be enlightening.

Embedded items will be key. Chips and sensors in various pieces of hockey equipment are coming. Jerseys and pucks will transmit the information, and learning computers will put it together.

The impacts will be numerous. Coaches, players, agents, and teams will have considerably more knowledge. This changes decision making. Training. Diet. Trades. Penalty Kill lineups. The possibilities are endless.

Deep learning will give hockey more knowledge of all its aspects. If people like Pierre McGuire hate analytics now, just wait for what's to come.


Go here to read the rest:
Deep Learning is Coming to Hockey - Last Word on Hockey