Archive for the ‘Ai’ Category

Mysterious Entity Paying Reddit $60 Million to Train AI With Users’ Posts – Futurism

"When you use something for free, you are the product." Reddit and Weep

Underlying the storm of hype and funding in the AI sector right now is a scarce resource: data, created by old-fashioned humans, that's needed to train the huge models like ChatGPT and DALL-E that generate text and imagery.

That demand is causing all sorts of drama, from lawsuits by authors and news organizations that say their work was used by AI companies without their permission to the looming question of what happens when the internet fills up with AI-generated content and AI creators are forced to use that to train future AI.

And, of course, it's also fueling new business deals as AI developers rush to lock down repositories of human-generated work that they can use to train their AI systems. Look no further than this wild scoop from Bloomberg: an undisclosed AI outfit has struck a deal to pay Reddit $60 million per year for access to its huge database of users' posts, perhaps the surest sign yet that user data is the key commodity in the AI gold rush.

It's not the first time we've seen an AI company cough up for access to a cache of text material. Remember when Axel Springer, the owner of publications ranging from Politico to Business Insider, inked a deal with OpenAI to use its outlets' work in ChatGPT?

But in some respects, it does differ from that bargain. For one, journalists are paid for their work, even if they don't stand to benefit and may actually be harmed by its inclusion in AI systems. Redditors, though, have contributed their vast supply of words as a labor of love, which has to rankle when it's all vacuumed up for profit.

"Where the fuck is my cut?" quipped on Redditor in response to the news.

"When you use something for free, you are the product," another retorted.

Even stranger is that, in spite of the appreciable sum changing hands (remember, this is $60 million every single year), we don't actually know who's paying for all this data.

"As an AI language model, I cannot condone the selling of public forums' user data as training data without compensation for the users of said forum," another Redditor wrote of the AI deal, riffing on the way ChatGPT and other systems frequently demur from answering controversial questions.

More on AI: Amazon AGI Team Say Their AI Is Showing "Emergent Abilities"

See the original post:

Mysterious Entity Paying Reddit $60 Million to Train AI With Users' Posts - Futurism

Reddit has a new AI training deal to sell user content – The Verge

Reddit will let an unnamed large AI company have access to its user-generated content platform in a new licensing deal, according to a Bloomberg report yesterday. The deal, worth about $60 million on an annualized basis, the outlet writes, could still change as the company's plans to go public are still in the works.

Whether or not that's true (after all, one of the best ways to get around SEO spam in search results is to add the word "Reddit" to your search query), Reddit has shown that it's willing to play hardball before. Last year, it successfully stonewalled its way out of the biggest protest in its history after changes to its third-party API access pricing caused developers of the most popular Reddit apps to shut them down.

As Bloomberg writes, Reddit's year-over-year revenue was up by 20 percent by the end of 2023, but it was still $200 million shy of a $1 billion target it had set two years prior. The company was reportedly advised to seek a $5 billion valuation when it opens up for public investment, which is expected to happen in March. That's half the $10 billion to $15 billion it might have achieved the last time it filed to go public in 2021, before a market downturn held it back.

Read the rest here:

Reddit has a new AI training deal to sell user content - The Verge

Reddit reportedly signed a multi-million dollar licensing deal to train AI models – Mashable

Reddit posts might be the next fuel in the AI innovation machine, as the "front page of the internet" reportedly negotiated a content licensing deal to allow its data to be used to train AI models.

Ahead of a potential IPO debut in March at a $5 billion valuation, Bloomberg reported the social media platform had signed a $60 million deal with an undisclosed (but big-player) AI company, potentially as a last-minute pitch to investors that the platform has money-making avenues in the world of AI.

Reddit has yet to confirm the deal.

The move means that Reddit posts, from the most popular subreddits to the comments of lurkers and small accounts, could build up already-existing LLMs or provide a framework for the next generative AI play. It's a dicey decision from Reddit, as users are already at odds with the business decisions of the nearly 20-year-old platform.

Last year, following Reddit's announcement that it would begin charging for access to its APIs, thousands of Reddit forums shut down in protest. Shortly after, the site itself crashed, and days later a group of Reddit hackers threatened to release previously stolen site data unless Reddit CEO Steve Huffman reversed the API plan or paid them $4.5 million. Later, Reddit removed years of private chat logs and messages from users' accounts, saying it was clearing data from before January 1, 2023, to prepare for a new chat infrastructure.

Reddit announced other changes, as well, including a new "official" badge intended to distinguish real accounts from impersonators and new automatic moderation features. In September, Reddit removed the option to turn off ad personalization, rallying even more users against the platform's evolution.

This new AI deal could generate even more user ire, as debate rages on about the ethics of using public data, art, and other human-created content to train AI.

View original post here:

Reddit reportedly signed a multi-million dollar licensing deal to train AI models - Mashable

Billionaire Investor Chase Coleman Has 46% of His Portfolio Invested in 5 Brilliant Artificial Intelligence (AI) Growth … – The Motley Fool

Billionaire investor Chase Coleman is one of Wall Street's original whiz kids. When he was just 24 years old, he founded Tiger Global Management with starting capital from his former boss and mentor, the iconic hedge fund manager Julian Robertson, Jr.

Coleman parlayed this seed money of $25 million into one of the world's most successful hedge fund empires, with roughly $58 billion in assets under management. He's currently ranked as the world's 500th richest person by Forbes with a net worth estimated at $5.7 billion.

Coleman is best known for spotting big winners early on, making notable investments in (among others) Spotify; Facebook, now Meta Platforms (META -2.21%); and LinkedIn, now owned by Microsoft (MSFT -0.61%).

He's no stranger to bold bets, so it isn't surprising that to close out 2023, Coleman had a whopping 45.8% of Tiger Global Management's equity portfolio invested in just five artificial intelligence (AI) stocks: Meta Platforms, Microsoft, Amazon, Alphabet, and Nvidia.

Let's look at Coleman's top holdings to see why he's so heavily weighted in these AI stocks.


Meta Platforms is Coleman's largest holding by far, which isn't surprising since he discovered the company when it was still Facebook.

When it comes to AI, Meta has a long track record of using sophisticated algorithms to its advantage. The lion's share of its revenue is generated by digital advertising, and using AI helps surface more relevant ads and appropriate content for its social media users.

With the ongoing recovery in ad spending, 2023 was a banner year for the world's second-largest digital advertiser. Meta helped fuel its growth by providing advertisers with a suite of AI-powered tools to help improve their results.

The company also jumped feet first into generative AI by developing a top-shelf AI model -- the LLaMA (large language model Meta AI) -- which is available on all the major cloud platforms, giving Meta an entirely new revenue stream. The social media company also announced its first-ever dividend.

Meta stock sells for about 23 times forward earnings, a discount compared to the broader market -- which likely factored into Coleman's investing decision.

Microsoft stunned tech aficionados last year when it invested $13 billion in ChatGPT creator OpenAI, focusing the limelight on the potential applications for generative AI. Many big tech companies followed suit, and the AI arms race was on.

Microsoft made the most of its investment, quickly integrating AI functionality across a broad cross-section of its most popular productivity tools. It further bolstered demand by offering the most sought-after AI models on its Azure Cloud.

One of the biggest opportunities, however, rests in its suite of AI-fueled assistants: Microsoft Copilot. The ability of these tools to increase users' productivity has resulted in strong demand, which could generate incremental revenue of $100 billion by 2027, according to the I/O Fund's Beth Kindig. Azure's growth outpaced the competition in the fourth quarter, and Microsoft noted that 6 percentage points of that growth came from increased demand for AI services.

Microsoft currently trades for 35 times forward earnings. That's a slight premium to the overall market, but the company's strong history of growth and the additional potential resulting from AI make it a must-own AI stock, which likely provided extra incentive for Coleman to buy.

Amazon also has a long history of deploying AI algorithms to manage its e-commerce business, using AI to make product recommendations, manage inventory levels, schedule delivery routes, and more.

The company has also jumped on the generative AI bandwagon, using the technology to improve product descriptions, summarize reviews, and polish merchant advertising. Amazon also plans to offer a customer-focused tool to answer specific product questions.

Furthermore, as the leading cloud infrastructure provider, Amazon Web Services (AWS) offers a laundry list of popular generative AI models via its Bedrock AI. The company is also several generations along in the development of its AI processors -- namely Inferentia and Trainium -- to provide improved and lower-cost AI processing for its cloud customers.

Despite its run-up over the past year, Amazon stock sells for just 2 times forward sales, the standard for an undervalued stock, which likely didn't escape Coleman's notice.

Like its rival Meta Platforms, Alphabet makes the vast majority of its revenue from its ad tech business, driven by Google search. Alphabet has a long and distinguished history of using AI to improve its search and targeted advertising results, and the rebound in the digital advertising market will no doubt boost its fortunes.

Alphabet has been hard at work developing generative AI solutions, incorporating this next-generation functionality into a broad cross-section of its namesake Google and Android products and services.

Its position as a leading cloud infrastructure provider puts the company in the perfect position to market AI solutions to its cloud clients. The recent debut of Gemini AI was hailed by Google as its "largest and most capable AI model."

Furthermore, Alphabet's Vertex AI platform offers a growing suite of more than 130 foundational AI models for customers to choose from.

One of the most intriguing things about Alphabet's stock is the price: just 25 times earnings, a discount to the broader market -- and a valuation that Coleman likely couldn't pass up.

Nvidia is the poster child for the accelerating adoption of AI. Its processors revolutionized the gaming industry, were adapted to handle the rigors of AI, and have since become the gold standard.

Its graphics processing units (GPUs) dominate the market in machine learning and data centers, with an estimated 95% share in each market. This made Nvidia the no-brainer choice when demand for generative AI ramped up.

While rivals are working furiously to develop competing chips, Nvidia's pace of innovation makes it difficult to gain ground. Further frustrating those efforts is the company's heavy spending on research and development, which amounted to $6.2 billion -- or 16% of total revenue -- for the nine months ended Oct. 29.

Despite the stock's triple-digit gains, Nvidia is still incredibly cheap, with a price/earnings-to-growth (PEG) ratio of less than 1 -- the standard for an undervalued stock -- and Coleman was no doubt keenly aware of that.
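For readers unfamiliar with that yardstick, the PEG ratio is simply the price-to-earnings multiple divided by the expected earnings growth rate. The quick sketch below uses purely hypothetical numbers, not Nvidia's actual figures:

```python
def peg_ratio(price_per_share: float, eps: float, growth_rate_pct: float) -> float:
    """PEG = (price / earnings per share) / expected annual EPS growth rate (in %)."""
    price_to_earnings = price_per_share / eps
    return price_to_earnings / growth_rate_pct

# Hypothetical example only: a stock trading at $500 with $14 of expected EPS
# (a P/E of roughly 36) and a 60% forecast growth rate screens as "cheap" by
# the PEG < 1 rule of thumb cited above.
print(round(peg_ratio(500.0, 14.0, 60.0), 2))  # ~0.6
```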


Read the original here:

Billionaire Investor Chase Coleman Has 46% of His Portfolio Invested in 5 Brilliant Artificial Intelligence (AI) Growth ... - The Motley Fool

OpenAI’s Sam Altman has huge chip ambitions. They might not work – Quartz

OpenAI CEO Sam Altman wants to raise trillions of dollars to reshape the global semiconductor industry, The Wall Street Journal reported earlier this month, an effort to boost chip-making capacity and power more artificial intelligence. It's an eye-popping amount, one that was put to Nvidia CEO Jensen Huang, the man behind the AI company of the moment, for his thoughts.


When asked during the World Government Summit in Dubai this week how many GPUs can be bought for $7 trillion, Huang jokingly responded: "Apparently all the GPUs." (GPUs, or graphics processing units, power generative AI applications like ChatGPT and OpenAI's new video-generating AI, Sora.)

Huang then expressed skepticism about the figure. He said computers powering AI will continue to advance, which drives down costs.

"You can't assume just that you would just buy more computers, you also have to assume that the computers are going to become faster, and therefore, the total amount you need will not be as much," Huang said.

Otherwise, he added, "the mathematics, if you just assume that computers never get any faster, you might come to the conclusion we need 14 different planets and three different galaxies and four more suns to fuel all this. But obviously, computer architecture continues to advance."

Let's take a step back and look at what exactly $7 trillion might fund.

The AI models behind ChatGPT do require a lot of computing power, more than many people realize, said Willy Shih, a professor at Harvard Business School who previously worked at IBM.

If Altman's ambition is to make bigger models for OpenAI, he could spend trillions on data centers, which house the GPUs needed to train AI models that power products like ChatGPT and Sora. The U.S. data center construction market was valued at $24.63 billion in 2024, research firm IDC estimates. So if he spent $1 trillion of that on data centers, he could buy roughly 40 times as many data centers as currently exist.
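As a rough check on that figure (a sketch only, using the IDC estimate quoted above), the arithmetic is a single division:

```python
# Back-of-envelope check of the "40 times" figure, using the numbers cited above.
us_dc_construction_market_2024 = 24.63e9  # IDC estimate, in dollars
hypothetical_spend = 1e12                 # $1 trillion of the proposed war chest

multiple = hypothetical_spend / us_dc_construction_market_2024
print(f"~{multiple:.0f}x the 2024 U.S. data center construction market")  # prints ~41x, i.e. roughly the 40x cited
```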

Data centers currently use less than 1% of the electricity supply of the U.S., Shih said. So Altman would need to build a lot of electricity generation facilities to support his new data centers, and then upgrade the electric grid that actually distributes that energy to them. When you consider the money being spent through the federal Inflation Reduction Act and the Infrastructure Investment and Jobs Act to incentivize clean energy production and grid modernization in the U.S., a trillion there would probably be a good investment, Shih said.

Perhaps Altman wants to expand global chip-making capacity. There are just a handful of leading-edge fabs, the manufacturing plants that produce chips, being built in the world right now: TSMC in Taiwan, Arizona, and Japan; Samsung in Korea and Texas; and Intel in Arizona, Ohio, and Israel, among others.

Meanwhile, $7 trillion could buy more than 200 leading-edge semiconductor fabs at $30 billion each, Bernstein semiconductor analyst Stacy Rasgon estimates.
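The fab estimate works the same way; again, a sketch using only the per-fab cost quoted above:

```python
# Back-of-envelope version of Rasgon's fab count, using the figures cited above.
total_budget = 7e12                # $7 trillion
cost_per_leading_edge_fab = 30e9   # ~$30 billion per fab

fab_count = total_budget / cost_per_leading_edge_fab
print(f"~{fab_count:.0f} leading-edge fabs")  # ~233, i.e. "more than 200"
```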

With 200 or even 100 fabs, you would need to start building out steel mills and concrete plants, Shih said. Altman would also need to buy a lot of construction equipment. Getting a supplier to produce the leading-edge EUV lithography machines needed for the scale of Altman's project could take decades, Shih added.

Then there's the money needed to train the workers to fill the factories. Chip companies like TSMC have already complained that workers aren't skilled enough for their CHIPS Act projects in Arizona, delaying the opening of new factories.

"If money could buy what we want, China would have gotten much further with the $150 billion Made in China 2025 investment into its domestic chips," Shih said. China hasn't quite achieved self-reliance yet. The country, for instance, spends twice as much importing semiconductors as it spends on oil, according to a report from the Canadian bank RBC Wealth Management.

The question, then, is not whether one can spend all that money, but how far all that money will go.

At least for now, the math doesnt seem to add up.

Read the rest here:

OpenAI's Sam Altman has huge chip ambitions. They might not work - Quartz