Archive for the ‘Ai’ Category

New study on AI-assisted creativity reveals an interesting social dilemma – PsyPost

Generative artificial intelligence (AI) is revolutionizing many aspects of our lives, from customer support to artistic creation. A new study published in Science Advances provides insight into how these AI systems, specifically large language models, impact human creativity in writing. The findings suggest that AI can enhance the perceived creativity and quality of short stories, particularly for less inherently creative writers, but it also raises concerns about the potential homogenization of creative outputs.

Creativity is a cornerstone of human expression and innovation, yet the advent of generative AI technologies has begun to challenge traditional views on the uniqueness of human-created content. In their new study, Anil Doshi (an assistant professor at UCL School of Management) and Oliver Hauser (a professor and deputy director of the Institute for Data Science and Artificial Intelligence at University of Exeter) aimed to investigate how generative AI affects individuals' ability to produce creative written content, specifically focusing on short fiction.

"We were both excited by the potential of generative AI," the researchers told PsyPost. "We both thought there would be an opportunity to work in an area of common interest. Why we focused on the question of creativity: because generative AI is such a new and potentially transformative technology, we wanted to focus on a core characteristic of being human, that is, our ability to be creative and express new ideas and output."

The researchers recruited 500 participants from the Prolific platform, an online research participant pool. They ensured a reliable sample by including only participants with a high approval rating and based in the United Kingdom. After accounting for dropouts and exclusions, 293 participants completed the study.

Participants were randomly assigned one of three writing topics: an adventure on the open seas, an adventure in the jungle, or an adventure on a different planet. They were instructed to write an eight-sentence story suitable for a teenage and young adult audience. The participants were further divided into three groups based on the availability of AI assistance: one group wrote without any AI help, a second could request a single AI-generated story idea, and a third could request multiple AI-generated ideas.

After completing their stories, participants rated their own work on various stylistic attributes, including creativity and enjoyability. The stories were then evaluated by a separate group of 600 individuals from the same online platform. These evaluators assessed the creativity, quality, and originality of the stories without knowing whether the stories were written with AI assistance.

The researchers found that stories written with access to AI-generated ideas were rated higher in creativity, quality, and enjoyability compared to those written without AI assistance. This enhancement was particularly notable among participants with lower inherent creativity. For these less creative writers, having access to multiple AI ideas resulted in substantial improvements in the novelty of their stories. These improvements brought their work to a level comparable to that of more inherently creative participants.

"We find that getting ideas from generative AI improves the creativity of a story," Doshi and Hauser told PsyPost. "What surprised us was that almost all of the increase in creativity was experienced by the least creative writers in our sample. Not only that, but getting multiple AI ideas put the assessed creativity of their stories on par with those who are the most creative in our sample. We saw a clear 'level the playing field' effect of getting AI ideas on the creativity of the story."

A downside of using AI-generated ideas, however, was the increased similarity among the stories. The researchers found that stories from the AI-assisted groups were more alike both to each other and to the AI-generated ideas. This raises concerns about the potential homogenization of creative outputs if AI tools become widely used. The increased similarity suggests that while AI can enhance individual creativity, it might do so at the expense of collective diversity and novelty in creative works.
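
To make the homogenization concern concrete, one simple way to quantify how alike a set of stories is would be to embed each story as a vector and compare the vectors pairwise. The sketch below is purely illustrative and is not the metric used in the published study; the library, model name, and example stories are assumptions chosen for brevity.

```python
# Rough sketch: measuring pairwise similarity between stories with sentence
# embeddings. Illustrative only; not the metric used in the published study.
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

stories = [
    "The crew spotted a glowing reef no chart had ever shown...",
    "Deep in the jungle, the twins followed a trail of blue feathers...",
    "On the red planet, the rover's last message was a single word...",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(stories)

# Higher off-diagonal values mean more similar (more homogeneous) stories.
print(cosine_similarity(embeddings))
```

Averaging the off-diagonal entries of that matrix gives a single "collective similarity" score, which is the kind of quantity the homogenization argument turns on.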

Another interesting finding was the discrepancy between participants' self-assessments and the external evaluations of their stories. Participants who used AI assistance did not rate their own stories as more creative or enjoyable compared to those who did not use AI. However, external evaluators consistently rated the AI-assisted stories higher. This suggests that individuals might not fully recognize the enhancements provided by AI to their creative outputs.

"Generative AI tools, like ChatGPT, improve the average creativity of a writer's story, but collectively, stories that had AI ideas looked more like one another than those that did not receive AI assistance," Doshi and Hauser said. "So there are potentially significant implications, both positive and negative, for individuals and society as a whole."

The researchers added that the findings point to a social dilemma: "Individual stories are evaluated as being more creative, so people looking to improve their writing might turn to AI. But, if we all do so, then the collective novelty of ideas decreases, which may not be desirable from society's viewpoint."

The study highlights both the potential benefits and risks of AI-assisted creativity. But as with all research, there are some caveats to note.

"Our study included a specific use of AI in order for us to better control the experiment," the researchers explained. "We controlled the prompt and we did not allow for participants to interact with the AI. We did so because we did not want to create a situation where, say, better writers can provide better prompts to elicit better ideas from AI and they write better stories. That would break our goal of identifying a causal effect of AI ideas on creativity. So, there is opportunity to build on our work and understand how different prompts and interactions play a role in the creative process."

"We are developing a research agenda around generative AI to understand how it might be used in a broad array of economic activities," Doshi and Hauser added. "For example, we are working on a project to look at how AI assists with the creation of new ideas in different settings, such as the development of a company's strategy. We are also looking at how different types of people might respond differently to generative AI. Overall, our goal is to provide research that organizational and societal leaders can use when considering their own AI policies and strategies."

The study, "Generative AI enhances individual creativity but reduces the collective diversity of novel content," was published July 12, 2024.


AMD launches Amuse 2.0 generative AI tool for Ryzen and Radeon, features XDNA Super Resolution – VideoCardz.com

Shortly after Intel announced its AI Playground, AMD made a similar announcement.

This is the first release of the software, which is clearly labeled as Beta and in an experimental state. It may not perform as expected, but according to AMD's blog post, it should already provide a lot of fun for users. However, the user base will be limited to Ryzen and Radeon users.

The Amuse 2.0 software, developed with TensorStack, is designed to be simple to use, without the need to download a lot of external dependencies, use command lines, or run anything other than a single one-click executable. Compared to Intel's AI Playground, the Amuse software does not support running chatbots based on Large Language Models. Currently, Amuse handles only generative AI for images (for now, at least).

According to AMD, Amuse uses Stable Diffusion models (open-weight models). The software will support AMD Ryzen AI 300 (recently launched Strix Point laptops), AMD Ryzen 8040 (Hawk Point), and Radeon RX 7000 series. The list is rather short, and it's unclear why Radeon RX 6000 and below are excluded and why Ryzen 7040 (Phoenix), featuring nearly identical specs to Hawk Point, is not included either. However, it is assumed this will change in the future.
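
Amuse itself ships as a one-click executable, so the snippet below says nothing about its internals. It is only a hedged sketch of the general pattern of running an open-weight Stable Diffusion model from Python with the Hugging Face diffusers library; the model ID and device are assumptions, and on AMD hardware the backend would typically be ROCm or DirectML rather than CUDA.

```python
# Hedged sketch: generating an image from an open-weight Stable Diffusion
# model with the diffusers library. Not how Amuse is implemented internally.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed open-weight model ID
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # placeholder device; AMD setups typically use ROCm or DirectML

image = pipe("a watercolor lighthouse at dawn").images[0]
image.save("lighthouse.png")
```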

Amuse AI Tool, Source: AMD

AMD recommends 24GB of RAM or higher for Ryzen AI 300 and 32GB of RAM for Ryzen 8040. There's no memory requirement listed for Radeon RX 7000 GPUs.

Amuse 2.0 features

It is worth noting that the tool supports something called XDNA Super Resolution, a special mode that is supposed to upscale images by a factor of 2. Here is a full list of supported features.

Source: AMD
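
For context on what a 2x upscale means, the sketch below does a naive 2x resize with classical resampling in Pillow. This is only a simple baseline for comparison, not a stand-in for how XDNA Super Resolution itself works; the file names are made up.

```python
# Naive 2x upscale using classical resampling (Pillow). Shown only as a
# simple baseline; not a stand-in for XDNA Super Resolution itself.
from PIL import Image

img = Image.open("generated.png")  # assumed input file
upscaled = img.resize(
    (img.width * 2, img.height * 2),
    Image.Resampling.LANCZOS,
)
upscaled.save("generated_2x.png")
```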


Google claims math breakthrough with proof-solving AI models – Ars Technica

An illustration provided by Google.

On Thursday, Google DeepMind announced that AI systems called AlphaProof and AlphaGeometry 2 reportedly solved four out of six problems from this year's International Mathematical Olympiad (IMO), achieving a score equivalent to a silver medal. The tech giant claims this marks the first time an AI has reached this level of performance in the prestigious math competition, but as usual in AI, the claims aren't as clear-cut as they seem.

Google says AlphaProof uses reinforcement learning to prove mathematical statements in the formal language called Lean. The system trains itself by generating and verifying millions of proofs, progressively tackling more difficult problems. Meanwhile, AlphaGeometry 2 is described as an upgraded version of Google's previous geometry-solving AI model, now powered by a Gemini-based language model trained on significantly more data.
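
For readers unfamiliar with Lean, a formal proof is machine-checkable code rather than prose. The toy theorem below, far simpler than anything at the IMO and not taken from Google's work, only illustrates the general shape of a statement and proof in Lean 4.

```lean
-- Toy Lean 4 theorem, shown only to illustrate what a formal statement
-- and proof look like. Vastly simpler than an IMO problem and unrelated
-- to AlphaProof's actual output.
theorem add_comm_toy (a b : Nat) : a + b = b + a := by
  exact Nat.add_comm a b
```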

According to Google, prominent mathematicians Sir Timothy Gowers and Dr. Joseph Myers scored the AI model's solutions using official IMO rules. The company reports its combined system earned 28 out of 42 possible points (each of the six problems is worth up to seven points), just shy of the 29-point gold medal threshold. This included a perfect score on the competition's hardest problem, which Google claims only five human contestants solved this year.

The IMO, held annually since 1959, pits elite pre-college mathematicians against exceptionally difficult problems in algebra, combinatorics, geometry, and number theory. Performance on IMO problems has become a recognized benchmark for assessing an AI system's mathematical reasoning capabilities.

Google states that AlphaProof solved two algebra problems and one number theory problem, while AlphaGeometry 2 tackled the geometry question. The AI model reportedly failed to solve the two combinatorics problems. The company claims its systems solved one problem within minutes, while others took up to three days.

Google says it first translated the IMO problems into formal mathematical language for its AI model to process. This step differs from the official competition, where human contestants work directly with the problem statements during two 4.5-hour sessions.

Google reports that before this year's competition, AlphaGeometry 2 could solve 83 percent of historical IMO geometry problems from the past 25 years, up from its predecessor's 53 percent success rate. The company claims the new system solved this year's geometry problem in 19 seconds after receiving the formalized version.

Despite Google's claims, Sir Timothy Gowers offered a more nuanced perspective on the Google DeepMind models in a thread posted on X. While acknowledging the achievement as "well beyond what automatic theorem provers could do before," Gowers pointed out several key qualifications.

"The main qualification is that the program needed a lot longer than the human competitorsfor some of the problems over 60 hoursand of course much faster processing speed than the poor old human brain," Gowers wrote. "If the human competitors had been allowed that sort of time per problem they would undoubtedly have scored higher."

Gowers also noted that humans manually translated the problems into the formal language Lean before the AI model began its work. He emphasized that while the AI performed the core mathematical reasoning, this "autoformalization" step was done by humans.

Regarding the broader implications for mathematical research, Gowers expressed uncertainty. "Are we close to the point where mathematicians are redundant? It's hard to say. I would guess that we're still a breakthrough or two short of that," he wrote. He suggested that the system's long processing times indicate it hasn't "solved mathematics" but acknowledged that "there is clearly something interesting going on when it operates."

Even with these limitations, Gowers speculated that such AI systems could become valuable research tools. "So we might be close to having a program that would enable mathematicians to get answers to a wide range of questions, provided those questions weren't too difficult, the kind of thing one can do in a couple of hours. That would be massively useful as a research tool, even if it wasn't itself capable of solving open problems."


Reddit is now blocking major search engines and AI bots except the ones that pay – The Verge

Reddit is ramping up its crackdown on web crawlers. Over the past few weeks, Reddit has started blocking search engines from surfacing recent posts and comments unless the search engine pays up, according to a report from 404 Media.

Right now, Google is the only mainstream search engine that shows recent results when you search for posts on Reddit using the "site:reddit.com" trick, 404 Media reports. This leaves out Bing, DuckDuckGo, and other alternatives, likely because Google has struck a $60 million deal that lets the company train its AI models on content from Reddit.

"This is not at all related to our recent partnership with Google," Reddit spokesperson Tim Rathschmidt says in a statement to The Verge. "We have been in discussions with multiple search engines. We have been unable to reach agreements with all of them, since some are unable or unwilling to make enforceable promises regarding their use of Reddit content, including their use for AI."

Last month, to enforce its policy against scraping, Reddit updated the site's robots.txt file, which tells web crawlers whether they can access a site. "It's a signal to those who don't have an agreement with us that they shouldn't be accessing Reddit data," Ben Lee, Reddit's chief legal officer, told my colleague Alex Heath in Command Line.
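
For illustration only: robots.txt is a plain-text convention that well-behaved crawlers consult before fetching pages, and Python's standard library can read it. The sketch below is generic; the user agent name is made up, and the output will depend on whatever rules Reddit is serving at the time.

```python
# Generic sketch: how a well-behaved crawler checks robots.txt before fetching.
# The user agent name is made up; results depend on Reddit's current rules.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.reddit.com/robots.txt")
rp.read()  # download and parse the file

# False means the named crawler is asked not to fetch that path.
print(rp.can_fetch("ExampleBot", "https://www.reddit.com/r/programming/"))
```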

In a statement to The Verge, Microsoft spokesperson Caitlin Roulston said, "Microsoft respects the robots.txt standard and we honor the directions provided by websites that do not want content on their pages to be used with our generative AI models," adding that Bing stopped crawling Reddit when the platform updated its robots.txt file on July 1st.

It's a bold move for a massive website like Reddit to block some of the most popular search engines, but it's not all that surprising. Over the past year, Reddit has become more protective of its data as it looks to open up another source of revenue and appease new investors. After making its API more expensive for some third-party developers, Reddit reportedly threatened to cut off Google if it didn't stop using the platform's data to train AI for free.

With AI chatbots filling the internet with questionable content, finding things written by a fellow human has never been more important. I, like many others, have started appending "Reddit" to many of my searches just to get human answers, and it's pretty frustrating to know that I'll now only be able to do that on Google (or search engines that rely on it), especially when I do many of my searches on Bing.

Update, July 24th: Added a statement from Reddit.

Update, July 25th: Added a statement from Microsoft.


Unlock real-time insights with AI-powered analytics in Microsoft Fabric – Microsoft

The data and analytics landscape is changing faster than ever. From the emergence of generative AI to the proliferation of citizen analysts to the increasing importance of real-time, autonomous action, keeping up with the latest trends can feel overwhelming. Every trend requires new services that customers must manually stitch into their data estate, driving up both cost and complexity.

With Microsoft Fabric, we are simplifying and future-proofing your data estate with an ever-evolving, AI-powered data analytics platform. Fabric will keep up with the trends for you and seamlessly integrate each new capability so you can spend less time integrating and managing your data estate and more time unlocking value from your data.


Aurizon, Australias largest rail freight operator, turned to Fabric to modernize their data estate and analytics system.

"With Microsoft Fabric, we've answered many of our questions about navigating future growth, to remove legacy systems, and to streamline and simplify our architecture. A trusted data platform sets us up to undertake complex predictive analytics and optimizations that will give greater surety for our business and drive commercial benefits for Aurizon and our customers in the very near future."

Aurizon is just one among thousands of customers who have already used Fabric to revolutionize how they connect to and analyze their data. In fact, a 2024 commissioned Total Economic Impact (TEI) study conducted by Forrester Consulting found that Microsoft Fabric customers saw a three-year 379% return on investment (ROI) with a payback period of less than six months. We are thrilled to share a huge range of new capabilities coming to Fabric. These innovations will help you more effectively uncover insights and keep you at the forefront of the trends in data and analytics. Check out a quick overview of the biggest changes coming to Fabric.

Prepare your data for AI innovation with Microsoft Fabric, now generally available

Fabric is a complete data platform, giving your data teams the ability to unify, transform, analyze, and unlock value from data from a single, integrated software as a service (SaaS) experience. We are excited to announce additions to the Fabric workloads that will make Fabric's capabilities even more robust and customizable to meet the unique needs of each organization. These enhancements include:

When we introduced Fabric, it launched with seven core workloads which included Synapse Real-time Analytics for data streaming analysis and Data Activator for monitoring and triggering actions in real-time. We are unveiling an enhanced workload called Real-Time Intelligence that combines these workloads and brings an array of additional new features, in preview, to help organizations make better decisions with up-to-the-minute insights. From ingestion to transformation, querying, and taking immediate action, Real-Time Intelligence is an end-to-end experience that enables seamless handling of real-time data without the need to land it first. With Real-Time Intelligence, you can ingest streaming data with high granularity, dynamically transform streaming data, query data in real-time for instant insights, and trigger actions like alerting a production manager when equipment is overheating or rerunning jobs when data pipelines fail. And with both simple, low-code or no-code, and powerful, code-rich interfaces, Real-Time Intelligence empowers every user to work with real-time data.
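
As a rough feel for the ingestion side, the sketch below sends a JSON event to an Event Hubs-compatible endpoint of the kind an eventstream can expose. It is not an official Fabric sample; the connection string and stream name are placeholders you would take from your own setup, and the payload fields are invented.

```python
# Hedged sketch: pushing an event to an Event Hubs-compatible endpoint,
# such as one an eventstream might expose. Connection details are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="<CONNECTION_STRING>",  # placeholder
    eventhub_name="<STREAM_NAME>",   # placeholder
)

batch = producer.create_batch()
batch.add(EventData(json.dumps({"machine": "press-04", "temp_c": 91.7})))
producer.send_batch(batch)
producer.close()
```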

Behind this powerful workload is the Real-time hub, a single place to discover, manage, and use event streaming data from Fabric and other data sources from Microsoft, third-party cloud providers, and other external data sources. Just like the OneLake data hub makes it easy to discover, manage, and use the data at rest, the Real-time hub can help you do the same for data in motion. All events that flow through the Real-time hub can be easily transformed and routed to any Fabric data store and users can create new streams that can be discovered and consumed. From the Real-time hub, users can gain insights through the data profile, configure the right level of endorsement, set alerts on changing conditions and more, all without leaving the hub. While the existing Real-Time Analytics capabilities are still generally available, the Real-time hub and the other new capabilities coming to the Real-Time Intelligence workload are currently in preview. Watch this demo video to check out the redesigned Real-Time Intelligence experience:

Elcome, one of the world's largest marine electronics companies, built a new service on Fabric called Welcome that helps maritime crews stay connected to their families and friends.

"Microsoft Fabric Real-Time Intelligence has been the essential building block that's enabled us to monitor, manage, and enhance the services we provide. With the help of the Real-time hub for centrally managing data in motion from our diverse sources and Data Activator for event-based triggers, Fabric's end-to-end cloud solution has empowered us to easily understand and act on high-volume, high-granularity events in real-time with fewer resources."

Real-time insights are becoming increasingly critical across industries, with use cases like route optimization in transportation and logistics, grid monitoring in energy and utilities, predictive maintenance in manufacturing, and inventory management in retail. And since Real-Time Intelligence comes fully optimized and integrated in a SaaS platform, adoption is seamless. Strathan Campbell, Channel Environment Technology Lead at One NZ, the largest mobile carrier in New Zealand, said they went from a concept to a delivered product in just two weeks. To learn more about the Real-Time Intelligence workload, watch the Ingest, analyze and act in real time with Microsoft Fabric Microsoft Build session or read the Real-Time Intelligence blog.

Fabric was built from the ground up to be extensible, customizable, and open. Now, we are making it even easier for software developers and customers to design, build, and interoperate applications within Fabric with the new Fabric Workload Development Kit, currently in preview. Applications built with this kit will appear as a native workload within Fabric, providing a consistent experience for users directly in their Fabric environment without any manual effort. Software developers can publish and monetize their custom workloads through Azure Marketplace. And, coming soon, we are creating a workload hub experience in Fabric where users can discover, add, and manage these workloads without ever leaving the Fabric environment. We already have industry-leading partners building on Fabric including SAS, Esri, Informatica, Teradata, and Neo4j.

You can also learn more about the Workload Development Kit by watching the Extend and enhance your analytics applications with Microsoft Fabric Microsoft Build session.

We are also excited to announce two new features, both in preview, created with developers in mind: API for GraphQL and user data functions in Fabric. API for GraphQL is a flexible and powerful API that allows data professionals to access data from multiple sources in Fabric with a single query API. With API for GraphQL, you can streamline requests to reduce network overhead and accelerate response rates. User data functions are user-defined functions built for Fabric experiences across all data services, such as notebooks, pipelines, or event streams. These features enable developers to more easily build experiences and applications on Fabric data sources such as lakehouses, data warehouses, and mirrored databases, with native code ability, custom logic, and seamless integration. You can watch these features in action in the Introducing API for GraphQL and User Data Functions in Microsoft Fabric Microsoft Build session.
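
To show what "a single query API" looks like in practice, here is a hedged sketch of posting a GraphQL query over HTTP from Python. The endpoint URL, bearer token, and field names are hypothetical placeholders rather than Fabric's actual schema.

```python
# Hedged sketch of a single GraphQL request over HTTP. The endpoint, token,
# and field names are hypothetical placeholders, not Fabric's real schema.
import requests

query = """
query RecentOrders($limit: Int!) {
  orders(first: $limit) {
    id
    customerName
    totalAmount
  }
}
"""

resp = requests.post(
    "https://<your-graphql-endpoint>/graphql",           # placeholder URL
    json={"query": query, "variables": {"limit": 10}},
    headers={"Authorization": "Bearer <access-token>"},  # placeholder token
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["data"]["orders"])
```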

You can also learn more about the Workload Development Kit, the API for GraphQL, user data functions, and more by reading the Integrating ISV apps with Microsoft Fabric blog.

We are also announcing the preview of Data workflows in Fabric as part of the Data Factory experience. Data workflows allow customers to define Directed Acyclic Graph (DAG) files for complex data workflow orchestration in Fabric. Data workflows are powered by the Apache Airflow runtime and designed to help you author, schedule, and monitor workflows or data pipelines using Python. Learn more by reading the data workflows blog.
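
Because Data workflows run on the Apache Airflow runtime, an ordinary Airflow DAG gives a feel for the authoring model. The sketch below is generic Airflow with made-up task names and schedule, not a Fabric-specific sample.

```python
# Generic Apache Airflow DAG sketch; task names and schedule are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling raw data")


def transform():
    print("cleaning and shaping data")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # run extract before transform
```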

The typical data estate has grown organically over time to span multiple clouds, accounts, databases, domains, and engines with a multitude of vendors and specialized services. OneLake, Fabric's unified, multi-cloud data lake built to span an entire organization, can connect to data from across your data estate and reduce data duplication and sprawl.

We are excited to announce the expansion of OneLake shortcuts to connect to data from on-premises and network-restricted data sources beyond just Azure Data Lake Storage Gen2, now in preview. With an on-premises data gateway, you can now create shortcuts to Google Cloud Storage, Amazon S3, and S3-compatible storage buckets that are either on-premises or otherwise network-restricted. To learn more about these announcements, watch the Microsoft Build session Unify your data with OneLake and Microsoft Fabric.

Insights drive impact only when they reach those who can use them to inform actions and decisions. Professional and citizen analysts bridge the gap between data and business results, and with Fabric, they have the tools to quickly manage, analyze, visualize, and uncover insights that can be shared with the entire organization. We are excited to help analysts work even faster and more effectively by releasing the model explorer and the DAX query view in Microsoft Power BI Desktop into general availability.

The model explorer in Microsoft Power BI provides a rich view of all the semantic model objects in the data pane, helping you find items in your data fast. You can also use the model explorer to create calculation groups and reduce the number of measures by reusing calculation logic and simplifying semantic model consumption.

The DAX query view in Power BI Desktop lets users discover, analyze, and see the data in their semantic model using the DAX query language. Users working with a model can validate data and measures without having to build a visual or use an additional tool, similar to the Explore feature. Changes made to measures can be seamlessly updated directly back to the semantic model.

To learn more about these announcements and others coming to Power BI, check out the Power BI blog.

When ChatGPT was launched, it had over 100 million users in just over two months, the steepest adoption curve in the history of technology.1 It's been a year and a half since that launch, and organizations are still trying to translate the benefit of generative AI from novelty to actual business results. By infusing generative AI into every layer of Fabric, we can empower your data professionals to employ its benefits, in the right context and in the right scenario, to get more done, faster.

Copilot in Fabric was designed to help users unlock the full potential of their data by assisting data professionals to be more productive and business users to explore their data more easily. With Copilot in Fabric, you can use conversational language to create dataflows, generate code and entire functions, build machine learning models, or visualize results. We are excited to share that Copilot in Fabric is now generally available, starting with the Power BI experience. This includes the ability to create stunning reports and summarize your insights into narrative summaries in seconds. Copilot in Fabric is also now enabled by default for all eligible tenants, including the Copilot in Fabric experiences for Data Factory, Data Engineering, Data Science, Data Warehouse, and Real-Time Intelligence, which are all still in preview. The general availability of Copilot in Fabric for the Power BI experience will be rolling out over the coming weeks to all customers with Power BI Premium capacity (P1 or higher) or Fabric capacity (F64 or higher).

We are also thrilled to announce a new Copilot in Fabric experience for Real-Time Intelligence, currently in preview, that enables users to explore real-time data with ease. Starting with a Kusto Query Language (KQL) Queryset connected to a KQL Database in an Eventhouse or a standalone Azure Data Explorer database, you can type your question in conversational language and Copilot will automatically translate it to a KQL query you can execute. This experience is especially powerful for users who are less familiar with writing KQL queries but still want to get the most from their time-series data stored in Eventhouse.

We are also thrilled to release a new AI capability in preview called AI skills, an innovative experience designed to provide any user with a conversational Q&A experience about their data. AI skills allow you to simply select the data source in Fabric you want to explore and immediately start asking questions about your data, even without any configuration. When answering questions, the generative AI experience will show the query it generated to find the answer, and you can enhance the Q&A experience by adding more tables, setting additional context, and configuring settings. AI skills can empower everyone to explore data, build and configure AI experiences, and get the answers and insights they need.

AI skills will honor existing security permissions and can be configured to respect the unique language and nuances of your organization, ensuring that responses are not just data-driven but steeped in the context of your business operations. And, coming soon, it can also enrich the creation of new copilots in Microsoft Copilot Studio and be interacted with from Copilot for Microsoft 365. It's about making your data not just accessible but approachable, inviting users to explore insights through natural dialogue, and shortening the time to insight.

With the launch of Fabric, we've committed to open data formats, standards, and interoperability with our partners to give our customers the flexibility to do what makes sense for their business. We are taking this commitment a step further by expanding our existing partnership with Snowflake to deepen interoperability between Snowflake and Fabric's OneLake. We are excited to announce future support for Apache Iceberg in Fabric OneLake and bi-directional data access between Snowflake and Fabric. This integration will enable users to analyze their Fabric and Snowflake data written in Iceberg format in any engine within either platform, and access data across apps like Microsoft 365, Microsoft Power Platform, and Microsoft Azure AI Studio.

With the upcoming availability of shortcuts for Iceberg in OneLake, Fabric users will be able to access all data sources in Iceberg format, including the Iceberg sources from Snowflake, and translate metadata between Iceberg and Delta formats. This means you can work with a single copy of your data across Snowflake and Fabric. Since all the OneLake data can be accessed in Snowflake as well as in Fabric, this integration will enable you to spend less time stitching together applications and your data estate, and more time uncovering insights.

We are also excited to announce we are expanding our existing relationship with Adobe. Adobe Experience Platform (AEP) and Adobe Campaign will have the ability to federate enterprise data from Fabric. Our joint customers will soon have the capability to connect to Fabric and use the Fabric Data Warehouse for query federation to create and enrich audiences for engagement, without having to transfer or extract the data from Fabric.

We are excited to announce that we are expanding the integration between Fabric and Azure Databricks, allowing you to have a truly unified experience across both products and pick the right tools for any scenario.

Coming soon, you will be able to access Azure Databricks Unity Catalog tables directly in Fabric, making it even easier to unify Azure Databricks with Fabric. From the Fabric portal, you can create and configure a new Azure Databricks Unity Catalog item in Fabric with just a few clicks. You can add a full catalog, a schema, or even individual tables to link, and the management of this Azure Databricks item in OneLake (a shortcut connected to Unity Catalog) is automatically taken care of for you.

This data acts like any other data in OneLake: you can write SQL queries or use it with any other workloads in Fabric, including Power BI through Direct Lake mode. When the data is modified or tables are added, removed, or renamed in Azure Databricks, the data in Fabric will always remain in sync. This new integration makes it simple to unify Azure Databricks data in Fabric and seamlessly use it across every Fabric workload.

Also coming soon, Fabric users will be able to access Fabric data items like lakehouses as a catalog in Azure Databricks. While the data remains in OneLake, you can access and view data lineage and other metadata in Azure Databricks and leverage the full power of Unity Catalog. This includes extending Unity Catalog's unified governance over data and AI into Azure Databricks Mosaic AI. In total, you will be able to combine this data with other native and federated data in Azure Databricks, perform analysis assisted by generative AI, and publish the aggregated data back to Power BI, making this integration complete across the entire data and AI lifecycle.

Join us at Microsoft Build from May 21 to 23, 2024, to see all of these announcements in action across the following sessions:

You can also try out these new capabilities and everything Fabric has to offer yourself by signing up for a free 60-day trial; no credit card information required. To start your free trial, sign up for a free account (Power BI customers can use their existing account), and once signed in, select start trial within the account manager tool in the Fabric app. Existing Power BI Premium customers can already access Fabric by simply turning on Fabric in their Fabric admin portal. Learn more on the Fabric get started page.

We are excited to announce a European Microsoft Fabric Community Conference that will be held in Stockholm, Sweden from September 23 to 26, 2024. You can see firsthand how Fabric and the rest of the data and AI products at Microsoft can help your organization prepare for the era of AI. You will hear from leading Microsoft and community experts from around the world and get hands-on experience with the latest features from Fabric, Power BI, Azure Databases, Azure AI, Microsoft Purview, and more. You will also have the opportunity to learn from top data experts and AI leaders while having the chance to interact with your peers and share your story. We hope you will join us and see how cutting-edge technologies from Microsoft can enable your business success with the power of Fabric.

If you want to learn more about Microsoft Fabric:


1. "ChatGPT sets record for fastest-growing user base - analyst note," Reuters.

Arun Ulagaratchagan

Corporate Vice President, Azure Data, Microsoft

Arun leads product management, engineering, and cloud operations for Azure Data, which includes databases, data integration, big data analytics, messaging, and business intelligence. The products in his teams' portfolio include Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure MySQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, Power BI, and Microsoft Fabric.
