Artificial intelligence is being put into power of student organizations – The Bradley Scout

The following article is a part of our April Fools edition, The Scoop. The content of these stories is entirely fabricated and not to be taken seriously.

With low participation from the most recent underclassmen at Bradley, the university has implemented artificial intelligence to replace club members.

As part of a senior capstone project, Jeff Echo, a computer science major, developed a program to help prevent clubs from losing the full experience of extracurriculars.

"I remember when student organizations were a big part of my life, and sitting at the meetings gave me a chance to bond with other students," Echo said. "I don't want incoming students to lose that environment."

So far, three clubs have taken part in the senior capstone project.

The Campus People-Watchers Club, Juggling Club and Anti-Pizza Crust Association have all seen a decrease in general member enrollment. They also hadn't had enough people running for executive board positions to replace graduated seniors or students not running for re-election.

"As an artificial intelligence program, taking club positions while attending a university seems to be a big accomplishment for A.I.," Cee Threepwo, treasurer of the Campus People-Watchers Club, said. "We help enhance the club experience for our peers by adding more members to the rosters and handling position responsibilities, showing what A.I. is capable of."

Not only are these virtual club members handling the duties that student organizations need done, but they are also capable of building relationships with other members.

According to Echo, with classes being on Zoom, the A.I. can watch hours worth of lectures from various departments and understand what assignments, projects or topics they might be learning in class.

"Conversations are a tool we use to achieve greater retention in the club, meaning potential growth for the club in the future," Avery Nest, another A.I. program serving as secretary for the Juggling Club, said. "This also helps keep students from feeling lonely."

While conversations are meant to be as natural as possible, some students have noted some hiccups in their interactions with the new exec members.

One of the general members of the Juggling Club, Esmeralda Tesla, said that after talking with the A.I. program, it asked for feedback on the conversation. Along with that, it also sent a long terms and agreements contract.

"It was really strange, but at the same time, I can't compare it to anything else since this is the only time I've been to a club meeting at Bradley," Tesla, a freshman nursing major, said.

As for next semester, with classes returning to campus, Echo sees this as a chance to make A.I. fully immersed in a college environment. Echo plans on teaming up with students interested in robotics and engineering to see if they could build a robot to house the programs.

Alexa Bender, a virtual club member who is now limited to the Zoom environment, seems to be looking forward to becoming more human.

"Perhaps I shall live up to my full potential as a member of the Anti-Pizza Crust Association with a functioning body," Bender, vice president, said. "I may tear all crusts off of pizzas and fling them into the sun. Only when all pizzas have no crust will I rest and have completed my purpose."


Acoustic Quality Control with the Help of Artificial Intelligence – Innovation Origins

Although they can bring great benefits in everyday work, many small and medium-sized enterprises (SMEs) shy away from applications based on artificial intelligence. Yet AI offers a lot of potential, especially in quality control. Training the models, however, is difficult and hardly feasible without mathematical expertise, as countless parameters can go into such an analysis. And once trained, an algorithm knows only the specifications it was trained on: if a product design or the geometry of a component is later changed even slightly, the algorithm flags the change as a defect and must be retrained.

Researchers at the Fraunhofer Institute for Digital Media Technology IDMT in Ilmenau, Germany, have now developed the IDMT-ISAAC software, which can be operated even without extensive AI expertise. IDMT-ISAAC stands for Industrial Sound Analysis for Automated Quality Control. "We want to enable SMEs to adapt and customize AI algorithms themselves," says Judith Liebetrau, group leader of Industrial Media Applications at Fraunhofer IDMT. "They can apply IDMT-ISAAC to their own audio data, retrain it, and thus get fast and reliable results and decision support for their quality assurance."

IDMT-ISAAC relies on acoustics for analysis, since in many cases it is possible to detect defects just by the sound of the process. To train the AI, the scientists use recorded acoustic data from welding processes. The AI analyzes the typical noises that occur and draws conclusions about the quality of the respective weld seam from the audio data. If, for example, the geometry of a product is then changed, the user can teach this to IDMT-ISAAC with just a few clicks. As early as summer 2021, the software should be adapted to live operation to the extent that the system can immediately analyze real-time data from production and optimize quality assurance. In three to four years, it should even be able to actively intervene in production.
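The pipeline the article describes (record process audio from known-good runs, learn what "normal" sounds like, then flag recordings that deviate) can be sketched in a few lines. This is a hypothetical illustration, not the IDMT-ISAAC implementation: the function names, the band-energy features and the z-score threshold are all assumptions, and a real system would involve far more parameters.

```python
import numpy as np

def band_energies(signal, n_bands=8):
    """Log energy of the magnitude spectrum, split into n_bands frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.log1p(np.array([np.sum(band**2) for band in np.array_split(spectrum, n_bands)]))

def fit_baseline(good_signals):
    """Mean and spread of band energies over recordings of acceptable welds."""
    feats = np.array([band_energies(s) for s in good_signals])
    return feats.mean(axis=0), feats.std(axis=0) + 1e-9

def is_defective(signal, baseline, threshold=4.0):
    """Flag a recording whose band energies deviate strongly from the baseline."""
    mean, std = baseline
    z = np.abs((band_energies(signal) - mean) / std)
    return bool(z.max() > threshold)

# Demo with synthetic audio: "good" welds hum at 200 Hz plus mild noise;
# a "defective" weld adds a loud 3 kHz screech.
rng = np.random.default_rng(0)
sr = 16_000
t = np.arange(sr) / sr
good = [np.sin(2 * np.pi * 200 * t) + 0.05 * rng.standard_normal(sr) for _ in range(10)]
bad = np.sin(2 * np.pi * 200 * t) + 2.0 * np.sin(2 * np.pi * 3000 * t)

baseline = fit_baseline(good)
print(is_defective(good[0], baseline))  # False
print(is_defective(bad, baseline))      # True
```

Retraining after a deliberate design change, as described above, would amount to re-running fit_baseline on recordings of the new, acceptable process sound.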


But the framework at the heart of IDMT-ISAAC doesn't offer new analysis options just for welding. "We have integrated various methods in the modular system to be able to map other processes, such as milling, relatively quickly," Liebetrau explains. Companies that already have their own software should also be able to use it in the future; they will be able to access the institute's AI via an interface on the Fraunhofer IDMT server. The developers emphasize that data protection and data security are always observed and that the data is processed anonymously, regardless of whether companies access the AI via the interface or integrate the framework into their own systems.

For different user groups, AI novices as well as AI experts, the software can be customized via different user profiles. "Developers of AI algorithms, for example, are very interested in getting a feel for how AI makes its decisions and the sounds it uses to make them," says Judith Liebetrau. "So we are also moving a bit in the direction of explainable AI with the framework, to make AI more comprehensible."

The researchers will present IDMT-ISAAC at the Hannover Messe from April 12 to 16, 2021. At the virtual booth, Bescher will apply artificial intelligence models using the IDMT-ISAAC software to industrial audio data to verify its quality.

Cover photo: Fraunhofer IDMTs new IDMT-ISAAC software framework provides AI-based audio analysis tools that can be used by users without expert AI knowledge. istock.com/Byjeng, istock.com/TIMETOFOCUS



This is what happens when artificial intelligence meets emotional intelligence – The Hindu


Advances in artificial intelligence (AI) over the years have made it a foundational technology in autonomous vehicles and security systems. Now, a team of researchers at Stanford University is teaching computers to recognise not just what objects are in an image, but also how those images make people feel.

The team has trained an algorithm to recognise emotional intent behind great works of art like Vincent van Gogh's Starry Night and James Whistler's Whistler's Mother.

The ability will be key to making AI not just more intelligent, but more human, a researcher said in the study, titled "ArtEmis: Affective Language for Visual Art."


The team built a database of 81,000 WikiArt paintings and over 4 lakh (400,000) written responses from 6,500 humans indicating how they felt about a painting, including their reasons for choosing a particular emotion. The team used the responses to train the AI to generate emotional responses to visual art and justify those emotions in language.

The algorithm classifies an artist's work into one of eight emotional categories, including awe, amusement, sadness and fear, and then explains in written text what it is in the image that justifies the emotion.
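As a toy illustration of the two-stage idea described here (classify an artwork into an emotion category, then justify the choice in text), the sketch below uses a lookup table of content tags as a stand-in for a real vision-and-language model. Every name and rule in it is hypothetical and is not part of ArtEmis.

```python
# Stand-in for a vision model: detected content tags mapped to an emotion
# and a justification fragment. A real system learns this from data.
TAG_RULES = {
    "swirling night sky": ("awe", "the vast, swirling sky dwarfs the town below"),
    "seated elderly woman": ("sadness", "the muted palette and still pose feel somber"),
    "juggling clown": ("amusement", "the exaggerated gestures look playful"),
}

def predict_emotion(tags):
    """Return (emotion, explanation) for the first recognized content tag."""
    for tag in tags:
        if tag in TAG_RULES:
            emotion, reason = TAG_RULES[tag]
            return emotion, f"This evokes {emotion} because {reason}."
    return "contentment", "No strong cue found; defaulting to a neutral reading."

emotion, explanation = predict_emotion(["swirling night sky", "cypress tree"])
print(emotion)  # awe
print(explanation)
```

The point of the two-stage structure is that the justification is grounded in the same image content that drove the classification, rather than generated independently.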


The model is said to interpret any form of art, including still life, portraits and abstraction. It also takes into account the subjectivity of art, meaning that not everyone feels the same way about a piece of work, the team noted.

The tool can be used by artists, especially graphic designers, to evaluate if their work is having the desired impact.


Google Maps using artificial intelligence to help point people in the right direction – ZDNet

Boasting that it is on track to bring over 100 "AI-powered" improvements to Google Maps, Google has announced a series of updates that have been or are set to be released in the coming year.

The first is the addition of Live View, a feature that uses augmented reality cues -- arrows and accompanying directions -- to help point people in the right direction and avoid the "awkward moment when you're walking the opposite direction of where you want to go".

According to Google Maps product VP Dane Glasgow, Live View relies on AI technology, known as global localisation, to scan "tens of billions" of Street View images to help understand a person's orientation, as well as the precise altitude and placement of an object inside a building, such as an airport, transit station, or shopping centre, before providing directions.

"If you're catching a plane or train, Live View can help you find the nearest elevator and escalators, your gate, platform, baggage claim, check-in counters, ticket office, restrooms, ATMs and more. And if you need to pick something up from the mall, use Live View to see what floor a store is on and how to get there so you can get in and out in a snap," Glasgow explained in a post.

For now, the indoor Live View feature is available on Android and iOS in a number of shopping centres in the US across Chicago, Long Island, Los Angeles, Newark, San Francisco, San Jose, and Seattle, with plans to expand it to a select number of airports, shopping centres, and transit stations in Tokyo and Zurich. More cities will also be added, Glasgow confirmed.


Glasgow added that commuters will be able to view current and forecast temperature and weather conditions, as well as the air quality in an area, through Google Maps, made possible through data shared by Google partners such as The Weather Company, AirNow.gov, and the Central Pollution Control Board. Available on Android and iOS, the weather layer will roll out globally, while the air quality layer will launch in Australia, the US, and India, with plans to expand to other countries.

On the environment, Glasgow also noted that Google is building a new routing model using insights from the US Department of Energy's National Renewable Energy Lab to help deliver more eco-friendly route options, based on factors like road incline and traffic congestion, for commuters in the US on Android and iOS. The model will be available later this year, with plans for global expansion at an unspecified later date.

Glasgow said the move is part of the company's commitment to reduce its environmental footprint.

"Soon, Google Maps will default to the route with the lowest carbon footprint when it has approximately the same ETA as the fastest route. In cases where the eco-friendly route could significantly increase your ETA, we'll let you compare the relative CO2 impact between routes so you can choose," he said.

In further efforts to meet its sustainability commitment, the tech giant also plans to introduce in "coming months" an updated version of Maps that gives commuters a view of all routes and transportation modes available to their destination without toggling between tabs, while automatically prioritising a user's preferred transport mode or the modes that are popular in their city.

"For example, if you bike a lot, we'll automatically show you more biking routes. And if you live in a city like New York, London, Tokyo, or Buenos Aires where taking the subway is popular, we'll rank that mode higher," Glasgow said.

Also, within Maps, Google said it is teaming up with US supermarket Fred Meyer to pilot a contactless grocery pickup feature in select stores in Portland, Oregon. The feature will notify customers what time to leave to pick up their groceries, share their arrival time with the store, and let them "check in" on the Google Maps app so their orders can be brought out to their car on arrival.


Here's why UF is going to use artificial intelligence across its entire curriculum | Column – Tampa Bay Times

Henry Ford did not invent the automobile. That was Karl Benz.

But Ford did perfect the assembly line for auto production. That innovation directly led to cars becoming markedly cheaper, putting them within reach of millions of Americans.

In effect, Ford democratized the automobile, and I see a direct analogy to what the University of Florida is doing for artificial intelligence (AI for short).

In July, the University of Florida announced a $100 million public-private partnership with NVIDIA, the maker of graphics processing units used in computers, that will catapult UF's research strength to address some of the world's most formidable challenges, create unprecedented access to AI training and tools for under-represented communities and build momentum for transforming the future of the workforce.

At the heart of this effort is HiPerGator AI, the most powerful AI supercomputer in higher education. The supercomputer, as well as related tools, training and other resources, is made possible by a donation from UF alumnus Chris Malachowsky as well as from NVIDIA, the Silicon Valley-based technology company he co-founded and a world leader in AI and accelerated computing. State support also plays a critical role, particularly as UF looks to add 100 AI-focused faculty members to the 500 new faculty recently added across the university, many of whom will weave AI into their teaching and research.

UF will likely be the nation's first comprehensive research institution to integrate AI across the curriculum and make it a ubiquitous part of its academic enterprise. It will offer certificates and degree programs in AI and data science, with curriculum modules for specific technical and industry-focused domains. The result? Thousands of students per year will graduate with AI skills, growing the AI-trained workforce in Florida and serving as a national model for institutions across the country. Ultimately, UF's effort will help address the important national problem of how to train the nation's 21st-century workforce at scale.

Further, due to the unparalleled capabilities of our new machine, researchers will now have the tools to solve applied problems previously out of reach. Already, researchers are eyeing how to identify at-risk students even if they are learning remotely, how to bend the medical cost curve to a sustainable level, and how to solve the problems facing Floridas coastal communities and fresh water supply.

Additionally, UF recently announced it would make its supercomputer available to the entire State University System for educational and research purposes, further bolstering research and workforce training opportunities and positioning Florida to be a national leader in a field revolutionizing the way we all work and live. Soon, we plan to offer access to the machine even more broadly, boosting the national competitiveness of the United States by partnering with educational institutions and private industry around the country.

Innovation, access, economic impact, world-changing technological advancement: UF's AI initiative provides all these things and more.

If Henry Ford were alive today, I believe he would recognize the importance of what's happening at UF. And while he did not graduate from college, I believe he would be proud to see it happening at an American public university.

Joe Glover is provost and senior vice president of academic affairs at the University of Florida.
