A Cornell University committee has released recommendations for how faculty can take generative artificial intelligence into account when considering learning objectives for their students.
A machine-learning algorithm demonstrated the capability to process data that exceeds a computer’s available memory by identifying a massive data set’s key features and dividing them into manageable batches that don’t choke computer hardware. Developed at Los Alamos National Laboratory, the algorithm set a world record for factorizing huge data sets during a test run on Oak Ridge National Laboratory’s Summit, the world’s fifth-fastest supercomputer.
Equally efficient on laptops and supercomputers, the highly scalable algorithm removes hardware bottlenecks that prevent the processing of information from data-rich applications in cancer research, satellite imagery, social media networks, national security science and earthquake research, to name just a few.
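The release does not detail the algorithm itself, but the core out-of-memory idea it describes, streaming a huge data set through in manageable batches while keeping only a small summary in memory, can be sketched as follows. This is a minimal NumPy illustration of batched factorization, not the Laboratory's actual record-setting code; `get_batch` and all sizes are hypothetical:

```python
import numpy as np

def out_of_core_topk_features(get_batch, n_batches, n_features, k):
    """Accumulate the Gram matrix A.T @ A one row-batch at a time,
    then extract the top-k feature directions from it.
    Only a single batch of rows is ever held in memory."""
    gram = np.zeros((n_features, n_features))
    for i in range(n_batches):
        batch = get_batch(i)       # shape: (rows_in_batch, n_features)
        gram += batch.T @ batch    # small (n_features x n_features) update
    # Eigen-decomposition of the small Gram matrix yields the key features
    eigvals, eigvecs = np.linalg.eigh(gram)
    order = np.argsort(eigvals)[::-1][:k]
    return eigvecs[:, order]       # top-k feature directions, one per column
```

Because only the small `n_features × n_features` summary is retained, the full data set never has to fit in memory, which is why the same batching strategy can run on a laptop or be distributed across a supercomputer's nodes.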
As the use of artificial intelligence continues to rapidly evolve, Cedars-Sinai is tapping its own experts to create and deploy AI-driven solutions to real-time healthcare challenges.
Cornell has received a $1.7 million grant from the National Institutes of Health (NIH) to develop a program that combines precision nutrition with advanced data science and analytical methods, equipping students to address complex health challenges like nutrition disparities and diet-related chronic diseases.
A University of Arkansas at Little Rock professor and students are part of a university startup that has received a $2.2 million grant to develop advanced artificial-intelligence automation and rapid-recovery hardware to protect industrial control systems from cybersecurity attacks.
Using artificial intelligence applications to help craft a message to a friend is not a good idea – at least if your friend finds out about the use of AI, a new study suggests. Researchers found that people in the study perceived that a fictional friend who used AI assistance to write them a message didn’t put forth as much effort as a friend who wrote a message themselves.
Treatment planning for lung cancer can often be complex due to variations in assessing immune biomarkers. In a new study, Yale Cancer Center researchers at Yale School of Medicine used artificial intelligence (AI) tools and digital pathology to improve the accuracy of this process.
Climate models today face the challenge of providing high-resolution predictions - with quantified uncertainties - to a growing number of adaptation planners, from local decision-makers to the private sector, who need detailed assessments of the climate risks they may face locally.
Russian neurobiologists have created computer software that can automatically analyze and classify the shape of dendritic spines. The program is based on machine learning techniques.
Employees’ concerns about the use of artificial intelligence and monitoring technologies in the workplace may be negatively related to their psychological well-being and lead them to feel less valued, according to a survey from the American Psychological Association.
A new Columbia Nursing study analyzes the performance of ADscreen, a computerized speech-processing algorithm being developed to support clinicians in the early detection of Alzheimer's disease and related dementias and in monitoring their progression.
Artificial intelligence robots designed to use algorithms for source-search tasks, such as locating the origin of a fire during search and rescue operations, are often unable to complete their task when they encounter a disturbance.
The research team led by Dr. Se-Jong Kim and Dr. Juwon Na of the Materials Data Management Center in the Materials Digital Platform Division together with the research team led by Professor Seungchul Lee of POSTECH has developed a technology that can automatically identify and quantify materials microstructure from microscopic images through human-in-the-loop machine learning.
A team of scientists from Ames National Laboratory developed a new machine learning model for discovering critical-element-free permanent magnet materials based on the predicted Curie temperature of new material combinations.
University of Illinois researchers and a Swiss pharmaceutical company have developed a machine learning model that eliminates the need for extensive experimentation to determine the best conditions for an important carbon-nitrogen bond-forming reaction known as the Buchwald-Hartwig reaction.
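The article does not specify the model, but the general pattern it describes, fitting a regressor to past reaction outcomes and then ranking candidate conditions by predicted yield instead of testing each one at the bench, can be sketched like this. This is an illustrative toy in NumPy, not the researchers' actual method; the ligand, base, and solvent names and the ridge model are assumptions:

```python
import numpy as np

# Hypothetical condition menu for a cross-coupling reaction; names are
# illustrative only, not the study's actual reagent set.
ligands  = ["XPhos", "SPhos", "BINAP"]
bases    = ["KOtBu", "Cs2CO3"]
solvents = ["toluene", "dioxane"]

def encode(ligand, base, solvent):
    """One-hot encode a (ligand, base, solvent) reaction condition."""
    x = np.zeros(len(ligands) + len(bases) + len(solvents))
    x[ligands.index(ligand)] = 1
    x[len(ligands) + bases.index(base)] = 1
    x[len(ligands) + len(bases) + solvents.index(solvent)] = 1
    return x

def fit_ridge(X, y, lam=1e-3):
    """Closed-form ridge regression: w = (X^T X + lam*I)^-1 X^T y."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

def best_condition(w):
    """Score every candidate combination and return the predicted best."""
    combos = [(l, b, s) for l in ligands for b in bases for s in solvents]
    scores = [encode(*c) @ w for c in combos]
    return combos[int(np.argmax(scores))]
```

Once the model is trained on measured yields, ranking candidates is a cheap prediction step, which is the sense in which such a model can replace rounds of trial-and-error experimentation.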
Avalanche photodiodes (APDs) have drawn growing interest in recent years and are used extensively in many applications, most notably optical communication. Optical quantum information applications such as quantum key distribution also drive that trend, placing severe demands on detector performance. To that end, the authors trace the evolution and recent development of III-V and II-VI APDs, along with potential alternatives: “third wave” superlattice and 2D-material APDs.
WASHINGTON, D.C. - Today, the U.S. Department of Energy (DOE) announced $29 million in funding for seven team awards for research in machine learning, artificial intelligence, and data resources for fusion energy sciences.
As the number of elements on phased array antennas continues to grow, so does the volume of data that must be processed. To address this, researchers have developed a new approach to process that data closer to where it is generated - on the antenna subarrays themselves.
Like climbing a mountain via the shortest possible path, improving classification tasks can be achieved by choosing the most influential path to the output, and not just by learning with deeper networks.
A licensing agreement between the Department of Energy’s Oak Ridge National Laboratory and research partner ZEISS will enable industrial X-ray computed tomography, or CT, to perform rapid evaluations of 3D-printed components using ORNL’s machine learning algorithm, Simurgh.
In Journal of Applied Physics, Markus Buehler combines attention neural networks with graph neural networks to better understand and design proteins. The approach couples the strengths of geometric deep learning with those of language models to predict existing protein properties and envision new proteins that nature has not yet devised. Buehler’s model turns numbers, descriptions, tasks, and other elements into symbols for his neural networks to use.
Artificial intelligence could help determine the verdicts of future court cases involving musical copyright, according to West Virginia University College of Law researchers.
National news coverage from the two largest broadcast outlets, CNN and Fox News, not only reflects growing political polarization in America; in a recent publication, researchers at Virginia Tech showed that partisan and inflammatory broadcast coverage has increased over time and can exacerbate growing divides in the new public square of social media.
Someone wearing augmented reality (AR) or “smart” glasses could be Googling your face, turning you into a cat or recording your conversation – and that creates a major power imbalance, said Cornell researchers.
Today, the U.S. Department of Energy (DOE) announced $16 million in funding for four projects in scientific machine learning for the predictive modeling and simulation of complex systems.
Researchers from Lehigh University, the University of Hong Kong, and Wuhan University published a new Journal of Marketing article that examines in-feed advertising's performance on subscription-based versus AI-recommended news feeds.
Given the expected surge in worldwide demand for staple crops by 2050 due to population growth, higher individual incomes, and increased biofuel usage, the adoption of sustainable agricultural practices is crucial to meet this demand.
CureMD, a leading provider of comprehensive technology solutions for community oncology, is proud to announce its partnership with Tempus, a leader in artificial intelligence and precision medicine, to integrate Tempus' advanced genomic testing capabilities into CureMD's cutting-edge Electronic Health Record (EHR) system.
Researchers are in an arms race with hackers to prevent data theft. Their standard tools include strategies like multi-factor authentication systems, fingerprint technology and retinal scans. One type of security system that is gaining popularity is automatic speaker identification, which uses a person’s voice as a passcode.
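At its core, such a system enrolls a reference "voiceprint" embedding for each user and accepts a login attempt only if the embedding of a new utterance is close enough to the enrolled one. A minimal sketch of that matching step follows; the vectors and the 0.8 threshold are hypothetical, and a real system would derive the embeddings from a trained speaker model rather than use raw numbers:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_speaker(enrolled, attempt, threshold=0.8):
    """Accept the attempt only if its voiceprint embedding is
    close enough to the enrolled reference embedding."""
    return cosine_similarity(enrolled, attempt) >= threshold
```

A real deployment would also have to guard against replayed or synthesized audio, which is where the arms race the passage describes plays out.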
Bias in the collection of data on which artificial intelligence (AI) programs depend can limit the usefulness of this rapidly growing tool for climate scientists predicting future scenarios and guiding global action.
Today, the U.S. Department of Energy (DOE) announced $16 million for fifteen projects that will implement artificial intelligence methods to accelerate scientific discovery in nuclear physics research.
Cleveland Clinic London is the first hospital in London to successfully perform a total knee replacement procedure with the assistance of an augmented reality-based surgical platform that was designed with artificial intelligence and machine learning.
Jason Yip, a UW associate professor in the Information School, discusses how parents and schools can adapt to new technologies in ways that support children’s learning.
Argonne National Laboratory is reimagining the lab spaces and scientific careers of the future by harnessing the power of robotics, artificial intelligence and machine learning in the quest for new knowledge.
Artificial neural networks, ubiquitous machine-learning models that can be trained to complete many tasks, are so called because their architecture is inspired by the way biological neurons process information in the human brain.
From artificial intelligence to digital concept maps, technology may be changing the classroom, but not how students learn. Meta-analytic studies on instructional technology have found that technology does not impact student learning. The single most important influence on learning is the teacher.
Artificial intelligence (AI) is capturing the public's imagination as the pace of innovation accelerates dramatically and easy-to-use AI tools open up new possibilities for transforming entire industries.
Experts from Indiana University are available to comment on a variety of topics in the worlds of politics, finance, education and disaster response making headlines the week of Aug. 14, 2023.
Conventional artificial-intelligence vision technology uses separate sensing, computing, and storage units to process vision data. The frequent movement of redundant data between sensors, processors, and memory results in high power consumption and latency. Scientists in China designed a novel device, in which photoexcited carriers and ion migration are coupled, that can store and read a tunable short-circuit photocurrent in a non-volatile mode. This new device concept enables all-in-one sensing-memory-computing approaches for neuromorphic vision hardware.