Humans can understand apes’ sign language, new study finds
From pointing to animated arm movements and nodding, people regularly employ gestures to accompany and create language.
Now, new research suggests that humans can also understand the sign language apes use, meaning humans may have retained an understanding of ape communication inherited from our common ancestors.
Great apes deploy more than 80 signals to communicate everyday goals, according to a study published Tuesday in the journal PLOS Biology.
. . .
Chimpanzees and bonobos, which share more than 90% of their gestures, are humans’ closest living relatives, the study said. Their gestures have been suggested to be an important framework in the evolution of human language, according to study authors Kirsty E. Graham, a research fellow at the University of St. Andrews’ School of Psychology and Neuroscience in Scotland, and primatologist Catherine Hobaiter, a principal investigator at the university’s Wild Minds Lab.
Infants ages 1 to 2 have been found to use more than 50 gestures from the ape repertoire, researchers said. That overlap led researchers to suspect that humans may have retained an understanding of the core features of ape gestures.
Therapy by chatbot? The promise and challenges in using AI for mental health
As darkness and depression engulfed Ali, help seemed out of reach: she couldn't find an available therapist, had no car to get to one, and couldn't afford to pay for it anyway, having lost her health insurance after shutting down her bakery.
So her orthopedist suggested a mental-health app called Wysa. Its chatbot-only service is free, though it also offers teletherapy services with a human for a fee ranging from $15 to $30 a week; that fee is sometimes covered by insurance. The chatbot, which Wysa co-founder Ramakant Vempati describes as a "friendly" and "empathetic" tool, asks questions like, "How are you feeling?" or "What's bothering you?" The computer then analyzes the words and phrases in the answers to deliver supportive messages, or advice about managing chronic pain, for example, or grief — all served up from a database of responses that have been prewritten by a psychologist trained in cognitive behavioral therapy.
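Wysa hasn't published its matching logic, but the approach described above (analyzing the words and phrases in a user's answers to select from a bank of prewritten, clinician-authored responses) resembles classic keyword-driven retrieval. Below is a minimal, hypothetical Python sketch of that general idea; the keywords, responses and function are invented for illustration and are not Wysa's actual code.

```python
# Hypothetical sketch of keyword-driven response retrieval, the general
# technique described above. Not Wysa's actual code; the keywords and
# responses here are invented for illustration.

# Prewritten responses keyed by topic (in a real service, these would be
# authored by a psychologist trained in cognitive behavioral therapy).
RESPONSES = {
    "pain": "Chronic pain can be exhausting. Would a short relaxation exercise help right now?",
    "grief": "Losing someone is incredibly hard. It's okay to take this one day at a time.",
    "default": "Thank you for sharing that. Can you tell me more about how you're feeling?",
}

# Keywords that map a user's message onto a topic.
KEYWORDS = {
    "pain": {"pain", "ache", "hurts", "hurting"},
    "grief": {"grief", "loss", "died", "miss"},
}

def reply(message: str) -> str:
    """Scan the message for known keywords and return a matching prewritten response."""
    words = set(message.lower().split())
    for topic, triggers in KEYWORDS.items():
        if words & triggers:
            return RESPONSES[topic]
    return RESPONSES["default"]

print(reply("My back pain is keeping me up at night"))  # -> the "pain" response
```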
That is how Ali found herself on a new frontier of technology and mental health. Advances in artificial intelligence — such as ChatGPT — are increasingly being looked to as a way to help screen for, or support, people dealing with isolation, mild depression or anxiety. Human emotions are tracked, analyzed and responded to, using machine learning that tries to monitor a patient's mood, or mimic a human therapist's interactions with a patient. It's an area garnering lots of interest, in part because of its potential to overcome the common financial and logistical barriers to care, such as those Ali faced.
Real estate agents say they can’t imagine working without ChatGPT now
In less than two months, ChatGPT has sparked discussions around its potential to disrupt various industries, from publishing to law. But it’s already having a tangible impact on how a number of real estate agents around the country do their jobs – where much of the written work can be formulaic and time-consuming – to the extent that some can no longer imagine working without it.
“I’ve been using it for more than a month, and I can’t remember the last time something has wowed me this much,” said Andres Asion, a broker from the Miami Real Estate Group.
. . .
While ChatGPT has generated a wave of interest among realtors, incorporating artificial intelligence in the real estate market isn’t entirely new. Listing site Zillow, for example, has used AI for 3D mapping, creating automatic floor plans and for its Zestimate tool, which can scan pictures to see if a home has hardwood floors or stainless steel appliances so its price estimate better reflects market conditions. Earlier this week, Zillow rolled out an AI feature that lets potential buyers conduct searches in more natural language (something that’s long been mastered by Google).
. . .
Although it’s too early to say if the tool will become a mainstay in real estate, realtor Johannes believes AI in general will transform his industry and others.
A robot was scheduled to argue in court, then came the jail threats
A British man who planned to have a "robot lawyer" help a defendant fight a traffic ticket has dropped the effort after receiving threats of possible prosecution and jail time.
Joshua Browder, the CEO of the New York-based startup DoNotPay, created a way for people contesting traffic tickets to use arguments in court generated by artificial intelligence.
Here's how it was supposed to work: The person challenging a speeding ticket would wear smart glasses that both record court proceedings and dictate responses into the defendant's ear from a small speaker. The system relied on a few leading AI text generators, including ChatGPT and DaVinci.
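DoNotPay hasn't released the system's code, but the text-generation step can be sketched against OpenAI's public completions API, which exposes the DaVinci models mentioned above. The prompt, parameters and helper function below are assumptions for illustration, not the startup's actual implementation.

```python
# Hypothetical sketch of the text-generation step: feed a line from the court
# proceedings to a completion model and get back a suggested reply. This is an
# illustration, not DoNotPay's actual system.
import openai  # the pre-1.0 openai SDK interface (pip install "openai<1")

openai.api_key = "YOUR_API_KEY"  # placeholder

def suggest_reply(heard_in_court: str) -> str:
    """Ask the model for one brief response the defendant could say next."""
    completion = openai.Completion.create(
        model="text-davinci-003",  # one of the "DaVinci" generation of models
        prompt=(
            "You are helping a defendant contest a speeding ticket in traffic court.\n"
            f'The judge just said: "{heard_in_court}"\n'
            "Suggest one brief, polite sentence the defendant could say in response:"
        ),
        max_tokens=80,
        temperature=0.2,  # keep suggestions conservative
    )
    return completion.choices[0].text.strip()

print(suggest_reply("How do you plead?"))
```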
The first-ever AI-powered legal defense was set to take place in California on Feb. 22, but not anymore.
How ChatGPT became the next big thing
ChatGPT has captured the public imagination in a way the tech world hasn't seen since the debut of the iPhone in 2007.
Why it matters: Most of us are only now getting a glimpse of just how smart artificial intelligence has become. It's awe-inducing — and terrifying.
- When ChatGPT launched to the public, it proved to be much more advanced than even many in the tech industry had expected.
What it is: ChatGPT is a free (for now) site that lets users pose questions and give directions to a bot that can answer with conversation, term papers, sonnets, recipes — almost anything. In almost any style you specify.
The big picture: The possibilities for ChatGPT seem endless. It recently passed all three parts of the U.S. Medical Licensing Examination, although just barely, as part of a research experiment.
Impressive as ChatGPT is, its current version has some severe limitations, as even its creators acknowledge.
The AI tool can put together answers to a lot of questions, but it doesn't actually "know" anything — which means it has no yardstick for assessing accuracy, and it stumbles over matters of common sense as well as paradoxes and ambiguities.
- OpenAI notes that ChatGPT "sometimes writes plausible-sounding but incorrect or nonsensical answers ... is often excessively verbose ... [and] will sometimes respond to harmful instructions or exhibit biased behavior."
Details: ChatGPT can't distinguish fact from fiction. For sure, humans have trouble with this too — but they understand what those categories are.
Microsoft investing billions in ChatGPT maker
Microsoft is investing billions of dollars into OpenAI, the company behind the popular ChatGPT language generation tool, as part of a third phase of a partnership between the two tech companies, Microsoft announced Monday.
Microsoft did not detail the exact amount it is investing in OpenAI with the latest phase, describing it as a “multiyear, multibillion dollar investment.” Semafor previously reported the company was in talks to invest $10 billion into the artificial intelligence company.
The investment adds to the ones Microsoft made in OpenAI in 2019 and 2021 and extends the partnership between the companies as ChatGPT becomes increasingly popular.
The free tool, which launched in November, automatically generates detailed text results based on queries in a way that is more advanced than previous technology. While the tool has quickly become an internet sensation, it has also raised questions over how it is used. For example, schools are grappling with the challenges it introduces in the classroom, including cheating on assignments, and some schools are banning it.
Scientists Created a Real-Life T-1000 Terminator That Melts on Command
How is real life like the plot of the movie Terminator 2? Let me count the ways.
A scarily powerful artificial intelligence based on a neural network? Check.
Humanity as close as it's ever been to an apocalypse? Check.
And a shapeshifting object made out of a liquid metal? Until now, it would’ve been a miss. But researchers in the U.S. and China have created an object made from magnetic microparticles embedded in gallium that, when heated by an alternating magnetic field, melts into a liquid that can pass through the bars of a cage—just like the T-1000 did in the movie.
. . .
The study containing this absolutely bone-chilling display was published on Jan. 26 in the journal Matter. And yes, the Terminator 2 reference in the video was intentional.
“One of the original drafts of the paper made reference to The Terminator and the T-1000, and I actually had it taken out just because of copyright reasons and all that,” said study co-author Carmel Majidi, a mechanical engineer at Carnegie Mellon University.
Earth’s inner core may have stopped turning and could go into reverse, study suggests
Earth’s inner core has recently stopped spinning relative to the planet’s surface, and may now be reversing the direction of its rotation, according to a surprising new study that probed the deepest reaches of our planet with seismic waves from earthquakes.
The mind-boggling results suggest that Earth’s center pauses and reverses direction on a periodic cycle lasting about 60 to 70 years, a discovery that might solve longstanding mysteries about climate and geological phenomena that occur on a similar timeframe, and that affect life on our planet.
. . .
Located some 3,000 miles beneath our feet, the core experiences intense heat on par with the surface of the Sun. Because it is so remote and difficult to study, the inner core remains one of the least understood environments on our planet, though it’s clear that it plays a role in many processes that make our world habitable to life, such as the generation of Earth’s protective magnetic field, which blocks harmful radiation from reaching the surface.
Now, Yi Yang and Xiaodong Song, a pair of researchers at Peking University’s SinoProbe Lab at the School of Earth and Space Sciences, have captured “surprising observations that indicate the inner core has nearly ceased its rotation in the recent decade and may be experiencing a turning-back in a multidecadal oscillation, with another turning point in the early 1970s,” according to a study published on Monday in Nature Geoscience.
1.2-Million-Year-Old Obsidian Axe Factory Found In Ethiopia
Reporting on the latest findings from the Melka Kunture archaeological site in Ethiopia, a team of researchers has described the discovery of an obsidian handaxe workshop within a layer of sediment dated to 1.2 million years ago. This represents a staggeringly early example of obsidian shaping, and, according to the study authors, is the only handaxe factory ever dated to the Early Pleistocene.
. . .
“Generally speaking, obsidian is extensively used only from the Middle Stone Age onwards,” write the study authors.
However, during their excavations, the team came across an ancient layer of sediment containing a cache of 578 stone tools, all but three of which were sculpted from obsidian. “We show through statistical analysis that this was a focused activity, that very standardized handaxes were produced and that this was a stone-tool workshop,” they write.
Describing the axes, the researchers repeatedly marvel that “the morphological standardization is remarkable,” and while they don’t know which species of human crafted the tools, they say that whoever created them diligently applied “secondary retouches” and was highly “focused on the final regularization of the artifacts.”
Antidepressants can cause ‘emotional blunting’, study shows
Widely used antidepressants cause “emotional blunting”, according to research that offers new insights into how the drugs may work and their possible side-effects.
The study found that healthy volunteers became less responsive to positive and negative feedback after taking a selective serotonin reuptake inhibitor (SSRI) drug for three weeks. The “blunting” of negative emotions could be part of how the drugs help people recover from depression, but could also explain a common side-effect.
. . .
Some people on the medication report feeling emotionally dull or no longer finding things as pleasurable, with one study suggesting this applied to 40-60% of people taking the drug. However, it has been unclear whether this symptom is a drug side-effect or a symptom of depression.
The latest work suggests that the drug alone can produce emotional blunting. In the study, published in the journal Neuropsychopharmacology, 66 volunteers were given either the SSRI escitalopram or a placebo for at least 21 days before completing a set of cognitive tests.