International diplomacy has traditionally relied on bargaining power, covert channels of communication, and personal chemistry between leaders. But a new era is upon us in which the dispassionate insights of AI algorithms and mathematical techniques such as game theory will play a growing role in deals struck between nations, according to the co-founder of the world’s first center for science in diplomacy.
Michael Ambühl, a professor of negotiation and conflict management and former chief Swiss-EU negotiator, said recent advances in AI and machine learning mean that these technologies now have a meaningful part to play in international diplomacy, including at the Cop26 summit starting later this month and in post-Brexit deals on trade and immigration.
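As a rough illustration of the kind of game-theoretic analysis the article alludes to (not anything from Ambühl's own work), the sketch below frames a stylized two-country negotiation as a 2x2 game and scans it for pure-strategy Nash equilibria. All strategy labels and payoff numbers are hypothetical placeholders.

```python
# Illustrative only: a stylized two-country negotiation as a 2x2 game,
# scanned for pure-strategy Nash equilibria. Payoffs are hypothetical.
import itertools

strategies = ["concede", "hold_firm"]

# payoffs[(choice_A, choice_B)] = (payoff to country A, payoff to country B)
payoffs = {
    ("concede", "concede"): (3, 3),
    ("concede", "hold_firm"): (1, 4),
    ("hold_firm", "concede"): (4, 1),
    ("hold_firm", "hold_firm"): (0, 0),
}

def is_nash(a, b):
    """True if neither country can gain by unilaterally switching strategy."""
    pa, pb = payoffs[(a, b)]
    best_a = all(payoffs[(alt, b)][0] <= pa for alt in strategies)
    best_b = all(payoffs[(a, alt)][1] <= pb for alt in strategies)
    return best_a and best_b

equilibria = [(a, b) for a, b in itertools.product(strategies, repeat=2) if is_nash(a, b)]
print("Pure-strategy Nash equilibria:", equilibria)
```

Under these made-up payoffs the game is a classic "chicken"-style standoff: the two equilibria are the asymmetric outcomes in which exactly one side concedes.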
The math is pretty basic. How many satellites are going to go up over the next decade? How many solar panels will they need? And how many are being manufactured that fit the bill? Turns out the answers are: a lot, a hell of a lot, and not nearly enough. That’s where Regher Solar aims to make its mark, by bringing the cost of space-quality solar panels down by 90% while making an order of magnitude more of them. It’s not exactly a modest goal, but fortunately the science and market seem to be in favor, giving the company something of a tailwind. The question is finding the right balance between cost and performance while remaining relatively easy to manufacture. Of course, if there was an easy answer there, someone would already be doing that.
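The "basic math" here is a back-of-envelope demand estimate. The sketch below shows the shape of that calculation; every number in it is a hypothetical placeholder, not a figure from the article.

```python
# Back-of-envelope sketch of the demand math the article alludes to.
# Every number below is a hypothetical placeholder.
satellites_per_year = 2_000            # assumed annual launches over the coming decade
panel_area_per_satellite_m2 = 10       # assumed solar array area per satellite
current_space_panel_supply_m2 = 5_000  # assumed annual space-grade panel production

demand_m2 = satellites_per_year * panel_area_per_satellite_m2
shortfall = demand_m2 - current_space_panel_supply_m2
print(f"Assumed annual demand:  {demand_m2:,} m^2")
print(f"Assumed annual supply:  {current_space_panel_supply_m2:,} m^2")
print(f"Shortfall under these assumptions: {shortfall:,} m^2")

# The claimed 90% cost reduction, applied to an assumed baseline price:
baseline_cost_per_watt = 100.0         # hypothetical $/W for space-qualified cells
target_cost_per_watt = baseline_cost_per_watt * 0.1
print(f"90% cost reduction: ${baseline_cost_per_watt}/W -> ${target_cost_per_watt}/W")
```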
Microsoft’s blog post on Megatron-Turing says the algorithm is skilled at tasks like completion prediction, reading comprehension, commonsense reasoning, natural language inferences, and word sense disambiguation. But stay tuned—there will likely be more skills added to that list once the model starts being widely utilized.
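Megatron-Turing itself is not something readers can call directly, but the general idea of "completion prediction" can be sketched with a small, openly available model through the Hugging Face transformers pipeline. This is purely illustrative and does not use or represent the Megatron-Turing model.

```python
# Illustrative only: shows completion prediction with GPT-2 via the Hugging Face
# transformers pipeline, not with Megatron-Turing NLG.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "Large language models are skilled at tasks like"
completions = generator(prompt, max_new_tokens=20, num_return_sequences=2, do_sample=True)

for c in completions:
    print(c["generated_text"])
```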
GPT-3 turned out to have capabilities beyond what its creators anticipated, like writing code, doing math, translating between languages, and autocompleting images (oh, and writing a short film with a twist ending). This led some to speculate that GPT-3 might be the gateway to artificial general intelligence. But the algorithm’s variety of talents, while unexpected, still fell within the language domain (including programming languages), so that’s a bit of a stretch.
Despite the continued progress that the state of the art in machine learning and artificial intelligence (AI) has been able to achieve, one thing that still sets the human brain apart — and those of some other animals — is its ability to connect the dots and infer information that supports problem-solving in situations that are inherently uncertain. It does this remarkably well despite sparse, incomplete, and almost always less than perfect data. In contrast, machines have a very difficult time inferring new insights and generalizing beyond what they have been explicitly trained on or exposed to.
How the brain evolved to achieve these abilities, and what the underlying ‘algorithms’ that enable them are, remain poorly understood. The mathematical models whose development and investigation could lead to a deep understanding of what the brain is doing, and how, are not yet mature, and they remain a very active area of research.
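As a toy illustration of what inference under uncertainty from sparse data can look like (not a model of the brain, nor of any algorithm from the research described above), the sketch below updates a belief with Bayes' rule from a handful of noisy observations.

```python
# Toy Bayesian update: is a coin biased (p_heads = 0.8) or fair (p_heads = 0.5)?
# Not a brain model; just a minimal example of probabilistic inference.
prior = {"biased": 0.5, "fair": 0.5}
likelihood_heads = {"biased": 0.8, "fair": 0.5}

observations = ["H", "H", "T", "H"]  # sparse, noisy data

posterior = dict(prior)
for obs in observations:
    for h in posterior:
        p = likelihood_heads[h] if obs == "H" else 1 - likelihood_heads[h]
        posterior[h] *= p
    total = sum(posterior.values())
    posterior = {h: v / total for h, v in posterior.items()}  # normalize

print(posterior)  # roughly {'biased': 0.62, 'fair': 0.38} after these four flips
```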
Coastal artisanal fisheries, particularly those in developing countries, are facing a global crisis of overexploitation1. Artificial reefs (ARs), or human-made reefs2, have been widely advocated by governmental and non-governmental conservation and management organizations as a way to address these issues. Industries, particularly oil and gas, seeking to avoid the costs of removal or conventional disposal of used materials are often major advocates for deploying ARs. Yet major questions remain regarding the success of such efforts in the context of weak governance and poorly sustained international investment in AR development projects. There is frequently confusion over whether or not ARs should be fishing sites, and the precise goals of constructing such ARs are often unclear, making it difficult to evaluate their success3. Over the last 40 years, both failed and successful AR implementation programs have been reported4,5. The main point of the present work is to underline the importance of governance and to address the influence of social and management factors on AR “success”.
To improve fishery yields, it has been recommended that ARs be designated no-take areas (e.g.,2). Yet most ARs were historically delineated as sites for fishing4, and they have rarely been implemented at large scale as no-take zones, even in countries with centuries of experience in constructing ARs, such as Japan. In Japan, fishery authorities and local fishers use ARs to promote sustainable catches and to establish nursery grounds for target species6. Importantly, fishery authorities and local fishery cooperatives in Japan have extensive management authority over ARs. For example, fishing around ARs is usually limited to hook-and-line techniques, with net fishing rarely being permitted in areas where the risk of entanglement in ARs is high. Furthermore, fishing gear and fishing seasons are often restricted around ARs in Japan during spawning. These practices are recognized for their effectiveness in maintaining good fishing performance and marine conservation in Japan and elsewhere where they have been implemented7.
New research by a City College of New York team has uncovered a novel way to combine two different states of matter. For one of the first times, topological photons—light—have been combined with lattice vibrations, also known as phonons, to manipulate their propagation in a robust and controllable way.
The study utilized topological photonics, an emergent direction in photonics that leverages fundamental ideas from the mathematical field of topology about conserved quantities—topological invariants—that remain constant when a geometric object is continuously deformed. One of the simplest examples of such an invariant is the number of holes, which, for instance, makes a donut and a mug equivalent from the topological point of view. These topological properties endow photons with helicity, whereby photons spin as they propagate, leading to unique and unexpected characteristics, such as robustness to defects and unidirectional propagation along interfaces between topologically distinct materials. Thanks to interactions with vibrations in crystals, these helical photons can then be used to channel infrared light along with vibrations.
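The donut-and-mug example corresponds to a standard fact from topology (not a result of this paper): for a closed orientable surface, the number of holes, the genus g, fixes the Euler characteristic, which is a topological invariant.

```latex
% Standard topology fact: genus g determines the Euler characteristic chi.
\[
  \chi = 2 - 2g
  \qquad\Longrightarrow\qquad
  \chi_{\text{sphere}} = 2 \;(g=0), \quad
  \chi_{\text{torus}} = \chi_{\text{mug surface}} = 0 \;(g=1).
\]
```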
The implications of this work are broad, in particular allowing researchers to advance Raman spectroscopy, which is used to determine the vibrational modes of molecules. The research also holds promise for vibrational spectroscopy—also known as infrared spectroscopy—which measures the interaction of infrared radiation with matter through absorption, emission, or reflection. This can then be utilized to study, identify, and characterize chemical substances.
In a new study from Skoltech and the University of Kentucky, researchers found a new connection between quantum information and quantum field theory. This work attests to the growing role of quantum information theory across various areas of physics. The paper was published in the journal Physical Review Letters.
Quantum information plays an increasingly important role as an organizing principle connecting various branches of physics. In particular, the theory of quantum error correction, which describes how to protect and recover information in quantum computers and other complex interacting systems, has become one of the building blocks of the modern understanding of quantum gravity.
“Normally, information stored in physical systems is localized. Say, a computer file occupies a particular small area of the hard drive. By “error” we mean any unforeseen or undesired interaction which scrambles information over an extended area. In our example, pieces of the computer file would be scattered over different areas of the hard drive. Error correcting codes are mathematical protocols that allow collecting these pieces together to recover the original information. They are in heavy use in data storage and communication systems. Quantum error correcting codes play a similar role in cases when the quantum nature of the physical system is important,” Anatoly Dymarsky, Associate Professor at the Skoltech Center for Energy Science and Technology (CEST), explains.
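Dymarsky's hard-drive analogy can be made concrete with the simplest classical error-correcting code, a 3-bit repetition code; this is a classical toy analogue, not one of the quantum codes the paper studies.

```python
# Classical toy analogue of an error-correcting code (not a quantum code):
# each bit is stored three times; a majority vote recovers the original
# even if one of the three copies is corrupted.
import random

def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]

def corrupt(codeword, n_flips=1):
    """Flip n_flips randomly chosen bits to simulate an 'error'."""
    noisy = list(codeword)
    for i in random.sample(range(len(noisy)), n_flips):
        noisy[i] ^= 1
    return noisy

def decode(codeword):
    # Majority vote within each group of three copies.
    return [int(sum(codeword[i:i + 3]) >= 2) for i in range(0, len(codeword), 3)]

message = [1, 0, 1, 1]
received = corrupt(encode(message), n_flips=1)
print("recovered:", decode(received), "original:", message)
```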
“Our mathematical equation lets us predict which individuals will have both more happiness and more brain activity for intrinsic compared to extrinsic rewards. The same approach can be used in principle to measure what people actually prefer without asking them explicitly, but simply by measuring their mood.”
Summary: A new mathematical equation predicts which individuals will have more happiness and increased brain activity for intrinsic rather than extrinsic rewards. The approach can be used to predict personal preferences based on mood and without asking the individual.
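The excerpt does not reproduce the study's actual equation, but the general approach can be sketched: fit weights relating momentary mood ratings to intrinsic and extrinsic reward events, and read an individual's preference off the relative size of the fitted weights. Everything below, including the data and weights, is a made-up illustration of that idea, not the researchers' model.

```python
# Hypothetical sketch only: simulate mood as a weighted sum of intrinsic and
# extrinsic reward events, then recover the weights by least squares. A larger
# fitted intrinsic weight would suggest the individual "prefers" intrinsic rewards.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

intrinsic = rng.integers(0, 2, n_trials)   # e.g. curiosity-driven reward on a trial
extrinsic = rng.integers(0, 2, n_trials)   # e.g. monetary reward on a trial
true_w = np.array([0.2, 0.7, 0.3])         # assumed: baseline, intrinsic, extrinsic weights

X = np.column_stack([np.ones(n_trials), intrinsic, extrinsic])
mood = X @ true_w + rng.normal(0, 0.1, n_trials)  # simulated momentary mood ratings

w_hat, *_ = np.linalg.lstsq(X, mood, rcond=None)
print("fitted [baseline, w_intrinsic, w_extrinsic]:", np.round(w_hat, 2))
print("prefers intrinsic rewards?", w_hat[1] > w_hat[2])
```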
Faraday and Dirac constructed magnetic monopoles using the practical and mathematical tools available to them. Now physicists have engineered effective monopoles by combining modern optics with nanotechnology. Part matter and part light, these magnetic monopoles travel at unprecedented speeds.
In classical physics (as every student should know) there are no sources or sinks of magnetic field, and hence no magnetic monopoles. Even so, a tight bundle of magnetic flux — such as that created by a long string of magnetic dipoles — has an apparent source or sink at its end. If we map the lines of force with a plotting compass, we think we see a magnetic monopole as our compass cannot enter the region of dense flux. In 1821 Michael Faraday constructed an effective monopole of this sort by floating a long thin bar magnet upright in a bowl of mercury, with the lower end tethered and the upper end free to move like a monopole in the horizontal plane.
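The "no sources or sinks of magnetic field" statement is Gauss's law for magnetism, a standard equation rather than anything new to this article:

```latex
% Gauss's law for magnetism: no magnetic monopoles in classical electromagnetism.
\[
  \nabla \cdot \mathbf{B} = 0
  \qquad\Longleftrightarrow\qquad
  \oint_{S} \mathbf{B} \cdot d\mathbf{A} = 0 \ \ \text{for any closed surface } S.
\]
```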
Reservoir computing, a machine learning algorithm that mimics the workings of the human brain, is revolutionizing how scientists tackle the most complex data processing challenges, and now, researchers have discovered a new technique that can make it up to a million times faster on specific tasks while using far fewer computing resources and less input data.
With the next-generation technique, the researchers were able to solve a complex computing problem in less than a second on a desktop computer — and such complex problems, like forecasting the evolution of dynamic systems such as weather that change over time, are exactly why reservoir computing was developed in the early 2000s.
These systems can be extremely difficult to predict, with the “butterfly effect” being a well-known example. The concept, which is closely associated with the work of mathematician and meteorologist Edward Lorenz, essentially describes how a butterfly fluttering its wings can influence the weather weeks later. Reservoir computing is well suited to learning such dynamic systems and can provide accurate projections of how they will behave in the future; however, the larger and more complex the system, the more computing resources, the larger the network of artificial neurons, and the more time are required to obtain accurate forecasts.
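The article does not include code, but a conventional echo state network, one standard form of reservoir computing, can be sketched in a few lines: a fixed random recurrent "reservoir" is driven by the input, and only a linear readout is trained, here by ridge regression on a toy one-step-ahead sine-wave prediction task. The hyperparameters are illustrative, not taken from the study.

```python
# Minimal echo state network sketch (a standard form of reservoir computing).
# Only the linear readout is trained; the reservoir weights stay fixed and random.
import numpy as np

rng = np.random.default_rng(42)
n_res, spectral_radius, ridge = 200, 0.9, 1e-6

# Fixed random input and reservoir weights, rescaled to the target spectral radius.
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state at each step."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in[:, 0] * u_t)
        states.append(x.copy())
    return np.array(states)

t = np.arange(0, 60, 0.1)
u = np.sin(t)
states = run_reservoir(u[:-1])   # reservoir states driven by the inputs
targets = u[1:]                  # next value of the series

# Ridge-regression readout (the only trained part).
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res), states.T @ targets)

pred = states @ W_out
print("one-step prediction error (RMSE):", np.sqrt(np.mean((pred - targets) ** 2)))
```

The design point worth noting is that training touches only the readout weights, which is why reservoir computing can be so much cheaper than training a full recurrent network end to end.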