Archive for the ‘information science’ category: Page 183

Jul 10, 2013

Quantum Entanglement in Future Communication Technologies

Posted by in categories: engineering, futurism, information science, particle physics, space

The arXiv blog on MIT Technology Review recently reported a breakthrough, ‘Physicists Discover the Secret of Quantum Remote Control’ [1], which led some to ask whether this could be used as an FTL communication channel. To appreciate the significance of the paper on Quantum Teleportation of Dynamics [2], one should note that experiments have already bounded the apparent speed of influence between members of a quantum entangled pair at *at least* 10,000 times the speed of light [3]. The next big communications breakthrough?

Quantum Entanglement Visual

In what could turn out to be a major breakthrough for long-distance communications in space exploration, several problems would be resolved at once: if a civilization were eventually established on a star system many light years away, for example on one of the recently discovered Goldilocks Zone super-Earths in the Gliese 667C star system, then communications back to people on Earth may after all be… instantaneous.

The implications do not stop there, either. As recently reported in The Register [5], researchers at the Hebrew University of Jerusalem have established that quantum entanglement can be used to send data across both TIME AND SPACE [6]. Their recent paper, ‘Entanglement Between Photons that have Never Coexisted’ [7], describes how photon-to-photon entanglement can be used to connect with photons in their past or future, opening up an understanding of how one might engineer technology to communicate instantaneously not just across space, but across space-time.

Continue reading “Quantum Entanglement in Future Communication Technologies” »

Jun 5, 2013

The Impending Crisis of Data: Do We Need a Constitution of Information?

Posted by in categories: cybercrime/malcode, information science, media & arts

The recent scandal involving the surveillance of the Associated Press and Fox News by the United States Justice Department has focused attention on the erosion of privacy and freedom of speech in recent years. But before we simply attribute these events to the ethical failings of Attorney General Eric Holder and his staff, we also should consider the technological revolution powering this incident, and thousands like it. It would appear that bureaucrats simply are seduced by the ease with which information can be gathered and manipulated. At the rate that technologies for the collection and fabrication of information are evolving, what is now available to law enforcement and intelligence agencies in the United States, and around the world, will soon be available to individuals and small groups.

We must come to terms with the current information revolution and take the first steps to form global institutions that will assure that our society, and our governments, can continue to function through this chaotic and disconcerting period. The exponential increase in the power of computers will bring changes that go far beyond the limits of slow-moving human government. We will need to build new institutions in response to this crisis that are substantial and long-term. It will not be a matter that can be solved by adding a new division to Homeland Security or Google.

We do not have any choice. To make light of the crisis means allowing shadowy organizations to usurp for themselves immense power through the collection and distortion of information. Failure to keep up with technological change in an institutional sense will mean that in the future government will be at best a symbolic façade with little real authority or capacity to respond to the threats of information manipulation. In the worst-case scenario, corporations and government agencies could degenerate into warring factions, a new form of feudalism in which invisible forces use their control of information to wage murky wars for global domination.

No degree of moral propriety among public servants, or corporate leaders, can stop the explosion of spying and the propagation of false information that we will witness over the next decade. The most significant factor behind this development will be Moore’s Law, which stipulates that the number of transistors that can be placed economically on a chip doubles every 18 months (while the cost of storage has halved every 14 months), and not the moral decline of citizens. This exponential increase in our capability to gather, store, share, alter and fabricate information of every form will offer tremendous opportunities for the development of new technologies. But the rate of change of computational power is so much faster than the rate at which human institutions can adapt (let alone the rate at which the human species evolves) that we will face devastating existential challenges to human civilization.
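To get a feel for the growth rates cited above, here is a minimal sketch that projects them over one decade. It assumes, as the paragraph states, that transistor density doubles every 18 months and storage cost halves every 14 months; the resulting multipliers are illustrative, not a forecast.

```python
# Project the compounding effect of the doubling periods cited above.

def growth_factor(months: float, doubling_period: float) -> float:
    """Multiplicative growth after `months`, given a doubling period in months."""
    return 2 ** (months / doubling_period)

decade = 120  # months

transistor_growth = growth_factor(decade, 18)   # density multiplier
storage_cost_drop = growth_factor(decade, 14)   # cost divisor

print(f"Transistor density after 10 years: ~{transistor_growth:.0f}x")
print(f"Storage cost after 10 years: ~1/{storage_cost_drop:.0f} of today's")
```

Roughly a hundredfold increase in density, and storage several hundred times cheaper, inside a single decade: that is the gap between institutional and computational timescales the paragraph is pointing at.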

Continue reading “The Impending Crisis of Data: Do We Need a Constitution of Information?” »

Apr 19, 2013

Bitcoin’s Dystopian Future

Posted by in categories: bitcoin, cybercrime/malcode, economics, ethics, finance, futurism, information science, lifeboat, open source, policy

I have seen the future of Bitcoin, and it is bleak.

The Promise of Bitcoin

If you were to peek into my bedroom at night (please don’t), there’s a good chance you would see my wife sleeping soundly while I stare at the ceiling, running thought experiments about where Bitcoin is going. Like many other people, I have come to the conclusion that distributed currencies like Bitcoin are going to eventually be recognized as the most important technological innovation of the decade, if not the century. It seems clear to me that the rise of distributed currencies presents the biggest (and riskiest) investment opportunity I am likely to see in my lifetime; perhaps in a thousand lifetimes. It is critically important to understand where Bitcoin is going, and I am determined to do so.


Dec 1, 2012

Response to Plaut and McClelland in the Phys.org story

Posted by in categories: information science, neuroscience, philosophy, robotics/AI

A response to McClelland and Plaut’s
comments in the Phys.org story:

Do brain cells need to be connected to have meaning?

Asim Roy
Department of Information Systems
Arizona State University
Tempe, Arizona, USA

Article reference:

Roy A. (2012). “A theory of the brain: localist representation is used widely in the brain.” Front. Psychology 3:551. doi: 10.3389/fpsyg.2012.00551

Continue reading “Response to Plaut and McClelland in the Phys.org story” »

Nov 20, 2012

Google’s 100,000 Stars & the Paradigmatic Disruption of Large-Scale Innovation Revisited

Posted by in categories: cosmology, general relativity, human trajectories, information science, physics, scientific freedom, space

The 100,000 Stars Google Chrome Galactic Visualization Experiment Thingy

So, Google has these things called Chrome Experiments, and they like, you know, do that. 100,000 Stars, their latest, simulates our immediate galactic zip code and provides detailed information on many of the massive nuclear fireballs nearby.

Zoom in & out of the interactive galaxy: state, city, neighborhood, so to speak.

It’s humbling, beautiful, and awesome. Now, is 100,000 Stars perfectly accurate and practical for anything other than having something pretty to look at and explore and educate and remind us of the enormity of our quaint little galaxy among the likely 170 billion others? Well, no — not really. But if you really feel the need to evaluate it that way, you are an unimaginative jerk and your life is without joy and awe and hope and wonder and you probably have irritable bowel syndrome. Deservedly.

The New Innovation Paradigm Kinda Revisited
Just about exactly one year ago technosnark cudgel Anthrobotic.com was rapping about the changing innovation paradigm in large-scale technological development. There’s chastisement for Neil deGrasse Tyson and others who, paraphrasically (totally a word), have declared that private companies won’t take big risks, won’t do bold stuff, won’t push the boundaries of scientific exploration because of bottom lines and restrictive boards and such. But new business entities like Google, SpaceX, Virgin Galactic, & Planetary Resources are kind of steadily proving this wrong.

Continue reading “Google's 100,000 Stars & the Paradigmatic Disruption of Large-Scale Innovation Revisited” »

Oct 31, 2012

FuturICT Vision for the Social Sciences, ICT & Complexity Science

Posted by in categories: futurism, information science

FuturICT has submitted its proposal to the FET Flagship Programme, an initiative that aims to facilitate breakthroughs in information technology. The vision of FuturICT is to

integrate the fields of information and communication technologies (ICT), social sciences and complexity science, to develop a new kind of participatory science and technology that will help us to understand, explore and manage the complex, global, socially interactive systems that make up our world today, while at the same time paving the way for a new paradigm of ICT systems that will leverage socio-inspired self-organisation, self-regulation, and collective awareness.

The project could provide us with profound insights into societal behaviour and improve policymaking. The project echoes the Large Hadron Collider at CERN in its scope and vision, only here we are trying to understand the state of the world. The FuturICT project combines the creation of a ‘Planetary Nervous System’ (PNS) where Big Data will be collated and organised, a ‘Living Earth Simulator’ (LES), and the ‘Global Participatory Platform’ (GPP). The LES will simulate the data and provide models for analysis, while the GPP will provide the data, models and methods to everyone. People will be able to collaborate and research in a very different way. The availability of Big Data to participants will strengthen our ability to understand complex socio-economic systems, and it could help build a new dialogue between nations on how we solve complex global societal challenges.

FuturICT aims to develop a ‘Global Systems Science’, which will

Continue reading “FuturICT Vision for the Social Sciences, ICT & Complexity Science” »

Oct 27, 2012

Today, a Young Man on Acid Realized that all Matter is Merely Energy Condensed to a…

Posted by in categories: biological, complex systems, cosmology, engineering, existential risks, homo sapiens, human trajectories, humor, information science, particle physics, philosophy, physics

…here’s Tom with the Weather.
That right there is comedian/philosopher Bill Hicks, sadly no longer with us. One imagines he would be pleased and completely unsurprised to learn that serious scientific minds are considering and actually finding support for the theory that our reality could be a kind of simulation. That means, for example, a string of daisy-chained IBM Super-Deep-Blue Gene Quantum Watson computers from 2042 could be running a History of the Universe program, and depending on your solipsistic preferences, either you are or we are the character(s).

It’s been in the news a lot of late, but — no way, right?

Because dude, I’m totally real
Despite being utterly unable to even begin thinking about how to consider what real even means, the everyday average rational person would probably assign this to the sovereign realm of unemployable philosophy majors or under the Whatever, Who Cares? or Oh, That’s Interesting I Gotta Go Now! categories. Okay fine, but on the other side of the intellectual coin, vis-à-vis recent technological advancement, of late it’s actually being seriously considered by serious people using big words they’ve learned at endless college whilst collecting letters after their names and doin’ research and writin’ and gettin’ association memberships and such.

So… why now?

Continue reading “Today, a Young Man on Acid Realized that all Matter is Merely Energy Condensed to a...” »

Oct 23, 2012

The Witch-Hunt of Geophysicists: Society returns to the Dark Ages

Posted by in categories: education, ethics, events, geopolitics, information science, physics

I cannot let the day pass without contributing a comment on the incredible ruling of multiple manslaughter against six top Italian geophysicists for not predicting an earthquake that left 309 people dead in 2009. When those who are entrusted with safeguarding humanity (be it on a local level in this case) are subjected to persecution when they fail to do so, despite acting to the best of their abilities in an inexact science, we have surely returned to the dark ages, where those who practice science are demonized by those who misunderstand it.


I hope I do not misrepresent other members of staff here at the Lifeboat Foundation in speaking on behalf of the Foundation when I wish these scientists a successful appeal against a court ruling that has shocked the scientific community, and I stand behind the 5,000 members of the scientific community who sent an open letter to Italy’s President Giorgio Napolitano denouncing the trial. This court ruling was ape-mentality at its worst.

Oct 6, 2012

The decaying web and our disappearing history

Posted by in categories: information science, media & arts, philosophy

On January 28, 2011, three days into the fierce protests that would eventually oust the Egyptian president Hosni Mubarak, a Twitter user called Farrah posted a link to a picture that supposedly showed an armed man as he ran on a “rooftop during clashes between police and protesters in Suez”. I say supposedly, because both the tweet and the picture it linked to no longer exist. Instead, they have been replaced with error messages claiming that the message – and its contents – “doesn’t exist”.

Few things are more explicitly ephemeral than a Tweet. Yet it’s precisely this kind of ephemeral communication – a comment, a status update, sharing or disseminating a piece of media – that lies at the heart of much of modern history as it unfolds. It’s also a vital contemporary historical record that, unless we’re careful, we risk losing almost before we’ve been able to gauge its importance.

Consider a study published this September by Hany SalahEldeen and Michael L. Nelson, two computer scientists at Old Dominion University. Snappily titled “Losing My Revolution: How Many Resources Shared on Social Media Have Been Lost?”, the paper took six seminal news events from the last few years – the H1N1 virus outbreak, Michael Jackson’s death, the Iranian elections and protests, Barack Obama’s Nobel Peace Prize, the Egyptian revolution, and the Syrian uprising – and established a representative sample of tweets from Twitter’s entire corpus discussing each event specifically.

It then analysed the resources being linked to by these tweets, and whether these resources were still accessible, had been preserved in a digital archive, or had ceased to exist. The findings were striking: one year after an event, on average, about 11% of the online content referenced by social media had been lost and just 20% archived. What’s equally striking, moreover, is the steady continuation of this trend over time. After two and a half years, 27% had been lost and 41% archived.
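The two figures quoted above suggest a fairly steady rate of loss. A minimal sketch, using only the study's headline numbers (11% lost after one year, 27% after two and a half years) and assuming simple exponential decay of surviving content, shows how closely they agree on an implied daily loss rate:

```python
import math

def daily_loss_rate(fraction_lost: float, days: float) -> float:
    """Daily loss rate implied by exponential decay of surviving content."""
    return -math.log(1 - fraction_lost) / days

r1 = daily_loss_rate(0.11, 365)        # one year after an event
r2 = daily_loss_rate(0.27, 2.5 * 365)  # two and a half years after

print(f"Implied daily loss rate: {r1:.4%} (at 1 year), {r2:.4%} (at 2.5 years)")
```

Both figures work out to roughly 0.03% of linked content vanishing per day, which is what "steady continuation of this trend" looks like in practice.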

Continue reading “The decaying web and our disappearing history”

Oct 5, 2012

Want to Get 70 Billion Copies of Your Book In Print? Print It In DNA

Posted by in categories: biological, biotech/medical, chemistry, futurism, information science, media & arts

I have been meaning to read a book coming out soon called Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves. It’s written by Harvard biologist George Church and science writer Ed Regis. Church is doing stunning work on a number of fronts, from creating synthetic microbes to sequencing human genomes, so I definitely am interested in what he has to say. I don’t know how many other people will be, so I have no idea how well the book will do. But in a tour de force of biochemical publishing, he has created 70 billion copies. Instead of paper and ink, or PDFs and pixels, he’s used DNA.

Much as PDFs are built on a digital system of 1s and 0s, DNA is a string of nucleotides, each of which can be one of four types. Church and his colleagues turned his whole book, including illustrations, into a 5.27-megabit file, which they then translated into a sequence of DNA. They stored the DNA on a chip and then sequenced it to read the text back. The book is broken up into little chunks of DNA, each of which carries a portion of the book itself as well as an address indicating where it belongs. They recovered the book with only 10 wrong bits out of 5.27 million. Using standard DNA-copying methods, they duplicated the DNA into 70 billion copies.

Scientists have stored little pieces of information in DNA before, but Church’s book is about 1,000 times bigger. I doubt anyone would buy a DNA edition of Regenesis on Amazon, since they’d need some expensive equipment and a lot of time to translate it into a format our brains can comprehend. But the costs are crashing, and DNA is a far more stable medium than that hard drive on your desk that you’re waiting on to die. In fact, Regenesis could endure for centuries in its genetic form. Perhaps librarians of the future will need to get a degree in biology…

Link to Church’s paper