Sunday, April 29, 2012

Linguistics NLP model from physics and category theory

Recently, Coecke et al. (2010) used high-level cross-disciplinary techniques from logic, category theory, and physics to bring the compositional (grammar-driven) and distributional (vector-space) approaches to meaning together. They developed a unified mathematical framework in which a sentence vector is, by definition, a function of the Kronecker product of its word vectors. A concrete instantiation of this theory was exemplified on a toy hand-crafted corpus by Grefenstette et al. (2011).
http://dl.acm.org/citation.cfm?id=2145580
Modelling compositional meaning for sentences using empirical distributional methods has been a challenge for computational linguists. We implement the abstract categorical model of Coecke et al. (2010) using data from the BNC and evaluate it. The implementation is based on unsupervised learning of matrices for relational words and applying them to the vectors of their arguments. The evaluation is based on the word disambiguation task developed by Mitchell and Lapata (2008) for intransitive sentences, and on a similar new experiment designed for transitive sentences. Our model matches the results of its competitors in the first experiment, and betters them in the second. The general improvement in results with increase in syntactic complexity showcases the compositional power of our model.
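For a concrete feel of the Kronecker-product construction, here is a minimal numpy sketch (my own toy example, not the paper's pipeline): a relational word such as a transitive verb is represented as a matrix built from the subject/object pairs it co-occurs with, and a sentence vector comes from applying that matrix to the outer product of the actual subject and object vectors. All vectors and "training pairs" below are invented.

```python
import numpy as np

# Toy word vectors (in practice these come from corpus co-occurrence counts).
subj = np.array([0.9, 0.1, 0.0])   # e.g. "dog"
obj  = np.array([0.2, 0.7, 0.1])   # e.g. "ball"

# Relational word (transitive verb) as a matrix, built as a sum of outer
# (Kronecker) products of the subject/object pairs it occurs with in a corpus.
training_pairs = [(np.array([0.8, 0.2, 0.0]), np.array([0.1, 0.8, 0.1])),
                  (np.array([0.7, 0.3, 0.0]), np.array([0.3, 0.6, 0.1]))]
verb = sum(np.outer(s, o) for s, o in training_pairs)

# Sentence representation: the verb matrix applied pointwise to the
# subject-object outer product (one of the instantiations evaluated above).
sentence = verb * np.outer(subj, obj)
print(sentence.flatten())
```

Sentence similarity is then roughly the cosine between two such composed representations, which is how the disambiguation experiments are scored.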
Related: http://arxivindex.blogspot.com/2012/04/more-linguistics-and-qft.html

Thursday, April 26, 2012

Interspecies adenoviral transfer of cryoprotectant genes in vitro

Old, but still interesting in relation to potential cryonics applications.

Trehalose expression confers desiccation tolerance on human cells.

Many organisms that withstand desiccation express the disaccharide trehalose. We have now expressed the otsA and otsB genes of Escherichia coli, which encode trehalose biosynthetic enzymes, in human primary fibroblasts using a recombinant adenovirus vector. Infected cells produced increased amounts of trehalose with increasing multiplicity of infection (MOI). Human primary fibroblasts expressing trehalose could be maintained in the dry state for up to five days. Fourier transform infrared spectroscopy indicated that dry, but viable, human cells contained no detectable water. This study shows that mammalian cells can be engineered to retain viability in the absence of water.
http://www.ncbi.nlm.nih.gov/pubmed/10657122

Source: http://www.benbest.com/cryonics/Crypto.html Review and compendium of cryonics articles.

Review of bioinformatics, its applications, and future directions


Rise and Demise of Bioinformatics? Promise and Progress

The field of bioinformatics and computational biology has gone through a number of transformations during the past 15 years, establishing itself as a key component of new biology. This spectacular growth has been challenged by a number of disruptive changes in science and technology. Despite the apparent fatigue of the linguistic use of the term itself, bioinformatics has grown perhaps to a point beyond recognition. We explore both historical aspects and future trends and argue that as the field expands, key questions remain unanswered and acquire new meaning while at the same time the range of applications is widening to cover an ever increasing number of biological disciplines. These trends appear to be pointing to a redefinition of certain objectives, milestones, and possibly the field itself.
 http://www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1002487

How do Ontology Mappings Change in the Life Sciences?

Mappings between related ontologies are increasingly used to support data integration and analysis tasks. Changes in the ontologies also require the adaptation of ontology mappings. So far the evolution of ontology mappings has received little attention albeit ontologies change continuously especially in the life sciences. We therefore analyze how mappings between popular life science ontologies evolve for different match algorithms. We also evaluate which semantic ontology changes primarily affect the mappings. We further investigate alternatives to predict or estimate the degree of future mapping changes based on previous ontology and mapping transitions.
http://arxiv.org/abs/1204.2731

Ontologies have become increasingly important in the life sciences [4, 18]. They are used to semantically annotate molecular-biological objects such as proteins or pathways [27]. Different ontologies of the same domain often contain overlapping and related information. For instance, information about mammalian anatomy can be found in NCI Thesaurus [19] and Adult Mouse Anatomy [1]. Ontology mappings are used to express the semantic relationships between different but related ontologies, e.g., by linking equivalent concepts of two ontologies. Mappings between related ontologies are useful in many ways, in particular for data integration and enhanced analysis [21, 15]. In particular, such mappings are needed to merge ontologies, e.g., to create an integrated cross-species anatomy ontology such as the Uber ontology [29]. Anatomy ontology mappings may also be useful to transfer knowledge from different experiments between species [3]. Furthermore, mappings can help finding objects with similar ontological properties as interesting targets for a comparative analysis. Ontology curators can further find missing ontology annotations and get recommendations for possible ontology enhancements based on mappings to other ontologies.
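To make "ontology mapping" concrete, here is a very small, hypothetical sketch of the kind of name-based matcher that such an evolution analysis would be run on top of. The concept IDs and labels are invented, and real matchers use much richer lexical and structural evidence.

```python
from difflib import SequenceMatcher

# Hypothetical concept labels from two anatomy ontologies (illustrative only).
nci_thesaurus = {"NCI:1": "Heart", "NCI:2": "Cardiac Valve", "NCI:3": "Aorta"}
mouse_anatomy = {"MA:10": "heart", "MA:11": "heart valve", "MA:12": "aorta"}

def match(onto_a, onto_b, threshold=0.8):
    """Simple name-based matcher: link concepts whose labels are similar enough."""
    mapping = set()
    for ida, label_a in onto_a.items():
        for idb, label_b in onto_b.items():
            if SequenceMatcher(None, label_a.lower(), label_b.lower()).ratio() >= threshold:
                mapping.add((ida, idb))
    return mapping

mapping_v1 = match(nci_thesaurus, mouse_anatomy)
print(mapping_v1)
# After a new ontology release changes labels or concepts, rerun and diff:
#   added = mapping_v2 - mapping_v1; removed = mapping_v1 - mapping_v2
# which is the kind of mapping evolution the paper measures.
```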

Proceedings of the first International Workshop On Open Data, WOD-2012

WOD-2012 aims at facilitating new trends and ideas from a broad range of topics concerned within the widely-spread Open Data movement, from the viewpoint of computer science research.
While being most commonly known from the recent Linked Open Data movement, the concept of publishing data explicitly as Open Data has meanwhile developed many variants and facets that go beyond publishing large and highly structured RDF/S repositories. Open Data comprises text and semi-structured data, but also open multi-modal contents, including music, images, and videos. With the increasing amount of data that is published by governments (see, e.g., data.gov, data.gov.uk or data.gouv.fr), by international organizations (data.worldbank.org or data.undp.org) and by scientific communities (tdar.org, cds.u-strasbg.fr, GenBank, IRIS or KNB) explicitly under an Open Data policy, new challenges arise not only due to the scale at which this data becomes available.
A number of community-based conferences accommodate tracks or workshops which are dedicated to Open Data. However, WOD aims to be a premier venue to gather researchers and practitioners who are contributing to and interested in the emerging field of managing Open Data from a computer science perspective. Hence, it is a unique opportunity to find in a single place up-to-date scientific works on Web-scale Open Data issues that have so far only partially been addressed by different research communities such as Databases, Data Mining and Knowledge Management, Distributed Systems, Data Privacy, and Data Visualization.
 http://arxiv.org/abs/1204.3726

Excellent bundle of papers on fractals.

http://links.uwaterloo.ca/Publications.html

Introduction to Non-Linear Algebra

  Concise introduction to a relatively new subject of non-linear algebra: literal extension of text-book linear algebra to the case of non-linear equations and maps. This powerful science is based on the notions of discriminant (hyperdeterminant) and resultant, which today can be effectively studied both analytically and by modern computer facilities. The paper is mostly focused on resultants of non-linear maps. First steps are described in direction of Mandelbrot-set theory, which is direct extension of the eigenvalue problem from linear algebra, and is related by renormalization group ideas to the theory of phase transitions and dualities.
http://arxiv.org/abs/hep-th/0609022
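For readers who want to poke at the two central objects, discriminants and resultants are available in sympy; a quick sketch (my example, not from the paper):

```python
from sympy import symbols, resultant, discriminant

x, a, b, c = symbols('x a b c')

# Discriminant of a quadratic: vanishes exactly when the polynomial has a double root.
print(discriminant(a*x**2 + b*x + c, x))   # b**2 - 4*a*c, the familiar expression

# Resultant of two polynomials in x: vanishes exactly when they share a common root.
p = x**2 - 3*x + 2      # roots 1 and 2
q = x**2 - 1            # roots 1 and -1
print(resultant(p, q, x))   # 0, since x = 1 is a common root
```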

The discovery of geomagnetically trapped cosmic ray antiprotons

 The existence of a significant flux of antiprotons confined to Earth's magnetosphere has been considered in several theoretical works. These antiparticles are produced in nuclear interactions of energetic cosmic rays with the terrestrial atmosphere and accumulate in the geomagnetic field at altitudes of several hundred kilometers. A contribution from the decay of albedo antineutrons has been hypothesized in analogy to proton production by neutron decay, which constitutes the main source of trapped protons at energies above some tens of MeV. This Letter reports the discovery of an antiproton radiation belt around the Earth. The trapped antiproton energy spectrum in the South Atlantic Anomaly (SAA) region has been measured by the PAMELA experiment for the kinetic energy range 60--750 MeV. A measurement of the atmospheric sub-cutoff antiproton spectrum outside the radiation belts is also reported. PAMELA data show that the magnetospheric antiproton flux in the SAA exceeds the cosmic-ray antiproton flux by three orders of magnitude at the present solar minimum, and exceeds the sub-cutoff antiproton flux outside radiation belts by four orders of magnitude, constituting the most abundant source of antiprotons near the Earth.

http://arxiv.org/abs/1107.4882

Hyperbolic metamaterial interfaces: Hawking radiation from Rindler horizons and the "end of time"

Extraordinary rays in a hyperbolic metamaterial behave as particle world lines in a three dimensional (2+1) Minkowski spacetime. We analyze electromagnetic field behavior at the boundaries of this effective spacetime depending on the boundary orientation. If the boundary is perpendicular to the space-like direction in the metamaterial, an effective Rindler horizon may be observed which produces Hawking radiation. On the other hand, if the boundary is perpendicular to the time-like direction an unusual physics situation is created, which can be called "the end of time". It appears that in the lossless approximation electromagnetic field diverges at the interface in both situations. Experimental observations of the "end of time" using plasmonic metamaterials confirm this conclusion.
http://arxiv.org/abs/1107.4053
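The "effective Minkowski spacetime" comes from the standard dispersion relation for extraordinary waves in a uniaxial medium. A hedged sketch of the mapping (sign conventions vary between papers):

```latex
% Extraordinary-wave dispersion in a uniaxial medium with transverse permittivity
% \varepsilon_x = \varepsilon_y and axial permittivity \varepsilon_z:
\[
  \frac{k_x^{2} + k_y^{2}}{\varepsilon_z} + \frac{k_z^{2}}{\varepsilon_x} = \frac{\omega^{2}}{c^{2}} .
\]
% In a hyperbolic metamaterial one component is negative (say \varepsilon_z < 0), so
\[
  \frac{k_z^{2}}{\varepsilon_x} - \frac{k_x^{2} + k_y^{2}}{|\varepsilon_z|} = \frac{\omega^{2}}{c^{2}} ,
\]
% which has the (+,-,-) signature of a (2+1)-dimensional Minkowski relation: with
% this sign choice the coordinate along the optical axis becomes effectively
% timelike and \omega^{2}/c^{2} plays the role of a mass term, which is why the
% extraordinary rays behave like particle world lines.
```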

Art students can't tell grandmaster from animal.

Abstract art grandmasters score like class D amateurs

Hawley-Dolan and Winner had asked the art students to compare paintings by abstract artists with paintings made by a child or by an animal. In 67% of the cases, art students said that the painting by a renowned artist is better. I compare this with the winning probability of the chessplayers of different ratings. I conclude that the great artists score on the level of class D amateurs.
http://arxiv.org/abs/1106.1915
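Back-of-envelope version of the comparison (my own arithmetic, not the paper's exact numbers): the Elo expected-score formula converts a 67% score into a rating advantage.

```python
import math

# Elo expected score for a player rated `delta` points above the opponent.
def expected_score(delta):
    return 1.0 / (1.0 + 10 ** (-delta / 400.0))

# Invert it: what rating advantage corresponds to "winning" 67% of comparisons?
p = 0.67
delta = -400.0 * math.log10(1.0 / p - 1.0)
print(f"67% preference corresponds to an Elo advantage of about {delta:.0f} points")
# Roughly 120 points, a modest edge given that rating classes span 200 points;
# that is the sense in which the grandmaster artists score like low-rated amateurs.
```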

The randomness is a lie!

Maybe there's no such thing as a random sequence

An infinite binary sequence is deemed to be random if it has all definable properties that hold almost surely for the usual probability measure on the set of infinite binary sequences. There are only countably many such properties, so it would seem that the set of random sequences should have full measure. But in fact there might be no random sequences, because for all we know, there might be no undefinable sets.

http://arxiv.org/abs/1103.3494

It would take an infinite amount of time to truly verify any random sequence. (Probably)
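A compressed sketch of the tension the abstract is pointing at (my paraphrase):

```latex
% Let P_1, P_2, \ldots enumerate the countably many definable properties that hold
% almost surely for the uniform measure \mu on \{0,1\}^{\mathbb{N}}, and set
% A_n = \{x : P_n(x)\}, so \mu(A_n) = 1 for every n.  Then, externally,
\[
  \mu\!\left( \bigcap_{n \ge 1} A_n \right) = 1 ,
\]
% so "most" sequences ought to be random in this sense.  But for every definable
% sequence x, the property "differs from x" is itself a definable almost-sure
% property, so no definable sequence can be random; and if there are no
% undefinable sequences at all (which, as the abstract says, we cannot rule out),
% then there are no random sequences either.
```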

Natural Language Processing: starting from scratch and discarding conventional wisdom.

 We propose a unified neural network architecture and learning algorithm that can be applied to various natural language processing tasks including: part-of-speech tagging, chunking, named entity recognition, and semantic role labeling. This versatility is achieved by trying to avoid task-specific engineering and therefore disregarding a lot of prior knowledge. Instead of exploiting man-made input features carefully optimized for each task, our system learns internal representations on the basis of vast amounts of mostly unlabeled training data. This work is then used as a basis for building a freely available tagging system with good performance and minimal computational requirements.
http://arxiv.org/abs/1103.0398
In this contribution, we try to excel on multiple benchmarks while avoiding task-specific engineering. Instead we use a single learning system able to discover adequate internal representations. In fact we view the benchmarks as indirect measurements of the relevance of the internal representations discovered by the learning procedure, and we posit that these intermediate representations are more general than any of the benchmarks. Our desire to avoid task-specific engineered features led us to ignore a large body of linguistic knowledge. Instead we reach good performance levels in most of the tasks by transferring intermediate representations discovered on large unlabeled datasets. We call this approach “almost from scratch” to emphasize the reduced (but still important) reliance on a priori NLP knowledge.
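A toy sketch of the "single architecture, many tasks" idea in plain numpy. Dimensions, vocabulary and weights are invented; the real system trains everything end to end and pre-trains the embeddings on large unlabeled corpora.

```python
import numpy as np

# The same shape (embedding lookup -> window concatenation -> linear -> tanh -> linear)
# is reused for POS tagging, chunking, NER and SRL; only the output layer differs.
rng = np.random.default_rng(0)
vocab = {"the": 0, "cat": 1, "sat": 2, "<pad>": 3}
emb_dim, window, hidden, n_tags = 8, 3, 16, 5

E  = rng.normal(size=(len(vocab), emb_dim))   # word embeddings, shared across tasks
W1 = rng.normal(size=(window * emb_dim, hidden))
W2 = rng.normal(size=(hidden, n_tags))        # task-specific output layer

def tag_scores(words, position):
    """Score the tags for the word at `position` given its context window."""
    idx = [vocab.get(words[position + o], vocab["<pad>"])
           if 0 <= position + o < len(words) else vocab["<pad>"]
           for o in (-1, 0, 1)]
    x = E[idx].reshape(-1)          # concatenate the window's embeddings
    h = np.tanh(x @ W1)             # shared internal representation
    return h @ W2                   # per-tag scores for this particular task

print(tag_scores(["the", "cat", "sat"], 1))
```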

Thinking in circles.

Self-reference in word definitions
Dictionaries are inherently circular in nature. A given word is linked to a set of alternative words (the definition) which in turn point to further descendants. Iterating through definitions in this way, one typically finds that definitions loop back upon themselves. The graph formed by such definitional relations is our object of study. By eliminating those links which are not in loops, we arrive at a core subgraph of highly connected nodes.
We observe that definitional loops are conveniently classified by length, with longer loops usually emerging from semantic misinterpretation. By breaking the long loops in the graph of the dictionary, we arrive at a set of disconnected clusters. We find that the words in these clusters constitute semantic units, and moreover tend to have been introduced into the English language at similar times, suggesting a possible mechanism for language evolution.
http://arxiv.org/abs/1103.2325
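The core-subgraph step is easy to reproduce on a toy dictionary with networkx: keep only the edges whose endpoints share a non-trivial strongly connected component, since exactly those edges lie on definitional loops. The words and "definitions" below are invented.

```python
import networkx as nx

# Toy definition graph: an edge u -> v means word u's definition uses word v.
G = nx.DiGraph()
G.add_edges_from([
    ("big", "large"), ("large", "big"),                          # a 2-word loop
    ("happy", "glad"), ("glad", "joyful"), ("joyful", "happy"),  # a 3-word loop
    ("aardvark", "animal"), ("animal", "organism"),              # chains that never loop back
])

# An edge lies on a definitional loop exactly when both endpoints belong to the
# same non-trivial strongly connected component; keep only those edges.
scc_id = {n: i
          for i, comp in enumerate(nx.strongly_connected_components(G))
          if len(comp) > 1
          for n in comp}
core = nx.DiGraph([(u, v) for u, v in G.edges()
                   if u in scc_id and v in scc_id and scc_id[u] == scc_id[v]])

print(sorted(nx.simple_cycles(core), key=len))
# "aardvark" and "organism" drop out of the core; the loops (big/large and
# happy/glad/joyful) remain as the highly connected clusters.
```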

Quantum repeaters and an early applied solid-state entanglement development

 Quantum Storage of Photonic Entanglement in a Crystal
Entanglement is the fundamental characteristic of quantum physics. Large experimental efforts are devoted to harness entanglement between various physical systems. In particular, entanglement between light and material systems is interesting due to their prospective roles as "flying" and stationary qubits in future quantum information technologies, such as quantum repeaters and quantum networks. Here we report the first demonstration of entanglement between a photon at telecommunication wavelength and a single collective atomic excitation stored in a crystal. One photon from an energy-time entangled pair is mapped onto a crystal and then released into a well-defined spatial mode after a predetermined storage time. The other photon is at telecommunication wavelength and is sent directly through a 50 m fiber link to an analyzer. Successful transfer of entanglement to the crystal and back is proven by a violation of the Clauser-Horne-Shimony-Holt (CHSH) inequality by almost three standard deviations (S=2.64+/-0.23). These results represent an important step towards quantum communication technologies based on solid-state devices. In particular, our resources pave the way for building efficient multiplexed quantum repeaters for long-distance quantum networks.

http://arxiv.org/abs/1009.0489
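Quick sanity check of the "almost three standard deviations" claim, using only numbers from the abstract:

```python
# CHSH inequality: |S| <= 2 for any local hidden-variable model; quantum mechanics
# allows at most the Tsirelson bound 2*sqrt(2) ~ 2.83.
S, sigma = 2.64, 0.23              # reported value and its uncertainty
classical_bound = 2.0
tsirelson_bound = 2 * 2 ** 0.5
print((S - classical_bound) / sigma)   # ~2.8 standard deviations above the bound
```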

DNA Translocation through Graphene Nanopores

Nanopores -- nanosized holes that can transport ions and molecules -- are very promising devices for genomic screening, in particular DNA sequencing. Both solid-state and biological pores suffer from the drawback, however, that the channel constituting the pore is long, viz. 10-100 times the distance between two bases in a DNA molecule (0.5 nm for single-stranded DNA). Here, we demonstrate that it is possible to realize and use ultrathin nanopores fabricated in graphene monolayers for single-molecule DNA translocation. The pores are obtained by placing a graphene flake over a microsize hole in a silicon nitride membrane and drilling a nanosize hole in the graphene using an electron beam. As individual DNA molecules translocate through the pore, characteristic temporary conductance changes are observed in the ionic current through the nanopore, setting the stage for future genomic screening.
http://arxiv.org/abs/1005.4754

Position-based geo-cryptography.

Position-Based Quantum Cryptography: Impossibility and Constructions
In this work, we study position-based cryptography in the quantum setting. The aim is to use the geographical position of a party as its only credential. On the negative side, we show that if adversaries are allowed to share an arbitrarily large entangled quantum state, no secure position-verification is possible at all. We show a distributed protocol for computing any unitary operation on a state shared between the different users, using local operations and one round of classical communication. Using this surprising result, we break any position-verification scheme of a very general form. On the positive side, we show that if adversaries do not share any entangled quantum state but can compute arbitrary quantum operations, secure position-verification is achievable. Jointly, these results suggest the interesting question whether secure position-verification is possible in case of a bounded amount of entanglement. Our positive result can be interpreted as resolving this question in the simplest case, where the bound is set to zero.
In models where secure positioning is achievable, it has a number of interesting applications. For example, it enables secure communication over an insecure channel without having any pre-shared key, with the guarantee that only a party at a specific location can learn the content of the conversation. More generally, we show that in settings where secure position-verification is achievable, other position-based cryptographic schemes are possible as well, such as secure position-based authentication and position-based key agreement.
http://arxiv.org/abs/1009.2490
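For orientation, position verification in its classical, timing-based flavour looks roughly like the sketch below. This is my illustration of the general idea, not the quantum protocol from the paper, which is precisely about when such schemes can be made sound against colluding adversaries.

```python
import math

C = 299_792_458.0   # speed of light, m/s

# Verifiers at known locations send challenges at t = 0; a prover claiming
# position P must answer each verifier no earlier than light could travel
# verifier -> P -> verifier.  Positions here are made up.
verifiers = [(0.0, 0.0), (100_000.0, 0.0), (0.0, 100_000.0)]   # metres

def consistent(claimed_pos, round_trip_times, tolerance=1e-9):
    ok = True
    for (vx, vy), t in zip(verifiers, round_trip_times):
        d = math.hypot(claimed_pos[0] - vx, claimed_pos[1] - vy)
        ok &= abs(t - 2 * d / C) <= tolerance
    return ok

true_pos = (30_000.0, 40_000.0)
measured = [2 * math.hypot(true_pos[0] - vx, true_pos[1] - vy) / C
            for vx, vy in verifiers]
print(consistent(true_pos, measured))            # True: timings fit the claim
print(consistent((60_000.0, 10_000.0), measured))  # False: they don't fit this one
```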

Metamaterials as toy universes

Metamaterial "Multiverse"
Optical space in metamaterials may be engineered to mimic the landscape of a multidimensional Universe which has regions of different topology and different effective dimensionality. The "metamaterial landscape" may include regions in which one or two spatial dimensions are compactified. Nonlinear optics of metamaterials in these regions mimics either U(1) or SU(2) Kaluza-Klein theories having one or more kinds of effective charges. As a result, novel "photon blockade" nonlinear optical metamaterial devices may be realized. Topology-changing phase transitions in such metamaterials lead to considerable particle creation perceived as flashes of light, thus providing a toy model of birth of an individual physical Universe.

http://arxiv.org/abs/1005.1002

Verlinde criticisms.

 Entropic force and its cosmological implications
We investigate a possibility of realizing the entropic force into the cosmology. A main issue is how the holographic screen is implemented in the Newtonian cosmology. Contrary to the relativistic realization of Friedmann equations, we do not clarify the connection between Newtonian cosmology and entropic force because there is no way of implementing the holographic screen in the Newtonian cosmology.
 http://arxiv.org/abs/1005.2240
However, one of the urgent issues to resolve is to answer the question of how one can construct a spherical holographic screen of radius R which encloses a source mass M located at the origin, in order to understand the entropic force. This is a critical and important issue because the holographic screen (an exotic description of spacetime) originates from relativistic approaches to black holes [32, 33] and cosmology [34]. Verlinde introduced this screen by analogy with the absorption of a particle near the event horizon of a black hole. Considering a smaller test mass m located at ∆x away from the screen and computing the change of entropy on the screen, its behavior should resemble that of a particle approaching a stretched horizon of a black hole, as described by Bekenstein [2]. Before proceeding, we would like to mention the difference between Newtonian gravity and general relativity [35]. First, Newtonian gravity is an action-at-a-distance theory, that is, the gravitational influence propagates instantaneously (c → ∞), implying a violation of causality. Second, Newtonian gravity is ignorant of the presence of horizons, where relativistic effects are supposed to dominate. For example, the horizons are considered to be either the event horizon of a black hole or the apparent horizons in the Friedmann-Robertson-Walker (FRW) universe. Comparing Newtonian gravity and general relativity in cosmology is different than in the case of isolated, asymptotically flat systems [36]. For isolated systems, both Newtonian gravity and general relativity are well-defined. In contrast, while relativistic cosmology is well-defined, there is no unique way to accommodate a Newtonian theory of cosmology because the Newtonian equations are only defined up to boundary terms which have to be specified at all times. Hence it is not easy to implement the entropic force in the cosmological setting. In the literature [7, 9, 11], the authors did not mention explicitly how the entropic force (3) works for cosmological purposes. It seems that the entropic force is not realized in Newtonian cosmology unless the holographic screen is clearly defined.
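For context, the entropic force the passage refers to (presumably its Eq. (3)) comes from Verlinde's argument, which in compressed form runs as follows; the criticism above is that the spherical holographic screen in this derivation has no obvious counterpart in purely Newtonian cosmology.

```latex
% Verlinde's inputs: entropy change when a test mass m sits \Delta x from the
% screen, equipartition of the screen's N bits, and the holographic bit count
% for a spherical screen of radius R enclosing a mass M:
\[
  \Delta S = 2\pi k_B \frac{m c}{\hbar}\,\Delta x ,
  \qquad
  E = M c^{2} = \tfrac{1}{2} N k_B T ,
  \qquad
  N = \frac{A c^{3}}{G \hbar} = \frac{4\pi R^{2} c^{3}}{G \hbar} .
\]
% The entropic-force relation F \Delta x = T \Delta S then reproduces Newton:
\[
  F = T\,\frac{\Delta S}{\Delta x} = \frac{G M m}{R^{2}} .
\]
```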

How many universes are there?

How many universes are in the multiverse?
We argue that the total number of distinguishable locally Friedmann universes generated by eternal inflation is proportional to the exponent of the entropy of inflationary perturbations and is limited by e^{e^{3 N}}, where N is the number of e-folds of slow-roll post-eternal inflation. For simplest models of chaotic inflation, N is approximately equal to de Sitter entropy at the end of eternal inflation; it can be exponentially large. However, not all of these universes can be observed by a local observer. In the presence of a cosmological constant \Lambda the number of distinguishable universes is bounded by e^{|\Lambda|^{-3/4}}. In the context of the string theory landscape, the overall number of different universes is expected to be exponentially greater than the total number of vacua in the landscape. We discuss the possibility that the strongest constraint on the number of distinguishable universes may be related not to the properties of the multiverse but to the properties of observers.
With the invention of inflationary cosmology, the notion of a uniform universe was gradually replaced by the notion of a multiverse consisting of many locally uniform exponentially large parts [1, 2]. Each of these parts locally looks like a uniform nearly-Friedmann universe. A collection of all of these universes represents an eternally growing fractal consisting of many such “universes” with different properties [3–5]. This scenario recently became quite popular when a mechanism to stabilize string theory vacua was found [6], and string theorists realized [7], in agreement with earlier expectations [8, 9], that the total number of different stringy vacua can be extremely large. The popular estimate of the number of different vacua is ∼10^500, but the true number may be much smaller or much greater than that [7]. Because of the transitions from one vacuum state to another, the inflationary multiverse becomes divided into an exponentially large number of different exponentially large “universes” with different laws of low-energy physics operating in each of them. This picture, which is now known as the string theory landscape [10], was envisaged in the very first paper on eternal chaotic inflation [4].

http://arxiv.org/abs/0910.1589
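To get a feel for the quoted bounds (my own rough numbers, for orientation only):

```latex
% With the observed cosmological constant \Lambda \sim 10^{-122} in Planck units,
\[
  e^{\,|\Lambda|^{-3/4}} \;\sim\; e^{\,10^{91.5}} \;\sim\; 10^{\,10^{91}} ,
\]
% while even a modest N \simeq 60 e-folds already gives
\[
  e^{\,e^{3N}} = e^{\,e^{180}} \;\sim\; 10^{\,10^{78}} ,
\]
% and the abstract stresses that N can in fact be exponentially large.  Either way,
% both numbers dwarf the \sim 10^{500} vacua usually attributed to the landscape.
```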

An easier-to-understand intro to holographic dark energy (HDE), the entropic force, and Verlinde's work


Planck Scale Effect in the Entropic Force Law
In this note we generalize the quantum uncertainty relation proposed by Vancea and Santos [7] in the entropic force law, by introducing Planck scale modifications. The latter is induced by the Generalized Uncertainty Principle. We show that the proposed uncertainty relation of [7], involving the entropic force and the square of particle position, gets modified from the consideration of a minimum measurable length, (which can be the Planck length).
http://arxiv.org/abs/1003.0285
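The Generalized Uncertainty Principle mentioned here is usually written in a form like the one below (one common convention; the exact form used in the paper may differ), which is where the minimum measurable length comes from:

```latex
\[
  \Delta x \,\Delta p \;\gtrsim\; \frac{\hbar}{2}\left[\,1 + \beta\,\frac{\ell_P^{2}}{\hbar^{2}}\,(\Delta p)^{2}\right] ,
\]
% which implies a minimum measurable length \Delta x_{\min} \sim \sqrt{\beta}\,\ell_P.
% Feeding such a modified uncertainty relation into a Vancea-Santos-type relation
% between the entropic force and the square of the particle position is the
% "Planck scale modification" the paper studies.
```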

Gravity as Entanglement, similar to Verlinde's work.

Gravity as Quantum Entanglement Force
We conjecture that quantum entanglement of matter and vacuum in the universe tend to increase with time, like entropy, and there is an effective force called quantum entanglement force associated with this tendency. It is also suggested that gravity and dark energy are types of the quantum entanglement force, similar to Verlinde's entropic force. If the entanglement entropy of the universe saturates the Bekenstein bound, this gives holographic dark energy with the equation of state consistent with current observational data. This connection between quantum information and gravity gives some new insights on the origin of gravity, dark energy, the holographic principle and arrow of time.
http://arxiv.org/abs/1002.4568
Related: http://arxivindex.blogspot.com/2012/04/original-paper-connecting-dark-energy.html

Theoretical limits of computation

"Seth Lloyd of MIT has previously addressed himself to calculating the conceivable limits on the computing power of such a black hole computer (Nature, 31 August 2000) and arrives at a maximum processing speed of about 10^51 operations/sec for a 1-kg black hole."

Black hole starships.

Are Black Hole Starships Possible?
We investigate whether it is physically possible to build starships or power sources using the Hawking radiation of an artificial black hole as a power source. The proposal seems to be at the edge of possibility, but quantum gravity effects could change the picture.
 http://arxiv.org/abs/0908.1803
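For a feel of the numbers, here are the standard (photons-only, greybody-factor-free) Hawking formulas applied to a black hole of roughly the million-tonne mass scale relevant to such proposals. This is my sketch, not the paper's detailed design figures.

```python
import math

G    = 6.674e-11        # m^3 kg^-1 s^-2
c    = 299_792_458.0    # m/s
hbar = 1.054571817e-34  # J s

def hawking_power(M):          # watts radiated by a Schwarzschild black hole
    return hbar * c**6 / (15360 * math.pi * G**2 * M**2)

def hawking_lifetime(M):       # evaporation time, seconds
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

M = 1.0e9   # kg, i.e. a million tonnes (illustrative mass scale)
print(f"power    ~ {hawking_power(M):.1e} W")
print(f"lifetime ~ {hawking_lifetime(M) / (3600 * 24 * 365.25):.0f} years")
```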

Info-computationalist naturalism.

A Dialogue Concerning Two World Systems: Info-Computational vs. Mechanistic
The dialogue develops arguments for and against adopting a new world system, info-computationalist naturalism, that is poised to replace the traditional mechanistic world system. We try to figure out what the info-computational paradigm would mean, in particular its pancomputationalism. We make some steps towards developing the notion of computing that is necessary here, especially in relation to traditional notions. We investigate whether pancomputationalism can possibly provide the basic causal structure to the world, whether the overall research programme appears productive and whether it can revigorate computationalism in the philosophy of mind.

http://arxiv.org/abs/0910.5001

Pulsar Timing Arrays as new astro-clocks.

 Pulsar timing array projects
Pulsars are amongst the most stable rotators known in the Universe. Over many years some millisecond pulsars rival the stability of atomic clocks. Comparing observations of many such stable pulsars may allow the first direct detection of gravitational waves, improve the Solar System planetary ephemeris and provide a means to study irregularities in terrestrial time scales. Here we review the goals and status of current and future pulsar timing array projects.
http://arxiv.org/abs/0911.0943

Pedestrian dynamics.


Pedestrian Traffic: on the Quickest Path
When a large group of pedestrians moves around a corner, most pedestrians do not follow the shortest path, which is to stay as close as possible to the inner wall, but try to minimize the travel time. For this they accept to move on a longer path with some distance to the corner, to avoid large densities and by this succeed in maintaining a comparatively high speed. In many models of pedestrian dynamics the basic rule of motion is often either "move as far as possible toward the destination" or - reformulated - "of all coordinates accessible in this time step move to the one with the smallest distance to the destination". Atop of this rule modifications are placed to make the motion more realistic. These modifications usually focus on local behavior and neglect long-ranged effects. Compared to real pedestrians this leads to agents in a simulation valuing the shortest path a lot better than the quickest. So, in a situation as the movement of a large crowd around a corner, one needs an additional element in a model of pedestrian dynamics that makes the agents deviate from the rule of the shortest path. In this work it is shown, how this can be achieved by using a flood fill dynamic potential field method, where during the filling process the value of a field cell is not increased by 1, but by a larger value, if it is occupied by an agent. This idea may be an obvious one, however, the tricky part - and therefore in a strict sense the contribution of this work - is a) to minimize unrealistic artifacts, as naive flood fill metrics deviate considerably from the Euclidean metric and in this respect yield large errors, b) do this with limited computational effort, and c) keep agents' movement at very low densities unaltered.
http://arxiv.org/abs/0901.0170
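The "fill with a larger increment where an agent stands" idea is easy to sketch as a Dijkstra-style flood fill on a grid (layout and costs invented; the paper's actual contribution is taming the metric errors of such fills cheaply, which this toy ignores):

```python
import heapq

# Dynamic potential field: flood-fill distances from the exit, but stepping onto a
# cell occupied by an agent costs more than 1.  Agents then descend this field, so
# they trade a longer path for lower density, i.e. the "quickest path" behaviour.
FREE_COST, OCCUPIED_COST = 1.0, 4.0

def potential_field(grid, exits):
    """grid[y][x]: '.' free, '#' wall, 'A' agent, 'E' exit.  Returns cost-to-exit."""
    h, w = len(grid), len(grid[0])
    dist = [[float("inf")] * w for _ in range(h)]
    heap = [(0.0, x, y) for x, y in exits]
    for _, x, y in heap:
        dist[y][x] = 0.0
    heapq.heapify(heap)
    while heap:
        d, x, y = heapq.heappop(heap)
        if d > dist[y][x]:
            continue
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] != "#":
                step = OCCUPIED_COST if grid[ny][nx] == "A" else FREE_COST
                if d + step < dist[ny][nx]:
                    dist[ny][nx] = d + step
                    heapq.heappush(heap, (d + step, nx, ny))
    return dist

grid = ["#####",
        "#..A#",
        "#.#A#",
        "#..E#",
        "#####"]
field = potential_field(grid, exits=[(3, 3)])   # 'E' marks the exit at x=3, y=3
for row in field:
    print(["{:4.0f}".format(v) if v != float("inf") else "  ##" for v in row])
```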

More connections between bio/phys

Quantum physics meets biology

http://arxiv.org/abs/0911.0155

Asymmetric (like RSA) quantum crypto using symmetric keys


Quantum asymmetric cryptography with symmetric keys
Based on quantum encryption, we present a new idea for quantum public-key cryptography (QPKC) and construct a whole theoretical framework of a QPKC system. We show that the quantum-mechanical nature renders it feasible and reasonable to use symmetric keys in such a scheme, which is quite different from that in conventional public-key cryptography. The security of our scheme is analyzed and some features are discussed. Furthermore, the state-estimation attack to a prior QPKC scheme is demonstrated.
http://arxiv.org/abs/0810.2859
As a result, most PKC schemes will be broken by a future quantum computer. It is natural to ask what, at that time, will substitute for PKC to distribute a key. One possible way is to exploit quantum mechanics, which is called quantum key distribution (QKD) or quantum cryptography [6]. QKD has a unique property, namely that potential eavesdropping would be exposed by the users, and consequently it can achieve unconditional security in theory. This security is assured by fundamental principles of quantum mechanics instead of the hardness of computational problems. In fact, QKD can only realize one application of PKC, i.e., key distribution. But what about digital signatures, the other important application? Obviously we do not want to give up the significant flexibility of PKC even in the era of quantum computers. To this end the research is progressing along two directions. One is to look for problems that remain difficult under quantum computation (especially under the existing quantum algorithms [4, 5]) and construct PKC based on them [7, 8, 9, 10]. In these schemes the key is still composed of classical bits, and it follows that the flexibility of PKC is retained. But the fact that their security rests on unproved computational assumptions is unchanged. For simplicity, we call this kind of cryptosystem the first class of quantum PKC (QPKC class I). The other direction pursues PKC with perfect security by adding more quantum elements to the schemes, much as in QKD [11, 12]. In these schemes the security is assured by physical laws instead of unproved assumptions. However, the keys generally contain qubits, which are, at least with current techniques, more difficult to deal with, and so the flexibility of PKC would be reduced to some extent. We call these cryptosystems the second class of quantum PKC (QPKC class II). In our opinion, both classes of QPKC are significant for future applications. Class I is more practical, whereas class II is more ideal and still needs more research. In this paper we study the latter.

Adult stem cell aging reversal and a new large-scale manufacturing technique.


 Researchers have shown they can reverse the aging process for human adult stem cells, which are responsible for helping old or damaged tissues regenerate. The findings could lead to medical treatments that may repair a host of ailments that occur because of tissue damage as people age. A research group led by the Buck Institute for Research on Aging and the Georgia Institute of Technology conducted the study in cell culture, which appears in the September 1, 2011 edition of the journal Cell Cycle.
http://www.news-medical.net/news/20110921/Researchers-can-reverse-aging-process-for-human-adult-stem-cells.aspx
One of the researchers explained: “Until now, it’s been very difficult to grow stem cells in sufficient numbers and maintain them as stem cells for use in therapy. What we and our colleagues at the University of Southampton have shown is that this new nanostructured surface can be used to very effectively culture mesenchymal stem cells, taken from sources such as bone marrow, which can then be put to use in musculoskeletal, orthopaedic and connective tissues. If the same process can be used to culture other types of stem cells too, and this research is under way in our labs, our technology could be the first step on the road to developing large-scale stem cell culture factories.” The paper, titled ‘Nanoscale surfaces for the long-term maintenance of mesenchymal stem cell phenotype and multipotency’, is published in the journal Nature Materials.
http://www.labmate-online.com/news/news-and-views/5/university_of_southampton/stem_cell_breakthrough_has_therapies_potential/16830/

Wednesday, April 25, 2012

Unsung hero of GR


If you've never heard Grassmann's story it's worth reading. He received barely any recognition during his life, and his work was revived years later as an almost completely independent branch of mathematics that now underlies many modern theories.
 Kummer's report ended any chance that Grassmann might obtain a university post. This episode proved the norm; time and again, leading figures of Grassmann's day failed to recognize the value of his mathematics.
 http://en.wikipedia.org/wiki/Hermann_Grassmann
He was intensely interested in linguistics and in tracing the origins of language, especially trying to show that the Gothic language was older than, or independent of, Sanskrit and Proto-Indo-European.

He made several important linguistic discoveries, including the sound law now named after him:
http://en.wikipedia.org/wiki/Grassmann%27s_law
The fact that deaspiration in Greek took place after the change of Proto-Indo-European *bʰ, *dʰ, *gʰ to /pʰ, tʰ, kʰ/, and the fact that no other Indo-European languages show Grassmann's law, suggests that Grassmann's law developed separately in Greek and Sanskrit (although quite possibly due to areal influence from one language to the other), i.e. that it was not inherited from PIE
As an aside:
(In some views, the Dravidian language family stands out as a limit of the Proto-Indo-European model as a single central origin, since it is so distinctly different that it shares no demonstrable common features with Indo-European.)
The Dravidian languages have not been shown to be related to any other language family.
http://en.wikipedia.org/wiki/Dravidian_languages#Relationship_to_other_language_families
Perhaps some of his interest can be explained by the relationship, with respect to grammar, between Sanskrit and the formal notations later used to describe programming languages.
http://en.wikipedia.org/wiki/Backus%E2%80%93Naur_Form
The name Pāṇini Backus form has also been suggested in view of the facts that the expansion Backus Normal Form may not be accurate, and that Pāṇini had independently discovered a similar notation earlier. [7]
http://www.infinityfoundation.com/mandala/t_es/t_es_rao-t_syntax.htm
Dr. Alexander Wilhelmy has called to my attention a work by Panini. Panini was a scholar who flourished between 400 B.C. and 200 B.C.; perhaps his most significant work was the compilation of a grammar of Sanskrit. In order to describe the (rather complicated) rules of grammar, he invented a notation which is equivalent in its power to that of Backus, and has many similar properties: given the use to which the notation was put, it is possible to identify structures equivalent to the Backus "|" and to the use of the meta-brackets "<" and ">" enclosing suggestive names. Panini avoided the necessity for the character "::=" by writing the meta-result on the right rather than the left [see, or Ingerman (1996) for a similar notation].
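For readers who haven't met BNF, here is a tiny grammar written both ways; the rules and words are invented purely to illustrate the "|" alternatives and the "<...>" meta-brackets the quote talks about.

```python
import itertools

#   <sentence> ::= <noun> <verb> | <noun> <verb> <noun>
#   <noun>     ::= "devas" | "agnim"
#   <verb>     ::= "yajati"
grammar = {
    "<sentence>": [["<noun>", "<verb>"], ["<noun>", "<verb>", "<noun>"]],
    "<noun>":     [["devas"], ["agnim"]],
    "<verb>":     [["yajati"]],
}
# The BNF "|" corresponds to the list of alternative expansions, and the
# <...> meta-brackets mark the non-terminals.

def expand(symbol):
    """Enumerate all terminal strings derivable from `symbol`."""
    if symbol not in grammar:               # terminal word
        yield [symbol]
        return
    for alternative in grammar[symbol]:     # the "|" choices
        for parts in itertools.product(*(expand(s) for s in alternative)):
            yield [w for part in parts for w in part]

for words in expand("<sentence>"):
    print(" ".join(words))
```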

Grassmann was clearly able to see the connection between mathematics and linguistics and was intensely active in both fields. Is it so strange, then, that we should see a resurgence of recent research finding connections between the two fields (in Lagrangian dynamics, random matrices, prime numbers and the Riemann hypothesis, and more broadly between linguistics, mathematics, physics, and even game theory)?

Related: http://arxivindex.blogspot.com/2012/04/more-linguistics-and-qft.html
http://arxivindex.blogspot.com/2012/04/random-matrices.html

Entropy-based Tuning of Musical Instruments

The human sense of hearing perceives a combination of sounds 'in tune' if the corresponding harmonic spectra are correlated, meaning that the neuronal excitation pattern in the inner ear exhibits some kind of order. Based on this observation it is suggested that musical instruments such as pianos can be tuned by minimizing the Shannon entropy of suitably preprocessed Fourier spectra. This method reproduces not only the correct stretch curve but also similar pitch fluctuations as in the case of high-quality aural tuning.
 http://arxiv.org/abs/1203.5101
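A bare-bones sketch of the entropy criterion (my toy, with invented partials and inharmonicity, nothing like the paper's careful preprocessing): synthesize two notes, coarse-grain the combined power spectrum, and pick the detuning that minimizes its Shannon entropy.

```python
import numpy as np

fs, seconds = 44_100, 1.0
t = np.arange(int(fs * seconds)) / fs

def piano_like_tone(f0, n_partials=8, inharmonicity=1e-4):
    ks = np.arange(1, n_partials + 1)
    freqs = ks * f0 * np.sqrt(1 + inharmonicity * ks**2)   # stretched partials
    return sum(np.sin(2 * np.pi * f * t) / k for f, k in zip(freqs, ks))

def spectral_entropy(signal, n_bins=4096):
    spectrum = np.abs(np.fft.rfft(signal, n=8 * len(signal)))**2
    coarse = spectrum[:n_bins * (len(spectrum) // n_bins)].reshape(n_bins, -1).sum(axis=1)
    p = coarse / coarse.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Scan the upper note's detuning against a fixed lower note and keep the detuning
# with minimal entropy: aligned partials concentrate power into fewer bins.
low = piano_like_tone(220.0)
cents = np.linspace(-20, 20, 41)
best = min(cents, key=lambda c: spectral_entropy(low + piano_like_tone(440.0 * 2**(c / 1200))))
print(f"entropy-minimizing detuning: {best:+.1f} cents")
```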

Related:

In case you've never heard of http://en.wikipedia.org/wiki/Just_intonation : it's a tuning system that keeps the whole-number frequency ratios between notes instead of accepting the slight errors of tempered scales. Using it freely across key changes is only practical with modern electronic synthesizers, which remove the need to physically retune for each key. Waveform plots of JI music show much less beating and look more regular, and plenty of people claim the notes sound better.
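The size of those "slight errors" is easy to compute: convert the just ratios to cents and compare with 12-tone equal temperament.

```python
import math

# cents(r) = 1200 * log2(r); an equal-tempered semitone is exactly 100 cents.
just_intervals = {
    "perfect fifth (3/2)": (3, 2, 700),   # nearest ET interval: 7 semitones
    "major third (5/4)":   (5, 4, 400),
    "minor third (6/5)":   (6, 5, 300),
    "major sixth (5/3)":   (5, 3, 900),
}
for name, (num, den, et_cents) in just_intervals.items():
    ji_cents = 1200 * math.log2(num / den)
    print(f"{name:22s} JI = {ji_cents:7.2f} cents, ET error = {ji_cents - et_cents:+6.2f} cents")
```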

Degenerate robots

The capacity to adapt can greatly influence the success of systems that need to compensate for damaged parts, learn how to achieve robust performance in new environments, or exploit novel opportunities that originate from new technological interfaces or emerging markets. Many of the conditions in which technology is required to adapt cannot be anticipated during its design stage, creating a significant challenge for the designer. Inspired by the study of a range of biological systems, we propose that degeneracy - the realization of multiple, functionally versatile components with contextually overlapping functional redundancy - will support adaptation in technologies because it effects pervasive flexibility, evolutionary innovation, and homeostatic robustness. We provide examples of degeneracy in a number of rudimentary living technologies from military socio-technical systems to swarm robotics and we present design principles - including protocols, loose regulatory coupling, and functional versatility - that allow degeneracy to arise in both biological and man-made systems.
http://arxiv.org/abs/1112.3117

The math behind organ growth.

Organogenesis is a tightly regulated process that has been studied experimentally for decades. Computational models can help to integrate available knowledge and to better understand the underlying regulatory logic. We are currently studying mechanistic models for the development of limbs, lungs, kidneys, and bone. We have tested a number of alternative methods to solve our spatio-temporal differential equation models of reaction-diffusion type on growing domains of realistic shape, among them finite elements in COMSOL Multiphysics. Given the large number of variables (up to fifteen), the sharp domain boundaries, the travelling wave character of some solutions, and the stiffness of the reactions, we are facing numerous numerical challenges. To test new ideas efficiently we have developed a strategy to optimize simulation times in COMSOL.
http://arxiv.org/abs/1202.0428
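As a much-simplified illustration of what "reaction-diffusion type" means here: a one-dimensional Schnakenberg (Turing) model on a fixed domain with explicit time stepping. This is a toy with textbook-style parameters, nowhere near the multi-variable, growing-domain finite-element models the group actually solves; run for a few seconds and a spatial pattern should emerge from the noisy initial state.

```python
import numpy as np

n, L = 200, 1.0
dx, dt = L / n, 1e-4
Du, Dv = 4e-5, 2e-3            # slow activator, fast inhibitor (Turing condition)
a, b = 0.1, 0.9                # Schnakenberg kinetics parameters

rng = np.random.default_rng(1)
u = (a + b) + 0.01 * rng.standard_normal(n)          # perturb the homogeneous steady state
v = b / (a + b) ** 2 + 0.01 * rng.standard_normal(n)

def lap(f):                                          # periodic 1D Laplacian
    return (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2

for _ in range(200_000):                             # about 20 time units
    ru = a - u + u**2 * v                            # reaction terms
    rv = b - u**2 * v
    u, v = u + dt * (Du * lap(u) + ru), v + dt * (Dv * lap(v) + rv)

print("activator min/max after patterning:", float(u.min()), float(u.max()))
```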