
A Simulation for Neurophotonic Quantum Computation in Visual Pathways

Added by Vahid Salari
Publication date: 2014
Fields: Biology
Language: English





One answer to the measurement problem in quantum theory is given by the Copenhagen interpretation (i.e. orthodox quantum theory), in which the wave-function collapse happens in the mind of the observer. Indeed, scientists such as von Neumann, London, Bauer and Wigner initially believed that the wave-function collapse occurs in the brain, or is caused by the consciousness of the observer. However, this issue has remained highly controversial. There are many challenging discussions about the survival of quantum effects in microscopic structures of the human brain, mainly because the hot, wet and noisy environment of the brain causes rapid decoherence of quantum states and forbids long-lived coherence for brain processing. Nevertheless, there are also several arguments and pieces of evidence that the emergence of large coherent states is feasible in the brain. In this paper, our approach is based on the latter view, in which macroscopic quantum states are possible in the human brain. Here, we simulate the delayed luminescence of photons in neurons with a Brassard-like teleportation circuit, equivalent to the transfer of the quantum states of photons through the visual pathways from the retina to the visual cortex. Our simulation considers both the classical and the quantum-mechanical aspects of processing in neurons. Based on our simulation, it is possible for the brain to receive the exact quantum states of photons in the visual cortex, to be collapsed by our consciousness, which supports the Copenhagen interpretation of the measurement problem in quantum theory.
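The teleportation step can be sketched with a minimal NumPy statevector simulation of the standard three-qubit teleportation protocol. This is an illustrative toy, not the paper's Brassard-like circuit or its neuronal noise model; the input amplitudes (0.6, 0.8) are arbitrary.

```python
import numpy as np

# Single-qubit gates
I = np.eye(2); X = np.array([[0, 1], [1, 0]]); Z = np.diag([1, -1])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def op(gate, target, n=3):
    """Embed a single-qubit gate acting on `target` into an n-qubit operator."""
    mats = [gate if i == target else I for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def cnot(control, target, n=3):
    """n-qubit CNOT built from control-qubit projectors."""
    P0, P1 = np.diag([1, 0]), np.diag([0, 1])
    a = [P0 if i == control else I for i in range(n)]
    b = [P1 if i == control else (X if i == target else I) for i in range(n)]
    A, B = a[0], b[0]
    for m in a[1:]: A = np.kron(A, m)
    for m in b[1:]: B = np.kron(B, m)
    return A + B

# Arbitrary state to teleport on qubit 0 (stand-in for a "photon state")
alpha, beta = 0.6, 0.8
psi = np.zeros(8, dtype=complex)
psi[0], psi[4] = alpha, beta          # alpha|000> + beta|100>

# Entangle qubits 1 and 2 into a Bell pair, then rotate Alice's qubits
psi = cnot(1, 2) @ op(H, 1) @ psi
psi = op(H, 0) @ cnot(0, 1) @ psi

# Simulate measuring qubits 0 and 1, then apply Bob's correction on qubit 2
rng = np.random.default_rng(0)
outcome = rng.choice(8, p=np.abs(psi) ** 2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
mask = np.array([((k >> 2) & 1) == m0 and ((k >> 1) & 1) == m1 for k in range(8)])
psi = np.where(mask, psi, 0)
psi /= np.linalg.norm(psi)
if m1: psi = op(X, 2) @ psi
if m0: psi = op(Z, 2) @ psi

# Bob's qubit 2 now holds the input state exactly
bob = np.array([psi[(m0 << 2) | (m1 << 1)], psi[(m0 << 2) | (m1 << 1) | 1]])
fidelity = abs(np.conj([alpha, beta]) @ bob) ** 2
print(round(float(fidelity), 6))  # 1.0
```

The classical two-bit message (m0, m1) and the conditional X/Z corrections are what make the protocol "Brassard-like" as a circuit: measurement plus classically controlled gates.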





Background: The global spread of the severe acute respiratory syndrome (SARS) epidemic has clearly shown the importance of considering long-range transportation networks in the understanding of emerging disease outbreaks. The introduction of extensive transportation data sets is therefore an important step in developing epidemic models endowed with realism. Methods: We develop a general stochastic meta-population model that incorporates actual travel and census data among 3100 urban areas in 220 countries. The model allows probabilistic predictions on the likelihood of country outbreaks and their magnitude. The level of predictability offered by the model can be quantitatively analyzed and related to the appearance of robust epidemic pathways that represent the most probable routes for the spread of the disease. Results: To assess the predictive power of the model, the case study of the global spread of SARS is considered. The disease parameter values and initial conditions used in the model are evaluated from empirical data for Hong Kong. The outbreak likelihood for specific countries is evaluated along with the emerging epidemic pathways. Simulation results are in agreement with the empirical data of the SARS worldwide epidemic. Conclusions: The presented computational approach shows that the integration of long-range mobility and demographic data provides epidemic models with a predictive power that can be consistently tested and theoretically motivated. This computational strategy can therefore be considered a general tool in the analysis and forecasting of the global spread of emerging diseases and in the definition of containment policies aimed at reducing the effects of potentially catastrophic outbreaks.
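The core of such a model is a local stochastic epidemic step coupled by travel. A minimal two-city sketch, with invented parameters rather than the paper's 3100-city network or its fitted SARS values, looks like this:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters only (not the paper's fitted SARS values)
beta, mu = 0.5, 0.2              # transmission and recovery rates per day
travel_prob = 0.01               # daily probability an infectious person travels

S = np.array([9990.0, 10000.0])  # susceptible per city
I = np.array([10.0, 0.0])        # infectious (epidemic seeded in city 0)
R = np.zeros(2)                  # recovered

for day in range(120):
    N = S + I + R
    # Local binomial-chain SIR step in each city
    p_inf = 1.0 - np.exp(-beta * I / N)
    new_inf = rng.binomial(S.astype(int), p_inf)
    new_rec = rng.binomial(I.astype(int), 1.0 - np.exp(-mu))
    S -= new_inf
    I += new_inf - new_rec
    R += new_rec
    # Stochastic travel of infectious individuals (two-city shortcut: swap)
    moves = rng.binomial(I.astype(int), travel_prob)
    I += moves[::-1] - moves

print("recovered per city:", R)
```

Running many such stochastic realizations yields the outbreak likelihood per city; the full model replaces the swap shortcut with an origin-destination travel matrix over all urban areas.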
Homicide investigations often depend on the determination of a minimum post-mortem interval (PMI_min) by forensic entomologists. The age of the most developed insect larvae (mostly blow fly larvae) gives reasonably reliable information about the minimum time a person has been dead. Methods such as isomegalen diagrams or ADH calculations can have problems in their reliability, so in this study we established a new growth model to calculate the larval age of Lucilia sericata (Meigen 1826). This is based on the actual non-linear development of the blow fly and is designed to include uncertainties, e.g. for temperature values from the crime scene. We used published data for the development of L. sericata to estimate non-linear functions describing the temperature-dependent behavior of each developmental state. For the new model it is most important to determine the progress within one developmental state as correctly as possible, since this affects the accuracy of the PMI estimation by up to 75%. We found that PMI calculations based on one mean temperature value differ by up to 65% from PMIs based on a 12-hourly time-temperature profile. Differences of 2 °C in the estimation of the crime scene temperature result in a deviation in the PMI calculation of 15-30%.
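The conventional ADH (accumulated degree hours) baseline that the study improves on can be sketched as follows; the developmental threshold and target ADH below are made-up placeholders, not the published L. sericata data.

```python
def pmi_hours(temps, interval_h=12.0, base=10.0, target=2500.0):
    """Accumulate degree hours over a 12-hourly temperature profile and
    return the hours needed to reach the target ADH for the observed stage.
    `base` (developmental threshold, deg C) and `target` are illustrative."""
    acc, hours = 0.0, 0.0
    for t in temps:
        rate = max(t - base, 0.0)          # degree hours gained per hour
        gain = rate * interval_h
        if rate > 0.0 and acc + gain >= target:
            # Interpolate within the interval that crosses the target
            return hours + (target - acc) / rate
        acc += gain
        hours += interval_h
    return hours                            # profile ended before target ADH

# Constant 20 C accumulates 10 degree hours per hour -> 2500 ADH in 250 h
print(pmi_hours([20.0] * 30))  # 250.0
```

Feeding a fluctuating 12-hourly profile instead of one mean temperature into `pmi_hours` is exactly where the up-to-65% discrepancies reported above arise, because development rate is not linear in temperature.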
Data augmentation is practically helpful for visual recognition, especially in times of data scarcity. However, such success is limited to quite a few light augmentations (e.g., random crop, flip). Heavy augmentations (e.g., gray, grid shuffle) are either unstable or show adverse effects during training, owing to the big gap between the original and augmented images. This paper introduces a novel network design, denoted Augmentation Pathways (AP), to systematically stabilize training on a much wider range of augmentation policies. Notably, AP tames heavy data augmentations and stably boosts performance without a careful selection among augmentation policies. Unlike the traditional single pathway, augmented images are processed in different neural paths. The main pathway handles light augmentations, while other pathways focus on heavy augmentations. By interacting with multiple paths in a dependent manner, the backbone network robustly learns from shared visual patterns among augmentations and suppresses noisy patterns at the same time. Furthermore, we extend AP to a homogeneous version and a heterogeneous version for high-order scenarios, demonstrating its robustness and flexibility in practical usage. Experimental results on ImageNet benchmarks demonstrate compatibility and effectiveness across a much wider range of augmentations (e.g., Crop, Gray, Grid Shuffle, RandAugment), while consuming fewer parameters and lower computational costs at inference time. Source code: https://github.com/ap-conv/ap-net.
This review article summarizes the requirement for low-temperature conditions in existing experimental approaches to quantum computation and quantum simulation.
Lu Xie, Yi Zhang (2009)
Constraint-based modeling has been widely used in metabolic network analysis, such as biosynthetic prediction and flux optimization. Linear constraints, like the mass conservation constraint, reversibility constraint, and biological capacity constraint, can be imposed on linear algorithms. However, a non-linear constraint based on the second law of thermodynamics, known as the loop law, has recently emerged and challenged the existing algorithms. Proven to be infeasible with linear solutions, this non-linear constraint has been successfully imposed on the sampling process. Here, Monte Carlo sampling with the Metropolis criterion and simulated annealing has been introduced to optimize the biomass synthesis of the genome-scale metabolic network of Helicobacter pylori (iIT341 GSM/GPR) under the mass conservation constraint, the biological capacity constraint, and thermodynamic constraints including reversibility and the loop law. The sampling method has also been employed to optimize a non-linear objective function, the biomass synthetic rate, normalized by the total number of incoming reducible electrons. To verify whether a sample contains internal loops, an automatic solution has been developed based on solving a set of inequalities. In addition, a new type of pathway is proposed here, the futile pathway, which has three properties: 1) its mass flow can be self-balanced; 2) it has exchange reactions; 3) it is independent of biomass synthesis. To eliminate the fluxes of the futile pathways in the sampling results, a linear-programming-based method has been suggested, and the results show improved correlations among the reaction fluxes in the pathways related to biomass synthesis.
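The Metropolis-criterion sampling with simulated annealing can be sketched on a toy flux model; the stoichiometric matrix, bounds, and objective below are invented placeholders, not the H. pylori iIT341 network or its biomass function.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 3-reaction chain with steady-state mass balance S @ v = 0
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])
lb, ub = -10.0, 10.0             # capacity constraints on each flux
n = np.ones(3) / np.sqrt(3.0)    # null space of S is span{(1, 1, 1)}

def objective(v):
    return v[2]                  # stand-in "biomass" flux to maximize

def project(v):
    """Project a candidate onto the mass-balanced set (null space of S)."""
    return n * (n @ v)

v = np.clip(project(rng.uniform(lb, ub, 3)), lb, ub)
best = v.copy()
T = 1.0
for step in range(5000):
    cand = np.clip(project(v + rng.normal(0.0, 0.5, 3)), lb, ub)
    delta = objective(cand) - objective(v)
    # Metropolis criterion (maximization) with an annealed temperature
    if delta > 0 or rng.random() < np.exp(delta / T):
        v = cand
        if objective(v) > objective(best):
            best = v.copy()
    T *= 0.999                   # geometric cooling schedule

print("best biomass flux:", round(float(best[2]), 3))
```

In the full method, the proposal and projection steps must additionally reject samples containing internal loops (the loop-law check via the inequality system mentioned above), which is what makes the constraint non-linear and unsuitable for plain linear programming.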