Search results for: computational brain
1701 Analysis of Waterjet Propulsion System for an Amphibious Vehicle
Authors: Nafsi K. Ashraf, C. V. Vipin, V. Anantha Subramanian
Abstract:
This paper reports the design of a waterjet propulsion system for an amphibious vehicle based on the circulation distribution over the camber line for the sections of the impeller and stator. In contrast with the conventional waterjet design, the inlet duct is straight, so that water entry is parallel and in line with the nozzle exit. The extended nozzle after the stator bowl makes the flow more axial, further improving thrust delivery. A waterjet works on the principle of volume flow rate through the system and, unlike the propeller, it is an internal flow system. The major difference between the propeller and the waterjet occurs at the flow passing the actuator. Though a ducted propeller could constitute the equivalent of waterjet propulsion, in a realistic situation the nozzle area of the waterjet would be proportionately large relative to the inlet area and propeller disc area. Moreover, the flow rate through the impeller disc is controlled by the nozzle area. For these reasons the waterjet design is based on pump systems rather than propellers, and it is therefore important to bring out the characteristics of the flow from this point of view. The analysis is carried out using computational fluid dynamics. The design of the waterjet propulsion system is carried out by adapting axial flow pump design, and performance analysis is done with a three-dimensional computational fluid dynamics (CFD) code. Given the varying environmental conditions, the need for high discharge and low head, and the space confinement of the given amphibious vehicle, an axial pump design is suitable. The major problem of the inlet velocity distribution is the large variation of velocity in the circumferential direction, which gives rise to heavy blade loading that varies with time. The cavitation criteria have also been taken into account as per hydrodynamic pump design practice. Generally, a waterjet propulsion system can be divided into the inlet, the pump, the nozzle and the steering device.
The pump further comprises an impeller and a stator. Analytical and numerical approaches such as a RANSE solver have been undertaken to understand the performance of the designed waterjet propulsion system. Unlike in the case of propellers, the analysis is based on the head-flow curve together with efficiency and power curves. The modeling of the impeller is performed using a rigid body motion approach. The realizable k-ϵ model has been used for turbulence modeling. Appropriate boundary conditions are applied to the domain, and domain-size and grid-dependence studies are carried out.
Keywords: amphibious vehicle, CFD, impeller design, waterjet propulsion
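The volume-flow-rate principle described above can be illustrated with the ideal momentum balance for a waterjet. The sketch below is a minimal illustration under idealized assumptions (uniform jet, no losses); the numbers are made up for the example and are not the paper's design data.

```python
def waterjet_thrust(rho, Q, A_nozzle, V_inlet):
    """Ideal momentum-flux thrust of a waterjet.

    rho      : water density [kg/m^3]
    Q        : volume flow rate through the pump [m^3/s]
    A_nozzle : nozzle exit area [m^2]
    V_inlet  : inflow (vehicle) speed [m/s]
    """
    V_jet = Q / A_nozzle                 # exit velocity from continuity
    return rho * Q * (V_jet - V_inlet)   # rate of change of momentum

# Illustrative values only (assumed, not from the paper):
print(waterjet_thrust(rho=1000.0, Q=0.5, A_nozzle=0.05, V_inlet=4.0))
# jet velocity 10 m/s, so thrust = 1000 * 0.5 * (10 - 4) = 3000 N
```

Note how, with the inlet area fixed, shrinking the nozzle area raises the jet velocity and hence the thrust for the same volume flow rate, which is why the design is governed by the pump/nozzle combination rather than by propeller theory.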
Procedia PDF Downloads 229
1700 On-Road Text Detection Platform for Driver Assistance Systems
Authors: Guezouli Larbi, Belkacem Soundes
Abstract:
Automating the text detection process can assist the driver in the driving task. It can be very useful in giving drivers more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, pseudo-Zernike moments are used to pinpoint areas of the image that may contain text. The architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) that constitutes the classification phase. The results show that the proposed framework achieved ≈ 35 fps and an mAP of ≈ 90%, i.e., a low computational time with competitive accuracy.
Keywords: text detection, CNN, PZM, deep learning
Procedia PDF Downloads 84
1699 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail.
This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
Procedia PDF Downloads 66
1698 Peak Frequencies in the Collective Membrane Potential of a Hindmarsh-Rose Small-World Neural Network
Authors: Sun Zhe, Ruggero Micheletto
Abstract:
As discussed extensively in many studies, noise in neural networks has an important role in the functioning and time evolution of the system. The mechanism by which noise induces stochastic resonance, enhancing and influencing certain operations, is not clarified, nor is the mechanism of information storage and coding. With the present research we want to study the role of noise, especially focusing on the frequency peaks, in a three-variable Hindmarsh-Rose small-world network. We investigated the response of the network to external noise. We demonstrate that a variation of the signal-to-noise ratio of about 10 dB induces an increase in the membrane potential signal of about 15%, averaged over the whole network. We also considered the integral of the whole membrane potential as a paradigm of internal noise, the one generated by the brain network. We showed that this internal noise is attenuated with the size of the network or with the number of random connections. By means of Fourier analysis we found that it has distinct frequency peaks; moreover, we showed that increasing the size of the network by introducing more neurons reduced the maximum frequencies generated by the network, whereas an increase in the number of random connections (determined by the small-world probability p) led to a trend toward higher frequencies. This study may give clues on how networks utilize noise to alter the collective behaviour of the system in their operations.
Keywords: neural networks, stochastic processes, small-world networks, discrete Fourier analysis
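As a rough illustration of the kind of model behind this study, the sketch below integrates a single three-variable Hindmarsh-Rose neuron with standard textbook parameters (not the paper's network or its noise levels) using a crude Euler scheme, and extracts the dominant frequency peak of the membrane potential by discrete Fourier analysis.

```python
import numpy as np

def hindmarsh_rose(n_steps=100_000, dt=0.01, I=3.0, noise=0.0, seed=0):
    """Euler integration of one three-variable Hindmarsh-Rose neuron with
    optional additive current noise. Parameters r, s, x0 are the usual
    textbook values; this is a single-neuron sketch, not the network."""
    rng = np.random.default_rng(seed)
    r, s, x0 = 0.006, 4.0, -1.6
    x, y, z = -1.6, -10.0, 2.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        dx = y + 3 * x**2 - x**3 - z + I + noise * rng.standard_normal()
        dy = 1 - 5 * x**2 - y
        dz = r * (s * (x - x0) - z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs[k] = x
    return xs

def dominant_frequency(signal, dt):
    """Location of the largest peak in the amplitude spectrum (DC removed)."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    return freqs[int(spectrum.argmax())]

x_trace = hindmarsh_rose()
print(dominant_frequency(x_trace, dt=0.01))  # bursting rhythm peak, in model time units
```

In the paper's setting the same Fourier analysis would be applied to the summed membrane potential of the whole small-world network rather than to one neuron.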
Procedia PDF Downloads 292
1697 Increase in Specificity of MicroRNA Detection by RT-qPCR Assay Using a Specific Extension Sequence
Authors: Kyung Jin Kim, Jiwon Kwak, Jae-Hoon Lee, Soo Suk Lee
Abstract:
We describe an innovative method for highly specific detection of miRNAs using a specially modified poly(A) adaptor RT-qPCR. We use a uniquely designed specific extension sequence, which plays an important role in achieving high specificity of miRNA detection. As previously reported, this method involves two reaction steps: poly(A) tailing and reverse transcription, followed by real-time PCR. First, miRNAs are extended by a poly(A) tailing reaction and then converted into cDNA. Here, we remarkably reduced the reaction time by applying a short poly(T) adaptor. Next, the cDNA is hybridized to the 3'-end of a specific extension sequence that contains the miRNA sequence, producing a novel PCR template. Thereafter, SYBR Green-based RT-qPCR proceeds with a universal poly(T) adaptor forward primer and a universal reverse primer. The target miRNA, miR-106b in human brain total RNA, could be detected quantitatively over seven orders of magnitude, demonstrating that the assay displays a dynamic range of at least 7 logs. In addition, the better specificity of this novel extension-based assay over the well-known poly(A) tailing method for miRNA detection was confirmed by melt curve analysis of the real-time PCR product, clear gel electrophoresis, and sequence chromatogram images of the amplified DNAs.
Keywords: microRNA (miRNA), specific extension sequence, RT-qPCR, poly(A) tailing assay, reverse transcription
Procedia PDF Downloads 308
1696 Autoantibodies against Central Nervous System Antigens and the Serum Levels of IL-32 in Patients with Schizophrenia
Authors: Fatemeh Keshavarz
Abstract:
Background: Schizophrenia is a disease of the nervous system, and immune system disorders can affect its pathogenesis. Activation of microglia, proinflammatory cytokines, disruption of the blood-brain barrier (BBB) due to inflammation, activation of autoreactive B cells, and consequently the production of autoantibodies against nervous system antigens are among the immune processes involved in neurological diseases. Interleukin 32 (IL-32) is a proinflammatory cytokine that plays an important role in the activation of the innate and adaptive immune responses. This study aimed to measure the serum level of IL-32 as well as the frequency of autoantibody positivity against several nervous system antigens in patients with schizophrenia. Material and Methods: This study was conducted on 40 patients with schizophrenia and 40 healthy individuals in the control group. Serum IL-32 levels were measured by ELISA. The frequency of autoantibodies against Hu, Ri, Yo, Tr, CV2, Amphiphysin, SOX1, Zic4, ITPR1, CARP, GAD, Recoverin, Titin, and Ganglioside antigens was measured by an indirect immunofluorescence method. Results: Serum IL-32 levels in patients with schizophrenia were significantly higher compared to the control group. Autoantibodies were positive in 8 patients for the GAD antigen and in 5 patients for the Ri antigen, a significant difference compared to the control group. Autoantibodies were also positive in 2 patients for CV2, in 1 patient for Hu, and in 1 patient for CARP. Negative results were reported for the other antigens. Conclusion: Our findings suggest that elevated serum IL-32 levels and autoantibody positivity against several nervous system antigens may be involved in the pathogenesis of schizophrenia.
Keywords: schizophrenia, microglia, autoantibodies, IL-32
Procedia PDF Downloads 127
1695 Solving Stochastic Eigenvalue Problem of Wick Type
Authors: Hassan Manouzi, Taous-Meriem Laleg-Kirati
Abstract:
In this paper, we study mathematically the eigenvalue problem for a stochastic elliptic partial differential equation of Wick type. Using the Wick product and the Wiener-Itô chaos expansion, the stochastic eigenvalue problem is reformulated, via the Fredholm alternative, as a system consisting of an eigenvalue problem for a deterministic partial differential equation together with a family of elliptic partial differential equations. To reduce the computational complexity of this system, we use a decomposition-coordination method. Once this approximation is performed, the statistics of the numerical solution can be easily evaluated.
Keywords: eigenvalue problem, Wick product, SPDEs, finite element, Wiener-Itô chaos expansion
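The reformulation described above rests on the Wiener-Itô chaos expansion. A generic sketch (not the authors' exact system) for a Wick-type elliptic operator with random coefficient κ is:

```latex
% Expand the random solution in stochastic Hermite functions H_alpha:
u(x,\omega) \;=\; \sum_{\alpha \in \mathcal{J}} u_\alpha(x)\, H_\alpha(\omega),
\qquad H_\alpha \diamond H_\beta = H_{\alpha+\beta}.

% Projecting the Wick-type problem  -\nabla\cdot(\kappa \diamond \nabla u) = \lambda u
% onto each H_alpha yields, for every multi-index alpha in J,
-\nabla\cdot\Big( \sum_{0 \le \beta \le \alpha} \kappa_{\alpha-\beta}(x)\,
\nabla u_\beta(x) \Big) \;=\; \lambda\, u_\alpha(x).

% alpha = 0 gives a deterministic eigenvalue problem for (u_0, lambda); each
% alpha != 0 is a deterministic elliptic equation for u_alpha, solvable by the
% Fredholm alternative once the lower-order coefficients u_beta are known.
```

This is how one deterministic eigenvalue problem plus a triangular family of elliptic equations arises from a single stochastic problem, which is the structure the decomposition-coordination method then exploits.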
Procedia PDF Downloads 359
1694 Cognitive and Behavioral Disorders in Patients with Precuneal Infarcts
Authors: F. Ece Cetin, H. Nezih Ozdemir, Emre Kumral
Abstract:
Ischemic stroke of the precuneal cortex (PC) alone is extremely rare. This study aims to evaluate the clinical, neurocognitive, and behavioural characteristics of isolated PC infarcts. We assessed neuropsychological and behavioural findings in 12 patients with an isolated PC infarct among 3800 patients with ischemic stroke. To determine the most frequently affected brain locus, we first overlapped the ischemic areas of patients with specific cognitive disorders and of patients without them. Secondly, we compared both overlap maps using the 'subtraction plot' function of MRIcroGL. Patients showed various types of cognitive disorders. All patients experienced more than one category of cognitive disorder, except for two patients with only one. Lesion topographical analysis showed that damage within the anterior precuneal region might lead to consciousness disorders (25%), self-processing impairment (42%), and visuospatial disorders (58%), while lesions in the posterior precuneal region caused episodic and semantic memory impairment (33%). The whole precuneus is involved in at least one body awareness disorder. The cause of the stroke was cardioembolism in 5 patients (42%), large artery disease in 3 (25%), and unknown in 4 (33%). This study showed a wide variety of neuropsychological and behavioural disorders in patients with precuneal infarct. Future studies are needed to achieve a proper definition of the function of the precuneus in relation to the extended cortical areas. Precuneal cortex infarcts have been found to predict a source of embolism from the large arteries or heart.
Keywords: cognition, pericallosal artery, precuneal cortex, ischemic stroke
Procedia PDF Downloads 131
1693 Activation of Mirror Neuron System Response to Drumming Training: A Functional Magnetic Resonance Imaging Study
Authors: Manal Alosaimi
Abstract:
Many rehabilitation strategies exist to help persons with neurological disorders relearn motor skills through intensive training. Evidence supporting the theory that cortical areas involved in motor execution can be triggered by observing actions performed by others is attributed to the function of the mirror neuron system (MNS); activation of the MNS is associated with improvements in physical action and motor learning. It is therefore important to investigate the relationship between motor training (in this case, playing the drums) and activation of the MNS. To achieve this, 15 healthy right-handed participants received drum-kit training for 21 weeks, during which time blood oxygen level-dependent (BOLD) signals in the brain were monitored using functional magnetic resonance imaging (fMRI). Participants were required to perform action-observation and action-execution fMRI tasks. The main result is that BOLD signals in classical regions of the MNS, such as the supramarginal gyri, inferior parietal lobule, and supplementary motor area, increase significantly over the training period. Activation of these areas indicates that passive observation of others performing these same skills may facilitate recovery of persons suffering from neurological disorders and complement conventional rehabilitation programs that focus on action execution or intense training.
Keywords: fMRI, mirror neuron system, magnetic resonance imaging, neuroplasticity, drumming, learning, music, action observation, action execution
Procedia PDF Downloads 39
1692 Analysis of Cardiac Health Using Chaotic Theory
Authors: Chandra Mukherjee
Abstract:
The prevalent knowledge of biological systems is based on the standard scientific perception of natural equilibrium, determinism and predictability. Recently, a rethinking of these concepts was presented, and a new scientific perspective emerged that involves complexity theory together with deterministic chaos theory, nonlinear dynamics and the theory of fractals. The unpredictability of chaotic processes would probably change our understanding of diseases and their management. Mathematically, chaos is defined as deterministic behavior with irregular patterns that obeys mathematical equations critically dependent on initial conditions. Chaos theory is the branch of the sciences concerned with nonlinear dynamics, fractals, bifurcations, periodic oscillations and complexity. Recently, biomedical interest in this scientific field has made these mathematical concepts available to medical researchers and practitioners. Any biological network system is considered to have a nominal state, which is recognized as a homeostatic state. In reality, the different physiological systems are not, under normal conditions, in a stable state of homeostatic balance; they are in a dynamically stable state with chaotic behavior and complexity. Biological systems like the heart rhythm and the brain's electrical activity are dynamical systems that can be classified as chaotic systems with sensitive dependence on initial conditions. In biological systems, the state of disease is characterized by a loss of complexity and chaotic behavior, and by the presence of pathological periodicity and regular behavior. The failure or collapse of nonlinear dynamics is an indication of disease rather than a characteristic of health.
Keywords: HRV, HRVI, LF, HF, DII
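Sensitive dependence on initial conditions, the defining property invoked above, is easiest to see in a textbook chaotic system rather than in cardiac data. The sketch below uses the logistic map as a stand-in illustration (it is not a cardiac model): two trajectories that start a millionth apart soon become completely different.

```python
def logistic_trajectory(x0, r=4.0, n=30):
    """Iterate the logistic map x -> r*x*(1-x), a canonical chaotic system
    in its fully chaotic regime (r = 4)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.300000)
b = logistic_trajectory(0.300001)  # initial condition perturbed by 1e-6
print(abs(a[1] - b[1]))    # still tiny after one step
print(abs(a[30] - b[30]))  # typically order-one after thirty steps
```

The same qualitative behavior (exponential divergence of nearby states) is what makes long-term prediction of a healthy, chaotic heart rhythm impossible even though the dynamics are deterministic.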
Procedia PDF Downloads 428
1691 Comparative Evaluation of Pharmacologically Guided Approaches (PGA) to Determine Maximum Recommended Starting Dose (MRSD) of Monoclonal Antibodies for First Clinical Trial
Authors: Ibraheem Husain, Abul Kalam Najmi, Karishma Chester
Abstract:
First-in-human (FIH) studies are a critical step in the clinical development of any molecule that has shown therapeutic promise in preclinical evaluations, since the translation of preclinical research and safety studies into clinical development is a crucial step in the successful development of monoclonal antibodies, providing guidance for the pharmaceutical industry in the treatment of human diseases. Therefore, comparisons between the USFDA approach and nine pharmacologically guided approaches (PGA) (simple allometry, maximum life span potential, brain weight, rule of exponent (ROE), two-species methods and one-species methods) were made to determine the maximum recommended starting dose (MRSD) for first-in-human clinical trials using four drugs, namely Denosumab, Bevacizumab, Anakinra and Omalizumab. In our study, the predicted pharmacokinetic (PK) parameters and the estimated first-in-human doses of the antibodies were compared with the observed human values. The study indicated that the clearance and volume of distribution of antibodies can be predicted with reasonable accuracy in humans, and a good estimate of the first human dose can be obtained from the predicted human clearance and volume of distribution. A pictorial method-evaluation chart was also developed, based on fold errors, for simultaneous evaluation of the various methods.
Keywords: clinical pharmacology (CPH), clinical research (CRE), clinical trials (CTR), maximum recommended starting dose (MRSD), clearance and volume of distribution
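For context, two of the standard calculations behind such comparisons can be sketched as follows: simple allometric scaling of clearance, and the body-surface-area conversion of an animal dose to a human equivalent dose (HED), from which an MRSD is obtained with a safety factor. The exponents follow the commonly cited FDA convention; the weights and doses below are illustrative assumptions, not data from the study.

```python
def human_equivalent_dose(animal_dose_mg_kg, w_animal_kg, w_human_kg=60.0):
    """Body-surface-area scaling: HED = animal dose * (Wa/Wh)^0.33
    (0.33 is the standard surface-area exponent)."""
    return animal_dose_mg_kg * (w_animal_kg / w_human_kg) ** 0.33

def mrsd(animal_dose_mg_kg, w_animal_kg, safety_factor=10.0):
    """Maximum recommended starting dose: HED divided by a safety factor."""
    return human_equivalent_dose(animal_dose_mg_kg, w_animal_kg) / safety_factor

def simple_allometry(cl_animal, w_animal_kg, w_human_kg=60.0, b=0.75):
    """Simple allometric scaling of clearance: CL_h = CL_a * (Wh/Wa)^b,
    with b ~ 0.75 commonly used for clearance."""
    return cl_animal * (w_human_kg / w_animal_kg) ** b

# Hypothetical numbers: a 20 g mouse, NOAEL 10 mg/kg, mouse clearance 0.5 mL/min.
print(mrsd(10.0, 0.02))             # mg/kg starting dose for a 60 kg human
print(simple_allometry(0.5, 0.02))  # predicted human clearance, mL/min
```

Methods such as maximum life span potential, brain weight, and the rule of exponent modify the exponent b or add correction factors to the same power-law template.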
Procedia PDF Downloads 374
1690 Electrochemical Sensing of L-Histidine Based on Fullerene-C60 Mediated Gold Nanocomposite
Authors: Sanjeeb Sutradhar, Archita Patnaik
Abstract:
Histidine is one of the twenty-two naturally occurring amino acids and exhibits two conformations, L-histidine and D-histidine. D-histidine is biologically inert, while L-histidine is bioactive because of its conversion to the neurotransmitter or neuromodulator histamine in both the brain and the central nervous system. A deficiency of L-histidine causes serious conditions such as Parkinson's disease, epilepsy, and the failure of normal erythropoiesis development. Gold nanocomposites are attractive materials due to their excellent biocompatibility and their ease of adsorption onto the electrode surface. In the present investigation, hydrophobic fullerene-C60 was functionalized with homocysteine via a nucleophilic addition reaction to make it hydrophilic and subsequently to form the nanocomposite with in-situ prepared gold nanoparticles, using ascorbic acid as the reducing agent. Electronic structure calculations of the AuNPs@Hcys-C60 nanocomposite showed a drastic reduction of the HOMO-LUMO gap compared to the corresponding molecules of interest, indicating enhanced electron transportability to the electrode surface. In addition, the electrostatic potential map of the nanocomposite showed that the charge was distributed over either end of the nanocomposite, evidencing faster direct electron transfer from the nanocomposite to the electrode surface. This nanocomposite showed catalytic activity; the nanocomposite-modified glassy carbon electrode showed a tenfold higher electron transfer rate constant (kₑt) than the bare glassy carbon electrode. A significant improvement in its sensing behavior by square wave voltammetry was noted.
Keywords: fullerene-C60, gold nanocomposites, L-histidine, square wave voltammetry
Procedia PDF Downloads 250
1689 A Study of Cloud Computing Solution for Transportation Big Data Processing
Authors: Ilgin Gökaşar, Saman Ghaffarian
Abstract:
The need for rapidly processed big data from transportation ridership (e.g., smartcard data) and traffic operations (e.g., traffic detector data), which requires substantial computational power, is incontrovertible in Intelligent Transportation Systems. Nowadays, cloud computing is one of the important subjects and a popular information technology solution for data processing. It enables users to process enormous amounts of data without having their own computing power. Thus, it can be a good choice for transportation big data processing as well. This paper examines how cloud computing can enhance transportation big data processing by contrasting its advantages and disadvantages and discussing its features.
Keywords: big data, cloud computing, Intelligent Transportation Systems, ITS, traffic data processing
Procedia PDF Downloads 470
1688 Characterization of the Dispersion Phenomenon in an Optical Biosensor
Authors: An-Shik Yang, Chin-Ting Kuo, Yung-Chun Yang, Wen-Hsin Hsieh, Chiang-Ho Cheng
Abstract:
Optical biosensors have become a powerful detection and analysis tool for wide-ranging applications in biomedical research, pharmaceuticals and environmental monitoring. This study carried out computational fluid dynamics (CFD)-based simulations to explore the dispersion phenomenon in the microchannel of an optical biosensor. The predicted time sequences of concentration contours were utilized to better understand the dispersion development occurring in different geometric shapes of microchannels. The simulation results showed the surface concentrations at the sensing probe (with the best performance of a grating coupler) with respect to time, to appraise the dispersion effect and thereby identify the design configurations resulting in minimum dispersion.
Keywords: CFD simulations, dispersion, microfluidic, optical waveguide sensors
Procedia PDF Downloads 545
1687 Element-Independent Implementation for Method of Lagrange Multipliers
Authors: Gil-Eon Jeong, Sung-Kie Youn, K. C. Park
Abstract:
Treatment of non-matching interfaces is an important computational issue. To handle this problem, the method of Lagrange multipliers, including its classical and localized versions, is the most popular technique. It essentially imposes the interface compatibility conditions by introducing Lagrange multipliers. However, the numerical system becomes unstable and inefficient due to the Lagrange multipliers. An interface element-independent formulation that does not include the Lagrange multipliers can be obtained by modifying the independent variables mathematically. Through this modification, a more efficient and stable system can be achieved while attaining accuracy equivalent to the conventional method. A numerical example is conducted to verify the validity of the presented method.
Keywords: element-independent formulation, interface coupling, methods of Lagrange multipliers, non-matching interface
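The classical multiplier formulation the abstract refers to can be illustrated on the smallest possible example: two single-DOF "subdomains" glued by one compatibility condition. This is a minimal sketch of the saddle-point structure, not the paper's formulation; the stiffnesses and loads are made up.

```python
import numpy as np

# Two 1-DOF subdomains with stiffnesses k1, k2 and loads f1, f2,
# coupled by the interface compatibility condition u1 - u2 = 0
# through a single Lagrange multiplier.
k1, k2, f1, f2 = 2.0, 3.0, 1.0, 0.0
K = np.diag([k1, k2])
C = np.array([[1.0, -1.0]])  # constraint matrix: C @ u = 0

# Classical saddle-point system [[K, C^T], [C, 0]] [u; lam] = [f; 0]
A = np.block([[K, C.T], [C, np.zeros((1, 1))]])
b = np.array([f1, f2, 0.0])
u1, u2, lam = np.linalg.solve(A, b)

print(u1, u2)  # equal, as enforced by the multiplier
print(lam)     # interface force carried by the constraint
```

The zero block on the diagonal is exactly what makes such systems indefinite and harder to solve at scale, which motivates eliminating the multipliers through a change of variables as the paper proposes.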
Procedia PDF Downloads 404
1686 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, cellular automata (CA) can accurately describe the complexity of the development of tumors. Tumor development prognosis can now be made, without making patients undergo annoying medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to computational biology research studies applied to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro at labs or in vivo with living cells and organisms. These aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the specific literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic and stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the concerned research community. The use of stochastic cellular automata (SCA), whose parallel programming implementations are open to yielding high computational performance, is of much interest to explore up to its computational limits. There have been some approaches based on optimizations to advance multiparadigm models of tumor growth, which mainly pursue improving the performance of these models by guaranteeing efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, ...) that holds crucial data in simulations.
In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework for developing new programming techniques to speed up the computation time of simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up obtained from specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in the Java and C++ languages on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization of the Polesczuk and Enderling model (normally used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and the overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, or the growth inhibition induced by chemotaxis, as well as the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
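The synchronization pattern described above (a synchronous CA step partitioned across a thread pool) can be sketched in a few lines. This is a Python illustration, not the authors' Java/C++ code, and it uses a trivially deterministic growth rule as a stand-in for the Polesczuk-Enderling dynamics; because each step reads only the frozen previous state, row stripes can be updated concurrently without races.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def step_rows(grid, new, r0, r1):
    """Synchronous CA update on rows [r0, r1): a site becomes occupied if it
    or any von Neumann neighbour is occupied (toy deterministic growth)."""
    rows, cols = grid.shape
    for r in range(r0, r1):
        for c in range(cols):
            if (grid[r, c]
                    or (r > 0 and grid[r - 1, c]) or (r < rows - 1 and grid[r + 1, c])
                    or (c > 0 and grid[r, c - 1]) or (c < cols - 1 and grid[r, c + 1])):
                new[r, c] = 1

def grow(grid, steps, workers=4):
    """Each generation writes into a fresh array, so worker threads touch
    disjoint row stripes and only read the shared previous state."""
    for _ in range(steps):
        new = np.zeros_like(grid)
        bounds = np.linspace(0, grid.shape[0], workers + 1, dtype=int)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            list(pool.map(lambda i: step_rows(grid, new, bounds[i], bounds[i + 1]),
                          range(workers)))
        grid = new
    return grid

seed = np.zeros((21, 21), dtype=np.int8)
seed[10, 10] = 1           # one transformed cell in the centre
tumor = grow(seed, steps=3)
print(int(tumor.sum()))    # 25 occupied sites: a Manhattan ball of radius 3
```

In CPython the thread pool mainly illustrates the synchronization structure (the GIL limits actual speed-up for pure-Python loops); in Java or C++ the same stripe decomposition yields real parallel gains.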
Procedia PDF Downloads 244
1685 Flow Characterization in Complex Terrain for Aviation Safety
Authors: Adil Rasheed, Mandar Tabib
Abstract:
The paper describes the ability of a high-resolution Computational Fluid Dynamics model to predict terrain-induced turbulence and wind shear close to the ground. Various sensitivity studies to choose the optimal simulation setup for modeling the flow characteristics in a complex terrain are presented. The capabilities of the model are demonstrated by applying it to the Sandnessjøen Airport, Stokka in Norway, an airport that is located in a mountainous area. The model is able to forecast turbulence in real time and trigger an alert when atmospheric conditions might result in high wind shear and turbulence.
Keywords: aviation safety, terrain-induced turbulence, atmospheric flow, alert system
Procedia PDF Downloads 417
1684 Alteration of Bone Strength in Osteoporosis of Mouse Femora: Computational Study Based on Micro CT Images
Authors: Changsoo Chon, Sangkuy Han, Donghyun Seo, Jihyung Park, Bokku Kang, Hansung Kim, Keyoungjin Chun, Cheolwoong Ko
Abstract:
The purpose of the study is to develop a finite element model based on 3D bone structural images from micro-CT and to analyze the stress distribution in osteoporotic mouse femora. In this study, the results of the finite element analysis show that early osteoporosis in the mouse model decreased bone density in the trabecular region, whereas bone density in the cortical region increased.
Keywords: micro-CT, finite element analysis, osteoporosis, bone strength
Procedia PDF Downloads 363
1683 Hierarchical Checkpoint Protocol in Data Grids
Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed
Abstract:
A grid of computing nodes has emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purposes of computing and distributed storage. Since fault tolerance becomes complex due to resource availability in a decentralized grid environment, it can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the Omnet++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.
Keywords: data grids, fault tolerance, clustering, Chandy-Lamport
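As a minimal sketch of why clustering helps contain rollbacks (an illustration of the general idea, not the authors' protocol), the function below assumes checkpoints are coordinated per cluster, so a failure rolls back only the failed process's cluster rather than the whole grid, which is exactly the "number of processes in rollbacks" metric evaluated above.

```python
def processes_to_rollback(clusters, failed):
    """With cluster-local checkpointing, only the members of the failed
    process's cluster roll back, not the whole grid (hypothetical model)."""
    for cluster in clusters:
        if failed in cluster:
            return sorted(cluster)
    raise ValueError("unknown process id")

# Nine grid processes partitioned into three clusters (made-up layout):
clusters = [{0, 1, 2}, {3, 4}, {5, 6, 7, 8}]
print(processes_to_rollback(clusters, 4))  # only cluster {3, 4} rolls back
```

With a single global checkpoint, the same failure would force all nine processes to roll back; the cluster partition bounds both the rollback set and the recovery time.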
Procedia PDF Downloads 342
1682 Frontal Oscillatory Activity and Phase–Amplitude Coupling during Chan Meditation
Authors: Arthur C. Tsai, Chii-Shyang Kuo, Vincent S. C. Chien, Michelle Liou, Philip E. Cheng
Abstract:
Meditation enhances mental abilities and is an antidote to anxiety. However, very little is known about the brain mechanisms and cortico-subcortical interactions underlying meditation-induced anxiety relief. In this study, the changes in phase-amplitude coupling (PAC), in which the amplitude of the beta frequency band is modulated in phase with the delta rhythm, were investigated after eight weeks of meditation training. The study hypothesized that through concentrated but relaxed mental training, the delta-beta coupling in the frontal regions is attenuated. The delta-beta coupling analysis was applied within and between maximally independent component sources returned by the extended infomax independent component analysis (ICA) algorithm on the continuous EEG data recorded during meditation. A unique meditative concentration task of relaxing body and mind was used, with a constant level of moderate mental effort, so as to approach an 'emptiness' meditative state. A pre-test/post-test control group design was used in this study. To evaluate cross-frequency phase-amplitude coupling of the component sources, the modulation index (MI) was estimated, together with circular phase statistics. Our findings reveal that a significant delta-beta decoupling was observed in a set of frontal regions bilaterally. In addition, the beta frequency band of the prefrontal component was amplitude-modulated in phase with the delta rhythm of the medial frontal component.
Keywords: phase-amplitude coupling, ICA, meditation, EEG
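A common way to compute the modulation index mentioned above is the Tort-style KL-divergence measure on phase-binned amplitudes; the sketch below shows that construction on a synthetic signal (assumed delta and sampling rates, not the study's EEG), where a 2 Hz phase visibly modulates an amplitude envelope.

```python
import numpy as np

def modulation_index(phase, amplitude, n_bins=18):
    """Tort-style modulation index: KL divergence of the phase-binned mean
    amplitude distribution from the uniform distribution, normalised by
    log(n_bins), so 0 = no coupling and larger values = stronger PAC."""
    bins = np.floor((phase + np.pi) / (2 * np.pi) * n_bins).astype(int)
    bins = np.clip(bins, 0, n_bins - 1)
    mean_amp = np.array([amplitude[bins == b].mean() for b in range(n_bins)])
    p = mean_amp / mean_amp.sum()
    return float(np.sum(p * np.log(p * n_bins)) / np.log(n_bins))

# Synthetic example: a 2 Hz "delta" phase at 250 Hz sampling, with a "beta"
# amplitude envelope either locked to that phase or flat.
t = np.arange(0, 20, 1 / 250)
phase = (2 * np.pi * 2.0 * t) % (2 * np.pi) - np.pi
coupled_amp = 1.0 + 0.8 * np.cos(phase)   # amplitude locked to delta phase
flat_amp = np.ones_like(phase)            # no coupling

print(modulation_index(phase, coupled_amp))  # clearly positive
print(modulation_index(phase, flat_amp))     # ~0
```

In practice the phase and amplitude time series would come from bandpass-filtered, Hilbert-transformed component activations rather than from analytic signals as here.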
Procedia PDF Downloads 428
1681 Solving the Pseudo-Geometric Traveling Salesman Problem with the “Union Husk” Algorithm
Authors: Boris Melnikov, Ye Zhang, Dmitrii Chaikovskii
Abstract:
This study explores the pseudo-geometric version of the extensively researched Traveling Salesman Problem (TSP), proposing a novel generalization of existing algorithms which are traditionally confined to the geometric version. By adapting the "onion husk" method and introducing auxiliary algorithms, this research fills a notable gap in the existing literature. Through computational experiments using randomly generated data, several metrics were analyzed to validate the proposed approach's efficacy. Preliminary results align with expected outcomes, indicating a promising advancement in TSP solutions.
Keywords: optimization problems, traveling salesman problem, heuristic algorithms, “onion husk” algorithm, pseudo-geometric version
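The convex-layer decomposition behind "onion husk" style heuristics can be sketched as follows (our illustration under the usual definition of convex layers; the authors' layer-merging and auxiliary algorithms are elided):

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def onion_layers(points):
    """Peel convex hulls repeatedly; each 'husk' becomes one tour fragment."""
    remaining = list(points)
    layers = []
    while remaining:
        if len(remaining) <= 3:
            layers.append(remaining)
            break
        hull = convex_hull(remaining)
        layers.append(hull)
        hull_set = set(hull)
        remaining = [p for p in remaining if p not in hull_set]
    return layers

# A square with one interior point peels into two layers.
square_plus_center = [(0, 0), (0, 2), (2, 0), (2, 2), (1, 1)]
layers = onion_layers(square_plus_center)
```

A tour is then assembled by stitching the layers together; the pseudo-geometric variant perturbs the distance matrix away from pure Euclidean distances, which is why a layer-based construction needs the auxiliary repair steps the abstract mentions.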
Procedia PDF Downloads 208
1680 Inspiring Woman: The Emotional Intelligence Leadership of Khadijah Bint Khuwaylid
Authors: Eman S. Soliman, Sana Hawamdeh, Najmus S. Mahfooz
Abstract:
Purpose: The purpose of this paper was to examine various components of applied emotional intelligence as demonstrated in the leadership style of Khadijah Bint Khuwaylid in pre- and post-Islamic society. Methodology: The research used a qualitative research method, specifically historical and ethnographic techniques. Data collection included both primary and secondary sources. Data from sources were analyzed to document the use of emotionally intelligent leadership behaviors throughout Khadijah Bint Khuwaylid's leadership experience from 596 A.D. to 621 A.D. Findings: The study demonstrates the four cornerstones of emotional intelligence, which are self-awareness, self-management, social awareness and relationship management. Applying them to Khadijah Bint Khuwaylid's leadership style reveals that she possessed key behavioral competences in the form of emotional self-awareness, self-confidence, adaptability, empathy and influence. Conclusions: Khadijah Bint Khuwaylid serves as a historical model of effective leadership that included the use of emotional intelligence in her leadership behavior. The engagement of the affective portion of the brain created a successful leadership style that can be learned by present-day and future leaders. The recommendation for future leaders is to include emotional self-awareness, self-confidence, adaptability, empathy and influence as components of leadership. This will then demonstrate in a leader a basic knowledge and understanding of feelings, the keenness to be emotionally open with others, the ability to prototype beliefs and values, and the use of emotions in future communications, vision and progress.
Keywords: emotional intelligence, leadership, Khadijah Bint Khuwaylid, women
Procedia PDF Downloads 276
1679 EEG Neurofeedback Training – Healing the Wounded Brain
Authors: Jamuna Rajeswaran
Abstract:
Over the past two decades, India, with a population of more than a billion, has been passing through a major socio-demographic and epidemiological transition, with consequent changes in the health scenario. Traumatic brain injury (TBI) constitutes a significant burden on health care resources in India, and its impact on a person and family can be devastating. Patients with TBI experience persistent cognitive deficits and emotional changes, which contribute to the disruption of life activities. Recovery from TBI can be maximized by appropriate rehabilitation. Neurofeedback is an emerging neuroscience-based clinical application. Sixty patients were recruited for this study after obtaining informed consent. The Rivermead Head Injury Follow-up Questionnaire, the Rivermead Post Concussion Symptoms Questionnaire and a Visual Analog Scale were used to assess the behavioral changes and symptomatology associated with TBI. Neuropsychological assessment was carried out using the NIMHANS neuropsychological battery 2004. Patients were allocated at random to an intervention group and a waitlist group, with 30 patients in each group. The intervention group received neurofeedback training (NFT), and the waitlist group did not receive any treatment during this phase. Patients were given 20 sessions of NFT and were trained on the O1 and O2 channels for alpha-theta training. Each session was of 40 minutes duration, with 5-6 sessions per week. The post-training assessment was carried out for the intervention group after 20 sessions of NFT; the waitlist group underwent assessment after one month. Results showed that neurofeedback training is effective in ameliorating deficits in cognitive functions and quality of life in patients with TBI. Improvements were corroborated by the clinical interview with patients and significant others post NFT.
Keywords: assessment, rehabilitation, cognition, EEG neurofeedback
Procedia PDF Downloads 265
1678 Application of Wavelet Based Approximation for the Solution of Partial Integro-Differential Equation Arising from Viscoelasticity
Authors: Somveer Singh, Vineet Kumar Singh
Abstract:
This work contributes a numerical method based on Legendre wavelet approximation for the treatment of partial integro-differential equations (PIDEs). Operational matrices of Legendre wavelets reduce the solution of the PIDE to a system of algebraic equations. Some useful results concerning the computational order of convergence and the error estimates associated with the suggested scheme are presented. Illustrative examples are provided to show the effectiveness and accuracy of the proposed numerical method.
Keywords: Legendre wavelets, operational matrices, partial integro-differential equation, viscoelasticity
Procedia PDF Downloads 450
1677 A Hebbian Neural Network Model of the Stroop Effect
Authors: Vadim Kulikov
Abstract:
The classical Stroop effect is the phenomenon that it takes more time to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, there have been many variations of the experiment revealing various mechanisms behind semantic, attentional, behavioral and perceptual processing. The Stroop task is known to exhibit asymmetry: reading the words out loud is hardly dependent on the ink color, but naming the ink color is significantly influenced by incongruent words. This asymmetry is reversed if, instead of naming the color, one has to point at a corresponding color patch. Other debated aspects are the notion of automaticity and how much of the effect is due to semantic interference and how much to response-stage interference. Is automaticity a continuous or an all-or-none phenomenon? There are many models and theories in the literature tackling these questions, which will be discussed in the presentation. None of them, however, seems to capture all the findings at once. A computational model is proposed which is based on the philosophical idea, developed by the author, that the mind operates as a collection of different information processing modalities, such as different sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory, where ‘framework’ attempts to generalize the concepts of modality, perspective and ‘point of view’. The architecture of this computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production and attention selection modalities. In experiments where button pressing or pointing is required, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero.
The network is trained using Hebbian learning to establish connections (corresponding to ‘coherence’ in framework theory) between these different modalities. The amount of data fed into the network is meant to mimic the amount of practice a human encounters; in particular, it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors into spoken color names. After the training, the network performs the Stroop task. The RTs are measured in a canonical way, as these are continuous-time recurrent neural networks (CTRNNs). The above-described aspects of the Stroop phenomenon, along with many others, are replicated. The model is similar to some existing connectionist models but, as will be discussed in the presentation, has many advantages: it predicts more data, and the architecture is simpler and biologically more plausible.
Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop
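A toy Hebbian sketch of the pathway-strength asymmetry (ours, drastically simplified relative to the model described: static one-shot activations instead of CTRNN dynamics, and hypothetical two-color stimuli):

```python
import numpy as np

RED, GREEN = 0, 1

def one_hot(i, n=2):
    v = np.zeros(n)
    v[i] = 1.0
    return v

lr = 0.01
W_text = np.zeros((2, 2))    # reading pathway: printed word -> spoken name
W_color = np.zeros((2, 2))   # naming pathway:  ink color    -> spoken name

# Hebbian training: reading is assumed 5x more practiced than color naming.
for _ in range(100):
    for word in (RED, GREEN):
        W_text += lr * np.outer(one_hot(word), one_hot(word))
for _ in range(20):
    for ink in (RED, GREEN):
        W_color += lr * np.outer(one_hot(ink), one_hot(ink))

# Incongruent Stroop stimulus: the word "RED" printed in green ink.
response = W_text @ one_hot(RED) + W_color @ one_hot(GREEN)
```

The over-trained reading pathway dominates (the response favors 'red' even though the ink is green); this is the asymmetry a recurrent model converts into longer settling times, i.e. longer color-naming RTs.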
Procedia PDF Downloads 270
1676 Comparative and Combined Toxicity of NiO and Mn₃O₄ Nanoparticles as Assessed in vitro and in vivo
Authors: Ilzira A. Minigalieva, Tatiana V. Bushueva, Eleonore Frohlich, Vladimir Panov, Ekaterina Shishkina, Boris A. Katsnelson
Abstract:
Background: The overwhelming majority of experimental studies in the field of metal nanotoxicology have been performed on cultures of established cell lines, with very few researchers focusing on animal experiments, while a juxtaposition of the conclusions inferred from these two types of research is blatantly lacking. The least studied aspect of this problem relates to characterizing and predicting the combined toxicity of metallic nanoparticles. Methods: Comparative and combined toxic effects of purposefully prepared spherical NiO and Mn₃O₄ nanoparticles (mean diameters 16.7 ± 8.2 nm and 18.4 ± 5.4 nm, respectively) were estimated on cultures of human cell lines: MRC-5 fibroblasts, THP-1 monocytes and SH-SY5Y neuroblastoma cells, as well as on the latter two lines differentiated to macrophages and neurons, respectively. The combined cytotoxicity was mathematically modeled using the response surface methodology. Results: The comparative assessment of the studied NPs' unspecific toxicity previously obtained in vivo was satisfactorily reproduced by the present in vitro tests. However, with respect to the manganese-specific brain damage demonstrated by us in animal experiments with the same NPs, the testing on neuronal cell culture showed only a certain enhancing effect of Mn₃O₄-NPs on the toxic action of NiO-NPs, while the role of the latter prevailed. Conclusion: From the point of view of preventive toxicology, the experimental modeling of the combined toxicity of metallic NPs on cell cultures can give unreliable predictions of in vivo effects.
Keywords: manganese oxide, nickel oxide, nanoparticles, in vitro toxicity
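The response-surface step can be sketched as an ordinary least-squares fit of a full quadratic in the two doses (an illustration under assumed variable names and synthetic data, not the authors' fitted model):

```python
import numpy as np

def fit_quadratic_surface(d1, d2, resp):
    """Fit resp ~ b0 + b1*d1 + b2*d2 + b3*d1^2 + b4*d2^2 + b5*d1*d2."""
    X = np.column_stack([np.ones_like(d1), d1, d2, d1**2, d2**2, d1 * d2])
    beta, *_ = np.linalg.lstsq(X, resp, rcond=None)
    return beta

# Synthetic 5x5 dose grid for two agents (say, NiO and Mn3O4 dose levels).
g = np.arange(5.0)
d1, d2 = [a.ravel() for a in np.meshgrid(g, g)]
true_beta = np.array([1.0, 0.5, 0.3, -0.02, -0.01, 0.15])  # 0.15: interaction
design = np.column_stack([np.ones_like(d1), d1, d2, d1**2, d2**2, d1 * d2])
resp = design @ true_beta

beta_hat = fit_quadratic_surface(d1, d2, resp)
```

The sign and magnitude of the recovered interaction coefficient (the d1*d2 term) is what characterizes the combined action as additive, synergistic or antagonistic.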
Procedia PDF Downloads 297
1675 Cuckoo Search Optimization for Black Scholes Option Pricing
Authors: Manas Shah
Abstract:
The Black-Scholes option pricing model is one of the most important concepts in the modern world of computational finance. However, its practical use can be challenging, as one of the input parameters must be estimated: the implied volatility of the underlying security. The more precisely this value is estimated, the more accurate the corresponding estimate of the theoretical option price will be. Here, we present a novel model based on Cuckoo Search Optimization (CS), which finds more precise estimates of implied volatility than Particle Swarm Optimization (PSO) and Genetic Algorithms (GA).
Keywords: Black-Scholes model, cuckoo search optimization, particle swarm optimization, genetic algorithm
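A minimal sketch of the approach (ours, with assumed hyperparameters: the Black-Scholes call formula is standard, and the cuckoo-search loop follows the usual Yang-Deb scheme with Lévy flights and nest abandonment):

```python
import math
import numpy as np

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, T, sigma):
    """Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def levy_step(rng, beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)
    u = rng.normal(0, sigma_u)
    v = rng.normal(0, 1)
    return u / abs(v) ** (1 / beta)

def implied_vol_cs(price, S, K, r, T, n_nests=15, n_iter=200, pa=0.25,
                   lo=0.01, hi=1.0, seed=0):
    """Cuckoo search over sigma, minimizing the squared pricing error."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(lo, hi, n_nests)
    def err(s):
        return (bs_call(S, K, r, T, s) - price) ** 2
    fit = np.array([err(s) for s in nests])
    for _ in range(n_iter):
        best = nests[np.argmin(fit)]
        for i in range(n_nests):           # Levy flights toward the best nest
            cand = np.clip(nests[i] + 0.1 * levy_step(rng) * (nests[i] - best),
                           lo, hi)
            if err(cand) < fit[i]:
                nests[i], fit[i] = cand, err(cand)
        worst = np.argsort(fit)[-int(pa * n_nests):]   # abandon worst nests
        nests[worst] = rng.uniform(lo, hi, len(worst))
        fit[worst] = [err(s) for s in nests[worst]]
    return nests[np.argmin(fit)]

# "Observed" market price generated at sigma = 0.2, then recovered.
market_price = bs_call(100, 100, 0.05, 1.0, 0.2)
sigma_hat = implied_vol_cs(market_price, 100, 100, 0.05, 1.0)
```

Because the call price is strictly increasing in volatility (positive vega), the squared-error objective is unimodal in sigma, so the search reliably converges to the implied volatility.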
Procedia PDF Downloads 453
1674 Effect of the Applied Bias on Miniband Structures in Dimer Fibonacci InAs/Ga1-xInxAs Superlattices
Authors: Z. Aziz, S. Terkhi, Y. Sefir, R. Djelti, S. Bentata
Abstract:
The effect of a uniform electric field across multibarrier systems (InAs/InxGa1-xAs) is exhaustively explored by a computational model using the exact Airy function formalism and the transfer-matrix technique. In the case of the biased DFHBSL structure, a strong reduction in the transmission properties was observed, and the width of the miniband structure decreases linearly with the increase of the applied bias. This is due to the confinement of the states in the miniband structure, which becomes increasingly important (Wannier-Stark effect).
Keywords: dimer Fibonacci height barrier superlattices, singular extended state, exact Airy function, transfer matrix formalism
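The transfer-matrix technique can be illustrated in the zero-bias limit, where plane waves replace the Airy functions (our simplification: a single rectangular barrier rather than the biased superlattice):

```python
import numpy as np

# Units with hbar = 1 and m = 1/2, so the wavevector is k = sqrt(E - V).
def wavevector(E, V):
    return np.lib.scimath.sqrt(E - V)   # complex (evanescent) under the barrier

def boundary_matrix(k, x):
    """Columns match psi = A e^{ikx} + B e^{-ikx} and its derivative at x."""
    ep, em = np.exp(1j * k * x), np.exp(-1j * k * x)
    return np.array([[ep, em], [1j * k * ep, -1j * k * em]])

def transmission(E, V, a):
    """|t|^2 through one rectangular barrier of height V and width a."""
    k_out, k_in = wavevector(E, 0.0), wavevector(E, V)
    M = (np.linalg.inv(boundary_matrix(k_out, 0.0)) @ boundary_matrix(k_in, 0.0)
         @ np.linalg.inv(boundary_matrix(k_in, a)) @ boundary_matrix(k_out, a))
    return float(abs(1.0 / M[0, 0]) ** 2)

T = transmission(E=1.0, V=2.0, a=1.0)
```

Chaining one such matrix per layer (with Airy-function solutions inside each linearly biased segment) gives the transmission spectrum of the full superlattice; the minibands appear as bands of near-unity transmission.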
Procedia PDF Downloads 307
1673 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once documents have been digitized through the scanning system and binarization has been achieved, document skew correction is required before further image analysis. Research efforts have been put into this area, with algorithms developed to eliminate document skew. Skew angle correction algorithms can be compared based on performance criteria, the most important of which are the accuracy of skew angle detection, the range of skew angles detected, the speed of processing the image, the computational complexity and, consequently, the memory space used. The standard Hough Transform has successfully been implemented for text document skew angle estimation. However, the accuracy of the standard Hough Transform algorithm depends largely on how fine a step size is used for the angle; finer steps consume more time and memory space, especially where the number of pixels is considerably large. Whenever the Hough transform is used, there is always a tradeoff between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough Transform algorithm resolves the contradiction between memory space, running time and accuracy. Our algorithm starts with a first step of angle estimation accurate to zero decimal places using the standard Hough Transform algorithm, achieving minimal running time and space but lacking relative accuracy.
Then, to increase accuracy, supposing the estimated angle found using the basic Hough algorithm is x degrees, we rerun the basic algorithm over a narrow range around x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. The procedure of our skew estimation and correction algorithm for text images is implemented using MATLAB. The memory space estimates and processing times are also tabulated, with skew angles assumed to lie between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithm, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.
Keywords: Hough transform, skew detection, skew angle, skew correction, text document
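The coarse-to-fine angle search can be sketched as follows (our illustration, with a projection-profile alignment score standing in for the full Hough accumulator):

```python
import numpy as np

def alignment_score(xs, ys, theta_deg, bins=200):
    """Project pixels onto the normal of angle theta; peaked histograms
    (large sum of squared bin counts) mean the text lines are aligned."""
    th = np.deg2rad(theta_deg)
    rho = ys * np.cos(th) - xs * np.sin(th)
    hist, _ = np.histogram(rho, bins=bins)
    return float((hist.astype(float) ** 2).sum())

def estimate_skew(xs, ys):
    """Coarse integer-degree pass, then a 0.1-degree pass around the winner."""
    coarse = np.arange(-45.0, 45.1, 1.0)
    best = coarse[np.argmax([alignment_score(xs, ys, t) for t in coarse])]
    fine = np.arange(best - 1.0, best + 1.05, 0.1)
    return fine[np.argmax([alignment_score(xs, ys, t) for t in fine])]

# Synthetic "document": three text baselines skewed by 5 degrees.
x = np.tile(np.arange(100.0), 3)
offsets = np.repeat([0.0, 30.0, 60.0], 100)
y = x * np.tan(np.deg2rad(5.0)) + offsets

skew = estimate_skew(x, y)
```

The coarse pass costs one evaluation per degree and the fine pass only 21 more, instead of the 900 evaluations a flat 0.1-degree sweep of the same range would need; iterating the refinement adds one decimal place of accuracy per pass.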
Procedia PDF Downloads 159
1672 A Questionnaire Survey Reviewing Radiographers' Knowledge of Computed Tomography Exposure Parameters
Authors: Mohammad Rawashdeh, Mark McEntee, Maha Zaitoun, Mostafa Abdelrahman, Patrick Brennan, Haytham Alewaidat, Sarah Lewis, Charbel Saade
Abstract:
Despite the tremendous advancements that Computed Tomography (CT) has generated in the field of diagnosis, concerns have been raised about the potential cancer induction risk from CT because of its exponentially increased use in medicine. This study aims at investigating the application and knowledge of practicing radiographers in Jordan regarding CT radiation. In order to collect the primary data of this study, a questionnaire was designed and distributed via social media using a snowball sampling method. The respondents (n=54) answered 36 questions covering their demographic information, knowledge about Diagnostic Reference Levels (DRLs), CT exposure, and the adaptation of exposure for pediatric patients. The educational level of the respondents was either a diploma degree (35.2%) or a bachelor's degree (64.8%). The results of this study indicate a good level of general knowledge among radiographers about the relationship between image quality, exposure parameters, and patient dose. The level of knowledge related to DRLs was poor: fewer than 7.4 percent of the sample were able to give specific values for a number of common anatomical regions, including the abdomen, brain, and chest. Overall, Jordanian radiographers need to gain more knowledge about the expected dose levels when applying good practice. The need for additional education on DRLs, or the inclusion of DRLs in educational programs, is highlighted.
Keywords: computed tomography, CT scan, DRLs, exposure parameters, image quality, radiation dose
Procedia PDF Downloads 145