Search results for: computational cognitive model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19353

18663 Earthquake Forecasting Procedure Due to Diurnal Stress Transfer by the Core to the Crust

Authors: Hassan Gholibeigian, Kazem Gholibeigian

Abstract:

In this paper, our goal is the determination of loading versus time in the crust. To this end, we present a computational procedure that produces a cumulative strain energy time profile, which can be used to predict the approximate location and time of the next major earthquake (M > 4.5) along a specific fault and which we believe is more accurate than many of the methods presently in use. After a short review of the research currently going on in the area of earthquake analysis and prediction, earthquake mechanisms in both the jerk and sequence-earthquake directions are discussed. Our computational procedure is then presented using the differential equations of equilibrium that govern the nonlinear dynamic response of a system of finite elements, modified with an extra term to account for the jerk produced during the quake. We then employ the von Mises model for the stress-strain relationship in our calculations, modified with the addition of an extra term to account for thermal effects. For the calculation of the strain energy, the idea of the Pulsating Mantle Hypothesis (PMH) is used. This hypothesis, in brief, states that the mantle is under diurnal cyclic pulsating loads due to the unbalanced gravitational attraction of the sun and the moon. The Denali fault is briefly discussed as a case study, and the cumulative strain energy is graphically represented versus time. Finally, based on some hypothetical earthquake data, the results are verified.
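To make the idea of a cumulative strain energy time profile concrete, here is a deliberately simple sketch: energy accumulates at a rate modulated by a 24-hour sinusoid standing in for the diurnal pulsating load, and the first threshold crossing marks a candidate event time. The rates, amplitude, and threshold are illustrative placeholders, not outputs of the paper's finite element formulation:

```python
import math

def cumulative_strain_energy(times_h, threshold, base_rate=1.0, diurnal_amp=0.3):
    """Toy cumulative strain-energy profile under a diurnal cyclic load.

    Energy accumulates at base_rate modulated by a 24-hour sinusoid (a
    stand-in for the pulsating-mantle loading); returns the profile and the
    first time the accumulated energy exceeds threshold (None if never).
    """
    energy, profile, crossing = 0.0, [], None
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        rate = base_rate * (1 + diurnal_amp * math.sin(2 * math.pi * times_h[i] / 24.0))
        energy += rate * dt  # rectangle-rule accumulation
        profile.append(energy)
        if crossing is None and energy >= threshold:
            crossing = times_h[i]
    return profile, crossing
```

In a real application the rate term would come from the finite element solution rather than a fixed sinusoid; the structure of the accumulation and threshold test would be the same.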

Keywords: pulsating mantle hypothesis, inner core’s dislocation, outer core’s bulge, constitutive model, transient hydro-magneto-thermo-mechanical load, diurnal stress, jerk, fault behaviour

Procedia PDF Downloads 276
18662 Modified Model for UV-Laser Corneal Ablation

Authors: Salah Hassab Elnaby, Omnia Hamdy, Aziza Ahmed Hassan, Salwa Abdelkawi, Ibrahim Abdelhalim

Abstract:

Laser corneal reshaping has been proposed as a successful treatment for many refraction disorders. However, some physical and chemical aspects of the laser's interaction with corneal tissue are still not fully explained. Therefore, different computational and mathematical models have been implemented to predict the depth of the ablated channel and to calculate the ablation threshold and the local temperature rise. In the current paper, we present a modified model that aims to answer some of the open questions about the ablation threshold, the ablation rate, and the physical and chemical mechanisms of the action. The proposed model consists of three parts. The first part deals with possible photochemical reactions between the incident photons and various components of the cornea (collagen, water, etc.). Such photochemical reactions may end in photo-ablation or merely in the electronic excitation of molecules. A chemical reaction is then responsible for the ablation threshold. Finally, another chemical reaction produces fragments that can be cleared away. The model takes all processes into account simultaneously, with different probabilities. Moreover, the effect of applying different laser wavelengths studied previously, namely the common excimer laser (193 nm) and the solid-state lasers (213 nm and 266 nm), has been investigated. Despite the success and ubiquity of the ArF laser, the presented results reveal that a carefully designed 213-nm laser gives the same results with fewer operational drawbacks. Moreover, the use of a mode-locked laser could also decrease the risk of heat generation and diffusion.

Keywords: UV lasers, mathematical model, corneal ablation, photochemical ablation

Procedia PDF Downloads 88
18661 Development of a Reduced Multicomponent Jet Fuel Surrogate for Computational Fluid Dynamics Application

Authors: Muhammad Zaman Shakir, Mingfa Yao, Zohaib Iqbal

Abstract:

This study proposes four jet fuel surrogates (S1, S2, S3, and S4) based on a careful selection of seven large hydrocarbon fuel components, ranging from C₉ to C₁₆, of higher molecular weight and higher boiling point, matching the molecular size distribution of actual jet fuel. The surrogates were composed of seven components: n-propylcyclohexane (C₉H₁₈), n-propylbenzene (C₉H₁₂), n-undecane (C₁₁H₂₄), n-dodecane (C₁₂H₂₆), n-tetradecane (C₁₄H₃₀), n-hexadecane (C₁₆H₃₄), and iso-cetane (iC₁₆H₃₄). The skeletal jet fuel surrogate reaction mechanism was developed by two approaches. The first is based on a decoupling methodology, describing a C₄-C₁₆ skeletal mechanism for the oxidation of heavy hydrocarbons together with a detailed H₂/CO/C₁ mechanism for predicting the oxidation of small hydrocarbons. The combined skeletal jet fuel surrogate mechanism was compressed to 128 species and 355 reactions and can thereby be used in computational fluid dynamics (CFD) simulations. Extensive validation was performed for the individual single components, including ignition delay time, species concentration profiles, and laminar flame speed, against various fundamental experiments under a wide range of operating conditions, as well as for their blended mixtures. Among the surrogates, S1 has been validated most extensively against experimental data from shock tubes, rapid compression machines, jet-stirred reactors, counterflow flames, and premixed laminar flames over wide ranges of temperature (700-1700 K), pressure (8-50 atm), and equivalence ratio (0.5-2.0) to capture the properties of the target fuel Jet-A, while the remaining three surrogates, S2, S3, and S4, have been validated against shock-tube ignition delay times only, to capture the ignition characteristics of the target fuels S-8 & GTL, IPK, and RP-3, respectively.
Based on the newly proposed HyChem model, another four surrogates with similar components and compositions were developed, and the same validation data as for the previously developed surrogates were used, but under high-temperature conditions only. After testing the mechanism prediction performance of the surrogates developed by the decoupling methodology, a comparison was made with the results of the surrogates developed with the HyChem model. All four surrogates proposed in this study showed good agreement with the experimental measurements, and the study concludes that, like the decoupling methodology, the HyChem model has great potential for developing oxidation mechanisms for heavy alkanes because of its applicability, simplicity, and compactness.

Keywords: computational fluid dynamics, decoupling methodology, HyChem, jet fuel, surrogate, skeletal mechanism

Procedia PDF Downloads 136
18660 Dynamic Soil Structure Interaction in Buildings

Authors: Shreya Thusoo, Karan Modi, Ankit Kumar Jha, Rajesh Kumar

Abstract:

Since the evolution of computational tools and simulation software, there has been a considerable increase in research on Soil-Structure Interaction (SSI) aimed at decreasing computational time and increasing the accuracy of results. To aid the designer with a proper understanding of the response of structures in different soil types, this paper compares the deformation, shear stress, acceleration, and other parameters of a multi-storey building for a specific input ground motion using the Response Spectrum Analysis (RSA) method. The responses of models of different heights have been compared across different soil types. The finite element simulation software ANSYS has been used for all computations. Overall, a higher response is observed with SSI, and it increases with decreasing soil stiffness.

Keywords: soil-structure interaction, response spectrum analysis, finite element method, multi-storey buildings

Procedia PDF Downloads 480
18659 Finite Element Analysis of the Anaconda Device: Efficiently Predicting the Location and Shape of a Deployed Stent

Authors: Faidon Kyriakou, William Dempster, David Nash

Abstract:

Abdominal Aortic Aneurysm (AAA) is a major life-threatening pathology for which modern approaches reduce the need for open surgery through the use of stenting. The success of stenting, though, is sometimes jeopardized by the final position of the stent graft inside the human artery, which may result in migration, endoleaks, or blood flow occlusion. Herein, a finite element (FE) model of the commercial medical device AnacondaTM (Vascutek, Terumo) has been developed and validated in order to create a numerical tool able to provide useful clinical insight before the surgical procedure takes place. The AnacondaTM device consists of a series of NiTi rings sewn onto woven polyester fabric, a structure that despite its column stiffness is flexible enough to be used in very tortuous geometries. For the purposes of this study, an FE model of the device was built in Abaqus® (version 6.13-2) with a combination of beam, shell, and surface elements; these building blocks were chosen to keep the computational cost to a minimum. The numerical model was validated by comparing the deployed position of a full stent graft device inside a constructed AAA with a duplicate set-up in Abaqus®. Specifically, an AAA geometry was built in CAD software and included regions of both high and low tortuosity. Subsequently, the CAD model was 3D printed into a transparent aneurysm, and a stent was deployed in the lab following the steps of the clinical procedure. Images on the frontal and sagittal planes of the experiment allowed comparison with the results of the numerical model. By overlapping the experimental and computational images, the mean and maximum distances between the rings of the two models were measured in the longitudinal and transverse directions, and a 5 mm upper bound was set as a limit commonly used by clinicians when working with simulations.
The two models showed very good agreement in their spatial positioning, especially in the less tortuous regions. As a result, and despite the inherent uncertainties of a surgical procedure, the FE model allows confidence that the final position of the stent graft, when deployed in vivo, can be predicted with significant accuracy. Moreover, the numerical model ran in just a few hours, an encouraging result for applications in the clinical routine. In conclusion, the efficient modelling of a complicated structure combining thin scaffolding and fabric has been demonstrated to be feasible, and the ability to predict the location of each stent ring, as well as the global shape of the graft, has been shown. This can allow surgeons to better plan their procedures and medical device manufacturers to optimize their designs. The current model can further be used as a starting point for patient-specific CFD analysis.

Keywords: AAA, efficiency, finite element analysis, stent deployment

Procedia PDF Downloads 191
18658 Implementation and Validation of a Damage-Friction Constitutive Model for Concrete

Authors: L. Madouni, M. Ould Ouali, N. E. Hannachi

Abstract:

Two constitutive models for concrete are available in ABAQUS/Explicit, the Brittle Cracking Model and the Concrete Damaged Plasticity Model, and their suitability and limitations are well known. The aim of the present paper is to implement a damage-friction concrete constitutive model and to evaluate the performance of this model by comparing the predicted response with experimental data. The constitutive formulation of this material model is reviewed. In order to have consistent results, the parameter identification and calibration for the model have been performed. Several numerical simulations are presented in this paper, whose results allow for validating the capability of the proposed model for reproducing the typical nonlinear performances of concrete structures under different monotonic and cyclic load conditions. The results of the evaluation will be used for recommendations concerning the application and further improvements of the investigated model.

Keywords: Abaqus, concrete, constitutive model, numerical simulation

Procedia PDF Downloads 364
18657 The University of California at Los Angeles-Young Autism Project: A Systematic Review of Replication Studies

Authors: Michael Nicolosi, Karola Dillenburger

Abstract:

The University of California at Los Angeles-Young Autism Project (UCLA-YAP) provides one of the best-known and most researched comprehensive applied behavior analysis-based intervention models for young children on the autism spectrum. This paper reports a systematic literature review of replication studies over more than 30 years. The data show that the relatively high-intensity UCLA-YAP model can be greatly beneficial for children on the autism spectrum, particularly with regard to their cognitive functioning and adaptive behavior. This review concludes that, while more research is always welcome, the impact of the UCLA-YAP model on autism interventions is justified by more than 30 years of outcome evidence.

Keywords: ABA, applied behavior analysis, autism, California at Los Angeles Young Autism project, intervention, Lovaas, UCLA-YAP

Procedia PDF Downloads 103
18656 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphical processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) model. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow is computed on the D3Q19 lattice, while the particle model employs the D3Q27 lattice. Particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, taking all external forces into account. Previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes these deficiencies and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re = 200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature.
The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re = 10,000. The simulations were conducted for L/D = 2, 4, and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, an in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup of about 350 times over the serial code running on a single CPU.
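The node-to-node probabilistic transport rule can be illustrated with a serial, one-dimensional toy version. The actual model uses the D3Q27 lattice on a GPU; the hop probability here (proportional to the local velocity) is a simplification of the bulk-density-and-velocity rule described in the abstract:

```python
import random

def step_particles(counts, velocity, dt=1.0, dx=1.0, seed=None):
    """One probabilistic CA transport step on a 1-D lattice.

    counts[i]  : number of particles at node i
    velocity[i]: local fluid x-velocity at node i
    A particle hops to the downstream neighbour with probability
    p = |u|*dt/dx (capped at 1), so the ensemble average mimics advection.
    """
    rng = random.Random(seed)
    n = len(counts)
    new = [0] * n
    for i in range(n):
        u = velocity[i]
        p = min(abs(u) * dt / dx, 1.0)
        target = i + (1 if u > 0 else -1)
        for _ in range(counts[i]):
            if 0 <= target < n and rng.random() < p:
                new[target] += 1  # hop downstream
            else:
                new[i] += 1       # stay put (or blocked at a wall)
    return new
```

Because each particle either hops or stays, the total particle count is conserved at every step, which is the property that lets the CA model track bulk density without a Lagrangian trajectory per particle.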

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 207
18655 Numerical Study on the Performance of Upgraded Victorian Brown Coal in an Ironmaking Blast Furnace

Authors: Junhai Liao, Yansong Shen, Aibing Yu

Abstract:

A 3D numerical model is developed to simulate the complicated in-furnace combustion phenomena in the lower part of an ironmaking blast furnace (BF) when pulverized coal injection (PCI) technology is used to reduce the consumption of relatively expensive coke. The computational domain covers the blowpipe, tuyere, raceway, and coke bed of the BF. The model is validated against experimental data in terms of gaseous compositions and coal burnout. Parameters such as coal properties and key operational variables play an important role in the performance of coal combustion, and their diverse effects on different combustion characteristics are examined in the domain in terms of gas compositions, temperature, and burnout. The heat generated by the combustion of upgraded Victorian brown coal is able to meet the heating requirement of a BF, hence making the injection of upgraded brown coal into a BF feasible. The model is shown to be suitable for investigating the mechanism of the PCI operation in a BF, and its predictions provide scientific insights for optimizing and controlling the PCI operation. The model also cuts the cost of investigating and understanding the comprehensive combustion phenomena of upgraded Victorian brown coal in a full-scale BF.

Keywords: blast furnace, numerical study, pulverized coal injection, Victorian brown coal

Procedia PDF Downloads 243
18654 Computational Design, Simulation, and Wind Tunnel Testing of a Stabilator for a Fixed Wing Aircraft

Authors: Kartik Gupta, Umar Khan, Mayur Parab, Dhiraj Chaudhari, Afzal Ansari

Abstract:

The report focuses on the design and simulation of a stabilator (an all-movable horizontal stabilizer) for a fixed-wing aircraft. The project involves the development of a computerized direct optimization procedure for designing an aircraft's all-movable stabilator. This procedure evaluates various design variables to synthesize an optimal stabilator that meets specific requirements, including performance, control, stability, strength, and flutter velocity constraints. The work comprises CFD (Computational Fluid Dynamics) analysis of the airfoils used in the stabilator, along with CFD analysis of the stabilizer and stabilator of the Thorp T-18 aircraft, in software such as XFLR5 and ANSYS Fluent. A comparative analysis between a stabilizer and a stabilator of equal surface area under the same environmental conditions was carried out, and the percentage of drag reduced by the stabilator for the same amount of lift generated as the stabilizer was calculated. Lastly, wind tunnel testing was performed on scaled-down models of the stabilizer and stabilator, and the results were compared with those of the CFD analysis.

Keywords: wind tunnel testing, CFD, stabilizer, stabilator

Procedia PDF Downloads 60
18653 Instructional Consequences of the Transiency of Spoken Words

Authors: Slava Kalyuga, Sujanya Sombatteera

Abstract:

In multimedia learning, written text is often transformed into spoken (narrated) text. This transient information may overwhelm the limited processing capacity of working memory and inhibit learning instead of improving it. The paper reviews recent empirical studies of the modality and verbal redundancy effects within a cognitive load framework and outlines the conditions under which negative effects of transiency may occur. According to the modality effect, textual information accompanying pictures should be presented in an auditory rather than visual form in order to engage two available channels of working memory, auditory and visual, instead of only one of them. However, some studies failed to replicate the modality effect and found differences opposite to those expected. Also, according to the multimedia redundancy effect, the same information should not be presented simultaneously in different modalities, to avoid the unnecessary cognitive load imposed by the integration of redundant sources of information. However, a few studies failed to replicate the multimedia redundancy effect as well. The transiency of the information is used to explain these controversial results.

Keywords: cognitive load, transient information, modality effect, verbal redundancy effect

Procedia PDF Downloads 380
18652 A Comparative Study on Sampling Techniques of Polynomial Regression Model Based Stochastic Free Vibration of Composite Plates

Authors: S. Dey, T. Mukhopadhyay, S. Adhikari

Abstract:

This paper presents an exhaustive comparative investigation of sampling techniques for polynomial regression model based prediction of the stochastic natural frequencies of composite plates. Both individual and combined variations of the input parameters are considered in order to map the computational time and accuracy of each modelling technique. The finite element formulation for composites is capable of dealing with both correlated and uncorrelated random input variables such as fibre parameters and material properties. The results obtained by polynomial regression (PR) using different sampling techniques are compared, and the suitability of sampling techniques such as 2ᵏ factorial design, central composite design, A-optimal design, I-optimal design, D-optimal design, Taguchi's orthogonal array design, Box-Behnken design, Latin hypercube sampling, and the Sobol sequence is illustrated. A statistical analysis of the first three natural frequencies is presented to compare the results and their performance.
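Of the sampling techniques listed, Latin hypercube sampling is easy to demonstrate end-to-end with a polynomial regression surrogate. The sketch below draws a stratified sample, fits a quadratic by least squares, and recovers the generating coefficients; it is a one-variable illustration, not the paper's multivariate composite-plate formulation:

```python
import random

def latin_hypercube(n, seed=None):
    """n stratified samples in [0, 1): one random point per equal-width stratum."""
    rng = random.Random(seed)
    samples = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(samples)  # remove the ordering induced by stratification
    return samples

def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ a + b*x + c*x^2 via the 3x3 normal equations."""
    def basis(x):
        return (1.0, x, x * x)
    A = [[sum(basis(x)[i] * basis(x)[j] for x in xs) for j in range(3)] for i in range(3)]
    rhs = [sum(basis(x)[i] * y for x, y in zip(xs, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            factor = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= factor * A[col][c]
            rhs[r] -= factor * rhs[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back substitution
        coef[r] = (rhs[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef  # [a, b, c]
```

The stratification guarantees one sample per 1/n-wide bin, which is what gives Latin hypercube sampling better space coverage than plain Monte Carlo for the same sample count.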

Keywords: composite plate, natural frequency, polynomial regression model, sampling technique, uncertainty quantification

Procedia PDF Downloads 513
18651 Application of Hydrological Engineering Centre – River Analysis System (HEC-RAS) to Estuarine Hydraulics

Authors: Julia Zimmerman, Gaurav Savant

Abstract:

This study aims to evaluate the efficacy of the U.S. Army Corps of Engineers' River Analysis System (HEC-RAS) for modeling the hydraulics of estuaries. HEC-RAS has been broadly used for a variety of riverine applications; however, it has not been widely applied to the study of circulation in estuaries. This report details the model development and validation of combined 1D/2D unsteady flow hydraulic models built in HEC-RAS for estuaries and their associated tidally influenced rivers. Two estuaries, Galveston Bay and Delaware Bay, were used as case studies. Galveston Bay, a bar-built, vertically mixed estuary, was modeled for the 2005 calendar year. Delaware Bay, a drowned river valley estuary, was modeled from October 22, 2019, to November 5, 2019. Water surface elevation was used to validate both models by comparing simulation results to gauge data from NOAA's Center for Operational Oceanographic Products and Services (CO-OPS). Simulations were run using the Diffusion Wave equations (DW), the Shallow Water equations with the Eulerian-Lagrangian method (SWE-ELM), and the Shallow Water equations with the Eulerian method (SWE-EM), and compared for both accuracy and the computational resources required. In general, the Diffusion Wave results were found to be comparable to those of the two Shallow Water equation sets while requiring less computational power. The combined 1D/2D approach was valid for study areas within the 2D flow area, with the 1D flow serving mainly as an inflow boundary condition. Within the Delaware Bay estuary, the HEC-RAS DW model ran in 22 minutes and had an average R² value of 0.94 within the 2D mesh. The Galveston Bay HEC-RAS DW model ran in 6 hours and 47 minutes and had an average R² value of 0.83 within the 2D mesh. The longer run time and lower R² for Galveston Bay can be attributed to the greater length of the modeled time frame and the greater complexity of the estuarine system.
The models did not accurately capture tidal effects within the 1D flow area.
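The validation metric quoted above, the coefficient of determination between simulated and gauged water surface elevations, is straightforward to reproduce. A minimal implementation (the sample values in the test are made up, not CO-OPS data):

```python
def r_squared(observed, simulated):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot.

    observed : gauge measurements (e.g., water surface elevations)
    simulated: model output at the same times/locations
    """
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)          # total variance
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))  # residuals
    return 1.0 - ss_res / ss_tot
```

An R² of 1.0 indicates a perfect match; the 0.94 and 0.83 reported for Delaware Bay and Galveston Bay would be averages of this quantity over gauge locations within the 2D mesh.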

Keywords: Delaware Bay, estuarine hydraulics, Galveston Bay, HEC-RAS, one-dimensional modeling, two-dimensional modeling

Procedia PDF Downloads 199
18650 Conceptual Synthesis as a Platform for Psychotherapy Integration: The Case of Transference and Overgeneralization

Authors: Merav Rabinovich

Abstract:

Background: Psychoanalytic and cognitive therapies approach problems from different points of view. In the recent decade, the integrative movement has been gaining momentum; however, little has been studied regarding the theoretical interrelationship between these therapy approaches. Method: 33 transference case studies published in peer-reviewed academic journals were coded with Luborsky's Core Conflictual Relationship Theme (CCRT) method (the components of wish, response from other (real or imagined), and response of self). CCRT analysis was conducted using the tailor-made method, a valid tool for identifying transference patterns. Rabinovich and Kacen's (2010, 2013) Relationship Between Categories (RBC) method was used to analyze the relationship between these transference patterns and the cognitive and behavioral components appearing in the psychoanalytic case studies. Results: 30 of the 33 cases (91%) were found to connect the transference themes with cognitive overgeneralization. In these cases, overgeneralizations were organized around Luborsky's transference themes of response from other and response of self. Additionally, overgeneralization was found to be an antithesis of the wish component, and the tension between them was found to be linked with powerful behavioral and emotional reactions. Conclusion: The findings indicate that thinking distortions of overgeneralization (cognitive therapy) are actual expressions of transference patterns. These findings point to a theoretical junction that can serve as a platform for clinical integration. Awareness of this junction can help therapists promote good psychotherapy outcomes by relying on the accumulated wisdom of the different therapies.

Keywords: transference, overgeneralization, theoretical integration, case-study metasynthesis, CCRT method, RBC method

Procedia PDF Downloads 142
18649 Spectrum Allocation in Cognitive Radio Using Monarch Butterfly Optimization

Authors: Avantika Vats, Kushal Thakur

Abstract:

This paper presents the problem formulation, development, and application of Monarch Butterfly Optimization (MBO), rather than a Genetic Algorithm (GA), for channel allocation in cognitive radio. The approach offers a satisfactory way to obtain the accessible spectrum for both kinds of users, i.e., primary users (PUs) and secondary users (SUs). The proposed optimization procedure is based on a nature-inspired metaheuristic algorithm. In MBO, the monarch butterfly individuals are located in two distinct lands, viz. southern Canada and the northern USA (Land 1), and Mexico (Land 2). The positions of the monarch butterflies are updated in two ways. First, offspring are generated (position updating) by the migration operator, which can be adjusted by the migration ratio. This is followed by tuning the positions of the other butterflies by means of the butterfly adjusting operator. To keep the population size unaltered and minimize fitness evaluations, the total number of butterflies newly produced in these two ways stays equal to the original population. The results clearly display the capacity of the MBO technique to find improved fitness values compared with the genetic algorithm.
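The two-operator update described above can be sketched in a few dozen lines. The following minimal implementation is an illustrative reduction: the abstract does not give the exact operator equations or parameter values, so the migration ratio, period, adjusting rate, and perturbation scale below are assumed defaults, and a simple uniform walk replaces the Lévy-flight step of the full algorithm:

```python
import random

def mbo_minimize(f, dim, pop=20, iters=60, p=5/12, peri=1.2, bar=5/12, seed=0):
    """Minimal Monarch Butterfly Optimization sketch (continuous minimisation).

    Land 1 holds round(p*pop) butterflies, Land 2 the rest. Offspring copy
    coordinates from randomly chosen parents in one land or the other
    (migration operator), while Land-2 butterflies may also drift toward the
    current best (butterfly adjusting operator). Population size stays
    constant, so fitness evaluations per generation stay fixed.
    """
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)              # elite, kept across generations
    n1 = int(round(p * pop))          # size of Land 1
    for _ in range(iters):
        X.sort(key=f)
        best = min([best, X[0]], key=f)
        land1, land2 = X[:n1], X[n1:]
        new = []
        for _ in land1:               # migration operator
            child = [(rng.choice(land1) if rng.random() * peri <= p
                      else rng.choice(land2))[d] for d in range(dim)]
            new.append(child)
        for _ in land2:               # butterfly adjusting operator
            child = []
            for d in range(dim):
                if rng.random() <= p:
                    child.append(best[d])          # pull toward the elite
                else:
                    src = rng.choice(land2)[d]
                    if rng.random() > bar:         # occasional random walk
                        src += rng.uniform(-0.1, 0.1)
                    child.append(src)
            new.append(child)
        X = new
    return min([best] + X, key=f)
```

Because the elite is tracked separately and returned alongside the final population, the best fitness found is non-increasing over generations, which is what the conservation-of-population-size argument in the abstract relies on.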

Keywords: cognitive radio, channel allocation, monarch butterfly optimization, evolutionary computation

Procedia PDF Downloads 72
18648 Reliability Factors Based Fuzzy Logic Scheme for Spectrum Sensing

Authors: Tallataf Rasheed, Adnan Rashdi, Ahmad Naeem Akhtar

Abstract:

Accurate spectrum sensing is a fundamental requirement of dynamic spectrum access for the deployment of a Cognitive Radio Network (CRN). To achieve this requirement, a Reliability-factors-based Fuzzy Logic (RFL) scheme for spectrum sensing is proposed in this paper. A Cognitive Radio User (CRU) predicts the presence or absence of a Primary User (PU) using an energy detector and calculates the reliability factors, which are the SNR of the sensing node, the threshold of the energy detector, and the decision difference of each node with respect to the other nodes in a cooperative spectrum sensing environment. The decision of the energy detector is then combined with the reliability factors of the sensing node using fuzzy logic. The reliability factors used in the RFL scheme describe the reliability of the decision made by a CRU, improving local spectrum sensing, and the fuzzy combining scheme improves the accuracy of the decision made by each sensor node. The simulation results show that the proposed technique provides a better PU detection probability than existing spectrum sensing techniques.
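As a rough illustration of how an energy detector's decision might be weighted by the three reliability factors named above, the sketch below uses simple linear membership terms. The paper's actual fuzzy rule base and membership functions are not given in the abstract, so the scoring here is a stand-in:

```python
def energy_detect(samples, threshold):
    """Classic energy detector: PU declared present if average energy > threshold."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold, energy

def reliability_weight(snr_db, energy, threshold, peer_decisions, own_decision):
    """Toy reliability score in [0, 1] built from the three factors in the
    abstract: sensing-node SNR, margin of the energy statistic over the
    threshold, and agreement of this node's decision with its peers.
    (Illustrative linear memberships, not the paper's fuzzy rule base.)"""
    snr_term = min(max((snr_db + 5) / 25.0, 0.0), 1.0)   # -5 dB -> 0, 20 dB -> 1
    margin_term = min(abs(energy - threshold) / threshold, 1.0)
    agree = sum(1 for d in peer_decisions if d == own_decision)
    agree_term = agree / len(peer_decisions) if peer_decisions else 1.0
    return (snr_term + margin_term + agree_term) / 3.0
```

In a cooperative setting, each node's binary decision would then be fused at the fusion centre weighted by its reliability score, so that low-SNR or outlier nodes contribute less to the final PU-presence verdict.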

Keywords: cognitive radio, spectrum sensing, energy detector, reliability factors, fuzzy logic

Procedia PDF Downloads 486
18647 The Use of Polar Substituent Groups for Promoting Azo Disperse Dye Solubility and Reactivity for More Economical and Environmentally Benign Applications: A Computational Study

Authors: Olaide O. Wahab, Lukman O. Olasunkanmi, Krishna K. Govender, Penny P. Govender

Abstract:

The economic and environmental challenges associated with azo disperse dye applications are due to poor aqueous solubility and low degradation tendency, which stems from low chemical reactivity. The poor aqueous solubility of this group of dyes necessitates the use of dispersing agents, which increase operational costs and also release toxic chemical components into the environment, while their low degradation tendency is due to the high stability of the azo functional group (-N=N-) in their chemical structures. To address these problems, this study investigated theoretically the effects of some polar substituents on the aqueous solubility and reactivity properties of Disperse Yellow (DY) 119 dye, with a view to theoretically developing new azo disperse dyes with improved solubility in water and a higher degradation tendency in the environment, using the DMol³ computational code. All calculations were carried out using the Becke and Perdew version of the Vosko-Wilk-Nusair functional (VWN-BP) level of density functional theory in conjunction with the double numerical basis set containing polarization functions (DNP). The aqueous solubility determination was achieved with the conductor-like screening model for realistic solvation (COSMO-RS) in conjunction with a known empirical solubility model, while the reactivity was predicted using frontier molecular orbital calculations. Most of the new derivatives studied showed evidence of higher aqueous solubility and degradation tendency compared to the parent dye. We conclude that these derivatives are promising alternative dyes for more economical and environmentally benign dyeing practice and therefore recommend them for synthesis.
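The frontier-molecular-orbital reactivity prediction mentioned above rests on standard conceptual-DFT descriptors computable from the HOMO and LUMO energies alone: a smaller gap (higher softness) generally signals higher reactivity and thus easier degradation. A small helper using the usual Koopmans-style approximations; the orbital energies in the test are hypothetical values, not the paper's computed results:

```python
def reactivity_descriptors(e_homo, e_lumo):
    """Global reactivity descriptors from frontier orbital energies (eV).

    Uses the Koopmans-style approximations I = -E_HOMO, A = -E_LUMO:
      gap = E_LUMO - E_HOMO        (smaller gap -> more reactive)
      chi = (I + A) / 2            (electronegativity)
      eta = (I - A) / 2            (chemical hardness)
      S   = 1 / (2 * eta)          (global softness)
    """
    gap = e_lumo - e_homo
    ionisation, affinity = -e_homo, -e_lumo
    electronegativity = (ionisation + affinity) / 2.0
    hardness = (ionisation - affinity) / 2.0
    softness = 1.0 / (2.0 * hardness) if hardness else float("inf")
    return {"gap": gap, "chi": electronegativity, "eta": hardness, "S": softness}
```

Comparing these descriptors between a substituted derivative and the parent dye is how a "higher degradation tendency" claim from frontier orbital calculations is typically quantified.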

Keywords: aqueous solubility, azo disperse dye, degradation, disperse yellow 119, DMol³, reactivity

Procedia PDF Downloads 204
18646 Modeling of the Random Impingement Erosion Due to the Impact of Solid Particles

Authors: Siamack A. Shirazi, Farzin Darihaki

Abstract:

Solid particles can be found in many multiphase flows, including transport pipelines and pipe fittings. Such particles interact with the pipe material and cause erosion, which threatens the integrity of the system. Therefore, predicting the erosion rate is an important factor in the design and monitoring of such systems. Mechanistic models can provide reliable predictions for many conditions while demanding only a relatively low computational cost. Mechanistic models utilize a representative particle trajectory to predict the impact characteristics of the majority of the particle impacts that cause the maximum erosion rate in the domain. The erosion caused by particle impacts is due not only to direct impacts but also to random impingements. In the present study, an alternative model is introduced to describe the erosion due to the random impingement of particles. The present model provides a realistic trend for erosion with changes in particle size and particle Stokes number. The model is examined against experimental data and CFD simulation results and indicates better agreement with the data in comparison to the available models in the literature.

Keywords: erosion, mechanistic modeling, particles, multiphase flow, gas-liquid-solid

Procedia PDF Downloads 168
18645 Evaluation of P300 and CNV Changes in Patients with Essential Tremor

Authors: Sehur Sibel Ozkaynak, Zakir Koc, Ebru Barcın

Abstract:

Essential tremor (ET) is one of the most common movement disorders and has long been considered a monosymptomatic disorder. While ET has traditionally been categorized as a pure motor disease, cross-sectional and longitudinal studies of cognition in ET have demonstrated that these patients may have cognitive dysfunction. We investigated the neurophysiological aspects of cognition in ET using event-related potentials (ERPs). Twenty patients with ET and 20 age-, education-, and sex-matched healthy controls underwent a neurophysiological evaluation. P300 components and Contingent Negative Variation (CNV) were recorded. The latencies and amplitudes of the P300 and CNV were evaluated. P200-N200 amplitude was significantly smaller in the ET group, while no differences emerged between patients and controls in P300 latencies. CNV amplitude was significantly smaller at the Cz electrode site in the ET group. No differences were observed between the two groups in CNV latencies. That P300 and CNV parameters did not show significant differences between the two groups does not mean that there are no mild cognitive changes in ET patients. In this regard, there is a need for further studies using electrophysiological tests related to cognitive changes in ET patients.

Keywords: cognition, essential tremor, event related potentials

Procedia PDF Downloads 287
18644 General Purpose Graphic Processing Units Based Real Time Video Tracking System

Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai

Abstract:

Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a good amount of computational time and memory space with traditional approaches. In our approach, we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. The Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernel are influenced by local regions and are updated by inter-frame variations of these corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speed-up of 10X compared to the sequential approach.
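As a rough illustration of the per-pixel background update that an adaptive GMM performs (simplified here to a single adaptive Gaussian per pixel; the learning rate and threshold are assumptions, not the paper's weighted kernels):

```python
class AdaptiveBackgroundPixel:
    """Single-Gaussian running model for one pixel: a simplified stand-in
    for the paper's adaptive weighted-kernel GMM."""

    def __init__(self, init_value, alpha=0.05, k=2.5):
        self.mean = float(init_value)
        self.var = 15.0 ** 2   # initial variance guess (assumption)
        self.alpha = alpha     # learning rate for inter-frame updates
        self.k = k             # foreground threshold in standard deviations

    def update(self, x):
        """Return True if x is foreground; adapt toward background values."""
        d = x - self.mean
        foreground = d * d > (self.k ** 2) * self.var
        if not foreground:  # only absorb pixels labelled as background
            self.mean += self.alpha * d
            self.var += self.alpha * (d * d - self.var)
        return foreground
```

On a GPGPU, one such update runs independently per pixel, which is what makes the thread-level parallelism of the approach pay off.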

Keywords: connected components, embrace threads, local weighted kernel, structuring elements

Procedia PDF Downloads 440
18643 Rheological and Computational Analysis of Crude Oil Transportation

Authors: Praveen Kumar, Satish Kumar, Jashanpreet Singh

Abstract:

Transportation of unrefined crude oil from the production unit to a refinery or large storage area by pipeline is difficult due to the different properties of crude in various areas. Thus, the design of a crude oil pipeline is a very complex and time-consuming process when considering all the various parameters. Three very important parameters play a significant role in transportation and processing pipeline design: the viscosity profile, the temperature profile, and the velocity profile of waxy crude oil through the pipeline. Knowledge of rheological computational techniques is required for better understanding the flow behavior and predicting the flow profile in a crude oil pipeline. From these profile parameters, the material and the emulsion best suited for crude oil transportation can be predicted. The rheological computational fluid dynamics technique is a fast method for designing the flow profile in a crude oil pipeline with the help of computational fluid dynamics and rheological modeling. With this technique, the effect of fluid properties, including shear rate range with temperature variation, degree of viscosity, elastic modulus, and viscous modulus, was evaluated under different conditions in a transport pipeline. In this paper, two crude oil samples were used, as well as a prepared emulsion with natural and synthetic additives at concentrations ranging from 1,000 ppm to 3,000 ppm. The rheological properties were then evaluated over a temperature range of 25 to 60 °C to determine which additive was best suited for transportation of crude oil. Commercial computational fluid dynamics (CFD) software was used to generate the flow, velocity, and viscosity profiles of the emulsions for flow behavior analysis in a crude oil transportation pipeline. This rheological CFD design can be further applied in developing pipeline designs in the future.
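Over the 25 to 60 °C range studied, viscosity-temperature behaviour is commonly parameterized with an Arrhenius-type law, μ(T) = A·exp(Ea/RT). A sketch of such a temperature-viscosity profile (the pre-exponential factor and activation energy are illustrative, not values fitted to the samples in this work):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def viscosity_pa_s(T_celsius, A=1.2e-4, Ea=2.0e4):
    """Arrhenius-type dynamic viscosity; A (Pa*s) and Ea (J/mol)
    are placeholder constants, not values from the study."""
    T_kelvin = T_celsius + 273.15
    return A * math.exp(Ea / (R * T_kelvin))

# viscosity falls monotonically as the crude warms from 25 to 60 degrees C
profile = {T: viscosity_pa_s(T) for T in range(25, 61, 5)}
```

A CFD solver uses exactly this kind of μ(T) closure, coupled with the computed temperature field, to produce the viscosity and velocity profiles along the pipeline.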

Keywords: surfactant, natural, crude oil, rheology, CFD, viscosity

Procedia PDF Downloads 454
18642 Information and Cooperativity in Fiction: The Pragmatics of David Baboulene’s “Knowledge Gaps”

Authors: Cara DiGirolamo

Abstract:

In his 2017 Ph.D. thesis, script doctor David Baboulene presented a theory of fiction in which differences in the knowledge states between participants in a literary experience, including reader, author, and characters, create many story elements, among them suspense, expectations, subtext, theme, metaphor, and allegory. This theory can be adjusted and modeled by incorporating a formal pragmatic approach that understands narrative as a speech act with a conversational function. This approach requires both the Speaker and the Listener to be understood as participants in the discourse. It also uses theories of cooperativity and the Question Under Discussion (QUD) to identify the existence of implicit questions. This approach predicts what an effective literary narrative must do: provide a conversational context early in the story so the reader can engage with the text as a conversational participant. In addition, this model incorporates schema theory. Schema theory is a cognitive model for learning and processing information about the world and transforming it into functional knowledge. Using this approach can extend the QUD model. Instead of describing conversation as a form of information gathering restricted to question-answer sets, the QUD can include knowledge modeling and understanding as a possible outcome of a conversation. With this model, Baboulene’s “Knowledge Gaps” can provide real insight into storytelling as a conversational move, and extend the QUD to apply simply and effectively to a more diverse set of conversational interactions as well as to narrative texts.
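Knowledge gaps of this kind lend themselves to a very compact formal model: treat each participant's knowledge state as a set of propositions and read the set differences as story devices. A toy sketch (the propositions and device labels are our illustration, not Baboulene's notation):

```python
def knowledge_gaps(reader, character):
    """Knowledge states as proposition sets; gaps fall out as set differences.
    Character-only knowledge reads as suspense/subtext, reader-only knowledge
    as dramatic irony, and the intersection as established common ground."""
    return {
        "suspense": character - reader,
        "dramatic_irony": reader - character,
        "common_ground": reader & character,
    }

reader = {"the butler had a motive", "the will was changed"}
character = {"the butler had a motive", "the knife is missing"}
gaps = knowledge_gaps(reader, character)
```

Adding the author as a third participant, and tracking how each set changes as the narrative answers implicit QUDs, is the natural extension suggested by the abstract.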

Keywords: literature, speech acts, QUD, literary theory

Procedia PDF Downloads 2
18641 Autism: Impact on Cognitive, Social-Communication and Behavioural Development

Authors: Prachi Sharma, B. V. Ramkumar

Abstract:

Today, autism is a well-known neurodevelopmental disorder that may restrict child development globally. Ignorance, delayed identification, or incorrect diagnosis of autism is a major challenge in managing such an incurable disorder and may lead to various behavioural complications followed by mental illness in adulthood. Autism is a progressive, incurable disorder that negatively affects development globally; the degree of impairment may vary across different skills. However, a deviation from the normal range creates complex outcomes in the social and communication areas and restricts or deviates cognitive ability. The primary goal of the present research is to identify and understand deviations in cognition, social communication, and behaviour in children during their growing years, with a focus on autism. In this study, five children with mild autism were included. All the children had achieved normal developmental milestones until the age of one year. The children's development was observed up to the age of four years to see the difference in their developmental rates in the areas of cognition, social communication, and behaviour. The study is based on parental reports about the children from 1 year to 4 years of age. Videos and pictures of the children during their development were also used as a reference to verify the information provided by the parents. This research is qualitative; samples were selected using a purposive sampling technique. The data were collected from the OPD, NIEPID RC, NOIDA, India, in the form of parental reports based on observations about their children. Videos were also viewed to verify the information reported by the parents (shown only to verify the facts, not shared). In the results, we observed a significant difference in the rate of development in all five children taken for this research.
At present, the children with mild autism showed variations in all three domains (cognitive, social communication, and behaviour). These variations were seen in terms of restricted development in global areas. The results revealed that typical features of ASD created more cognitive restrictions than ASD features with hyperactivity. Behavioural problems of varying severity were observed in the children having ASD with hyperactivity, whereas children with typical ASD were found to show typical problem behaviours such as head banging, body rocking, and self-biting, with different levels of severity. The social-communication area was equally affected in all children, as no major difference was found in the information received from each parent.

Keywords: autism/ASD, behaviour, cognitive skill, hyperactivity, social-communication skill

Procedia PDF Downloads 37
18640 A Review of Lexical Retrieval Intervention in Primary Progressive Aphasia and Alzheimer's Disease: Mechanisms of Change, Cognition, and Generalisation

Authors: Ashleigh Beales, Anne Whitworth, Jade Cartwright

Abstract:

Background: While significant benefits of lexical retrieval intervention are evident within the Primary Progressive Aphasia (PPA) and Alzheimer’s disease (AD) literature, an understanding of the mechanisms that underlie change or improvement is limited. Change mechanisms have been explored in the non-progressive post-stroke literature, which may offer insight into how interventions effect change in progressive language disorders. The potential influences of cognitive factors may also play a role here, interacting with the aims of intervention. Exploring how such processes have been applied is likely to deepen our understanding of how interventions have, or have not, been effective, and how and why generalisation is likely, or not, to occur. Aims: This review of the literature aimed to (1) investigate the proposed mechanisms of change which underpin lexical interventions, mapping the PPA and AD lexical retrieval literature to theoretical accounts of mechanisms that underlie change within the broader intervention literature, (2) identify whether and which nonlinguistic cognitive functions have been engaged in intervention with these populations and any proposed influence, and (3) explore evidence of linguistic generalisation, with particular reference to change mechanisms employed in interventions. Main contribution: A search of Medline, PsycINFO, and CINAHL identified 36 articles that reported data for individuals with PPA or AD following lexical retrieval intervention. A review of the mechanisms of change identified 10 studies that used stimulation, 21 that utilised relearning, three that drew on reorganisation, and two that used cognitive-relay. Significant treatment gains, predominantly based on linguistic performance measures, were reported for all client groups for each of the proposed mechanisms. Reorganisation and cognitive-relay change mechanisms were only targeted in PPA.
Eighteen studies incorporated nonlinguistic cognitive functions in intervention; these were limited to autobiographical memory (16 studies), episodic memory (three studies), or both (one study). Linguistic generalisation outcomes were inconsistently reported in PPA and AD studies. Conclusion: This review highlights that individuals with PPA and AD may benefit from lexical retrieval intervention, irrespective of the mechanism of change. Thorough application of a theory of intervention is required to gain a greater understanding of the change mechanisms, as well as the interplay of nonlinguistic cognitive functions.

Keywords: Alzheimer's disease, lexical retrieval, mechanisms of change, primary progressive aphasia

Procedia PDF Downloads 203
18639 Identifying Promoters and Their Types Based on a Two-Layer Approach

Authors: Bin Liu

Abstract:

A prokaryotic promoter, consisting of two short DNA sequences located at the -35 and -10 positions, is responsible for controlling the initiation of gene expression. Different types of promoters have different functions, yet their consensus sequences are similar. In addition, consensus sequences may differ even within the same type of promoter, which poses difficulties for promoter identification. Unfortunately, existing computational methods treat promoter identification as a binary classification task and can only identify whether a query sequence belongs to a specific promoter type. It is desirable to develop computational methods for effectively identifying both promoters and their types. Here, a two-layer predictor is proposed to deal with this problem. The first layer predicts whether a given sequence is a promoter, and the second layer predicts the type of any sequence judged to be a promoter. Meanwhile, we also analyze the importance of features and sequence conservation in two respects: promoter identification and promoter type identification. To the best of our knowledge, this is the first computational predictor to detect both promoters and their types.
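The two-layer architecture itself is easy to sketch: a first classifier gates the second, and only sequences passing layer one ever receive a type label. The toy motif checks below stand in for the trained models (they are our illustration; the abstract does not specify the classifiers' features):

```python
def two_layer_predict(sequence, is_promoter, promoter_type):
    """Layer 1 decides promoter vs non-promoter; layer 2 assigns a type
    only to sequences that pass layer 1. In a real predictor the two
    callables would be trained classifiers (e.g. random forests)."""
    if not is_promoter(sequence):
        return "non-promoter"
    return promoter_type(sequence)

def demo_layer1(seq):
    """Toy stand-in: crude check for a -10 box consensus motif."""
    return "TATAAT" in seq

def demo_layer2(seq):
    """Toy stand-in: label by presence of a -35 box consensus motif."""
    return "sigma70-like" if "TTGACA" in seq else "other"
```

The cascade mirrors the paper's design decision: type errors can only occur on sequences already accepted as promoters, so the two error sources can be analyzed separately.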

Keywords: promoter, promoter type, random forest, sequence information

Procedia PDF Downloads 184
18638 Investigating Early Markers of Alzheimer’s Disease Using a Combination of Cognitive Tests and MRI to Probe Changes in Hippocampal Anatomy and Functionality

Authors: Netasha Shaikh, Bryony Wood, Demitra Tsivos, Michael Knight, Risto Kauppinen, Elizabeth Coulthard

Abstract:

Background: Effective treatment of dementia will require early diagnosis, before significant brain damage has accumulated. Memory loss is an early symptom of Alzheimer’s disease (AD). The hippocampus, a brain area critical for memory, degenerates early in the course of AD. The hippocampus comprises several subfields. In contrast to healthy aging, where CA3 and the dentate gyrus are the hippocampal subfields with the most prominent atrophy, in AD the CA1 and subiculum are thought to be affected early. Conventional clinical structural neuroimaging is not sufficiently sensitive to identify preferential atrophy in individual subfields. Here, we will explore the sensitivity of new magnetic resonance imaging (MRI) sequences designed to interrogate medial temporal regions as an early marker of Alzheimer’s. As a combination of tests is likely to predict early Alzheimer’s disease (AD) better than any single test, we look at the potential efficacy of such imaging alone and in combination with standard and novel cognitive tasks of hippocampal-dependent memory. Methods: 20 patients with mild cognitive impairment (MCI), 20 with mild-moderate AD, and 20 age-matched healthy elderly controls (HC) are being recruited to undergo 3T MRI (with sequences designed to allow volumetric analysis of hippocampal subfields) and a battery of cognitive tasks (including Paired Associative Learning from CANTAB, the Hopkins Verbal Learning Test, and a novel hippocampal-dependent abstract word memory task). AD participants and healthy controls are being tested just once, whereas patients with MCI will be tested twice, a year apart. We will compare subfield size between groups and correlate subfield size with cognitive performance on our tasks. In the MCI group, we will explore the relationship between subfield volume, cognitive test performance, and deterioration in clinical condition over a year.
Results: Preliminary data (currently on 16 participants: 2 AD; 4 MCI; 9 HC) have revealed subfield size differences between subject groups. Patients with AD perform with less accuracy on tasks of hippocampal-dependent memory, and MCI patient performance and reaction times also differ from healthy controls. With further testing, we hope to delineate how subfield-specific atrophy corresponds with changes in cognitive function, and characterise how this progresses over the time course of the disease. Conclusion: Novel sequences on an MRI scanner, such as those in routine clinical use, can be used to delineate hippocampal subfields in patients with and without dementia. Preliminary data suggest that such subfield analysis, perhaps in combination with cognitive tasks, may be an early marker of AD.

Keywords: Alzheimer's disease, dementia, memory, cognition, hippocampus

Procedia PDF Downloads 573
18637 Cognitive Function During the First Two Hours of Spravato Administration in Patients with Major Depressive Disorder

Authors: Jocelyn Li, Xiangyang Li

Abstract:

We employed THINC-it® to study the acute effects of Spravato on the cognitive function of patients with severe major depressive disorder (MDD). The scores of the four THINC-it® tasks (Spotter, Symbol Check, Code Breaker, Trails) were used to measure cognitive function throughout treatment. The patients who participated in this study had tried more than 3 antidepressants without significant improvement before they began Spravato treatment. All patients received 3 doses of 28 mg Spravato 5 minutes apart (84 mg total per treatment) during this study. Data were collected before the first Spravato administration (T0), 1 hour after (T1), and 2 hours after (T2) during each treatment. The data below are from 13 patients, with a total of 226 trials over a 2-3 month period. Spravato at 84 mg reduced the scores of Trails, Code Breaker, Symbol Check, and Spotter at T1 by 10-20% in all patients, with one exception for a minority of patients in Spotter. At T2, the scores of Trails, Symbol Check, and Spotter were back to 97% of T0, while the score of Code Breaker was back to 92%. Interestingly, we found that the score of Spotter was consistently increased by 17% at T1 in the same 30% of patients in each treatment. We called this change a reverse response, while the pattern of the other patients, a decline (T1) followed by recovery (T2), was called a non-reverse response. We also compared the scores at T0 between the first visit and the fifth visit. The T0 scores of all four tasks were improved at visit 5 when compared to visit 1. The scores of Trails, Code Breaker, and Symbol Check at T0 increased by 14%, 33%, and 14%, respectively, at visit 5. The score of Code Breaker, which showed two trends, improved by 9% in reverse-response patients compared to a 27% improvement in non-reverse-response patients.
To our knowledge, this is the first study of the impact of Spravato on cognitive function change in patients with major depression within this time frame. Whether future responses to Spravato can be predicted with THINC-it® merits further study.

Keywords: Spravato, THINC-it, major depressive disorder, cognitive function

Procedia PDF Downloads 116
18636 The Phenomena of Virtual World Adoption: Antecedents and Consequences of Virtual World Experience

Authors: Norita Ahmad, Reza Barkhi, Xiaobo Xu

Abstract:

We design an experimental study to learn about the cognitive implications of the use of avatars in a Virtual World (VW) (i.e., Second Life). The results support our proposed model, where a positive flow experience with VW influences the attitude towards VW, in turn influencing intention to use VW. Furthermore, VW flow experience can itself be impacted by perceived peer influence, familiarity with VW, and personality of the individuals behind the avatars in VW.

Keywords: avatar, flow experience, personality type, second life, virtual world

Procedia PDF Downloads 597
18635 An Exact Algorithm for Location–Transportation Problems in Humanitarian Relief

Authors: Chansiri Singhtaun

Abstract:

This paper proposes a mathematical model and examines the performance of an exact algorithm for a location–transportation problem in humanitarian relief. The model determines the number and location of distribution centers in a relief network, the amount of relief supplies to be stocked at each distribution center, and the vehicles required to take the supplies to meet the needs of disaster victims, under capacity restrictions and transportation and budgetary constraints. The computational experiments are conducted on generated problems of various sizes. A branch-and-bound algorithm is applied to these problems. The results show that this algorithm can solve problem sizes of up to three candidate locations with five demand points, and one candidate location with up to twenty demand points, without premature termination.
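At the instance sizes reported (a handful of candidate locations and demand points), an exact answer can be sketched by enumerating open-facility subsets, which is what a branch-and-bound search reduces to when no bound prunes. The model below is a deliberately simplified stand-in, not the paper's formulation: it keeps only fixed opening costs, per-demand service costs, and a budget cap on opening cost:

```python
from itertools import combinations

def solve_exact(fixed_cost, serve_cost, budget):
    """Enumerate every subset of candidate distribution centers and keep
    the cheapest feasible plan. serve_cost[i][j] is the cost of serving
    demand point j from candidate i; budget caps total fixed opening cost."""
    n, m = len(fixed_cost), len(serve_cost[0])
    best_cost, best_open = float("inf"), ()
    for k in range(1, n + 1):
        for opened in combinations(range(n), k):
            fc = sum(fixed_cost[i] for i in opened)
            if fc > budget:
                continue  # prune this branch: budget constraint violated
            # each demand point is served from its cheapest open center
            tc = sum(min(serve_cost[i][j] for i in opened) for j in range(m))
            if fc + tc < best_cost:
                best_cost, best_open = fc + tc, opened
    return best_cost, best_open
```

With n candidates the search tree has 2ⁿ - 1 non-empty subsets, which is why exact methods stay tractable only for the small instances the abstract describes.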

Keywords: disaster response, facility location, humanitarian relief, transportation

Procedia PDF Downloads 451
18634 The Relationship between Hot and Cool Executive Function and Theory of Mind in School-Aged Children with Autism Spectrum Disorder

Authors: Evangelia-Chrysanthi Kouklari, Stella Tsermentseli, Claire P. Monks

Abstract:

Executive function (EF) refers to a set of future-oriented and goal-directed cognitive skills that are crucial for problem solving and social behaviour, as well as the ability to organise oneself. It has been suggested that EF could be conceptualised as two distinct but interrelated constructs, one emotional (hot) and one cognitive (cool), as it facilitates both affective and cognitive regulation. Cool EF has been found to be strongly related to Theory of Mind (ToM) that is the ability to infer mental states, but research has not taken into account the association between hot EF and ToM in Autism Spectrum Disorder (ASD) to date. The present study investigates the associations between both hot and cool EF and ToM in school-aged children with ASD. This cross-sectional study assesses 79 school-aged children with ASD (7-15 years) and 91 controls matched for age and IQ, on tasks tapping cool EF (working memory, inhibition, planning), hot EF (effective decision making, delay discounting), and ToM (emotional understanding and false/no false belief). Significant group differences in each EF measure support a global executive dysfunction in ASD. Strong associations between hot EF and ToM in ASD are reported for the first time (i.e. ToM emotional understanding and delay discounting). These findings highlight that hot EF also makes a unique contribution to the developmental profile of ASD. Considering the role of both hot and cool EF in association with ToM in individuals with ASD may aid in gaining a greater understanding not just of how these complex multifaceted cognitive abilities relate to one another, but their joint role in the distinct developmental pathway followed in ASD.
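The delay-discounting measure of hot EF mentioned above is conventionally modelled with Mazur's hyperbolic function, V = A / (1 + kD), where a steeper individual discount rate k indicates more impulsive choice. A minimal sketch (the comparison rule is the standard one; the particular parameter values in the usage test are illustrative):

```python
def discounted_value(amount, delay, k):
    """Mazur's hyperbolic discounting: subjective value of a reward of the
    given amount after the given delay, for an individual discount rate k."""
    return amount / (1.0 + k * delay)

def prefers_immediate(immediate, delayed, delay, k):
    """True if the smaller-sooner reward beats the larger-later one."""
    return immediate > discounted_value(delayed, delay, k)
```

Fitting k per participant from a series of such choices yields the single number that delay-discounting tasks report and that can then be correlated with ToM measures.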

Keywords: ASD, executive function, school age, theory of mind

Procedia PDF Downloads 291