Search results for: sensory processing sensitivity
4107 Immersive and Interactive Storytelling: Exploring Narratives and Online Multisensory Experience for Cultural Memory and Collective Awareness through Graphic Novel
Authors: Cristina Greco
Abstract:
The spread of digital and web-based technologies has led to a transformation process, which has coincided with an increase in the number of works that move beyond mainstream storytelling and its codes of interaction with the user. Building on previous research on i-docs and virtual museums, this study analyses interactive and immersive online graphic novels – one-page, animated, illustrated, and hybrid – to reflect on the transformational implications of this expressive form for the user's perception, remembrance, and awareness. The way in which the user experiences a certain level of interaction with the story and immersion in its semantic and figurative universe would focus the user's attention, activating processes of introspection and self-reflection, perception, imagination, and creativity. This would involve different senses – visual, proprioceptive, tactile, auditory, and vestibular – and activate a phenomenon of synaesthesia (involuntary cross-modal sensory association), where, for example, the aural reconnects the user to another sense, providing a multisensory experience. The case studies show specific forms of interactive and immersive graphic novels and reflect on applications that have sought innovative ways to communicate different messages and stimulate cultural memory and collective awareness. The visual semiotic and narrative analysis of the distinctive traits of such a complex textuality, along with a study of the user's experience through observation in naturalistic settings and interviews, allows us to question the functioning of these configurations with regard to the relationships between the figurative dimension, the perceptive activity, and their impact on the user's engagement.
Keywords: collective awareness, cultural memory, graphic novel, interactive and immersive storytelling
4106 Isolation and Selection of Strains Perspective for Sewage Sludge Processing
Authors: A. Zh. Aupova, A. Ulankyzy, A. Sarsenova, A. Kussayin, Sh. Turarbek, N. Moldagulova, A. Kurmanbayev
Abstract:
One of the methods of organic waste bioconversion into environmentally friendly fertilizer is composting. Microorganisms that produce hydrolytic enzymes play a significant role in accelerating the composting of organic waste. We studied the enzymatic potential (amylase, protease, cellulase, lipase, and urease activity) of bacteria isolated from the sewage sludge of the cities of Nur-Sultan, Rudny, and Fort-Shevchenko, the dacha soil of Nur-Sultan city, and freshly cut grass from the dacha, with the aim of processing organic waste and identifying active strains. Microorganism isolation was carried out by the culture enrichment method on liquid nutrient media, followed by inoculation on different solid media to isolate individual colonies. As a result, sixty-one microorganisms were isolated, three of which were thermophiles (DS1, DS2, and DS3). The highest numbers of isolates, twenty-one and eighteen, came from the sewage sludge of Nur-Sultan and Rudny, respectively. Ten isolates were obtained from the wastewater of the sewage treatment plant in Fort-Shevchenko, and nine and five isolates from the dacha soil of Nur-Sultan city and the freshly cut grass, respectively. The lipolytic, proteolytic, amylolytic, cellulolytic, ureolytic, and oil-oxidizing activities of the isolates were studied. According to the results, starch hydrolysis (amylolytic activity) was found in two isolates, CB2/2 and CB2/1. Three isolates, CB2, CB2/1, and CB1/1, were selected for the highest ability to break down casein. Among the 61 isolated bacterial cultures, three isolates could break down fats: CB3, CBG1/1, and IL3. Seven strains had cellulolytic activity: DS1, DS2, IL3, IL5, P2, P5, and P3. Six isolates rapidly decomposed urea. Isolate P1 could break down casein and cellulose. Isolate DS3 was a thermophile and had cellulolytic activity. Thus, based on the conducted studies, 15 isolates were selected as potential candidates for sewage sludge composting: CB2, CB3, CB1/1, CB2/2, CBG1/1, CB2/1, DS1, DS2, DS3, IL3, IL5, P1, P2, P5, and P3. The selected strains were identified by mass spectrometry (MALDI-TOF). Isolate CB3 was identified as Rhodococcus rhodochrous; two isolates, CB2 and CB1/1, as Bacillus cereus; CB2/2 as Chryseobacterium arachidis; CBG1/1 as Pseudoxanthomonas sp.; CB2/1 as Bacillus megaterium; DS1 as Pediococcus acidilactici; DS2 as Paenibacillus residui; DS3 as Brevibacillus invocatus; three strains, IL3, P5, and P3, as Enterobacter cloacae; two strains, IL5 and P2, as Ochrobactrum intermedium; and P1 as Bacillus licheniformis. Hence, 61 isolates were obtained from the wastewater of the cities of Nur-Sultan, Rudny, and Fort-Shevchenko, the dacha soil of Nur-Sultan city, and freshly cut grass from the dacha. Based on the highest enzymatic activity, 15 active isolates were selected and identified. These strains may become candidates for a biopreparation for sewage sludge processing.
Keywords: sewage sludge, composting, bacteria, enzymatic activity
4105 Low Temperature Biological Treatment of Chemical Oxygen Demand for Agricultural Water Reuse Application Using Robust Biocatalysts
Authors: Vedansh Gupta, Allyson Lutz, Ameen Razavi, Fatemeh Shirazi
Abstract:
The agriculture industry is especially vulnerable to forecasted water shortages. In the fresh and fresh-cut produce sector, conventional flume-based washing with recirculation exhibits high water demand. This leads to a large water footprint and possible cross-contamination with pathogens. These problems can be alleviated through advanced water reuse processes, such as membrane technologies including reverse osmosis (RO). Water reuse technologies effectively remove dissolved constituents but can easily foul without pre-treatment. Biological treatment is effective for the removal of the organic compounds responsible for fouling, but not at the low temperatures encountered at most produce processing facilities. This study showed that the Microvi MicroNiche Engineering (MNE) technology effectively removes organic compounds (> 80%) from wash water at low temperatures (6-8 °C). The MNE technology uses synthetic microorganism-material composites with negligible solids production, making it advantageously situated as an effective bio-pretreatment for RO. A preliminary techno-economic analysis showed 60-80% savings in operation and maintenance costs (OPEX) when using the Microvi MNE technology for organics removal. This study and the accompanying economic analysis indicated that the proposed technology will substantially reduce the cost barrier for adopting water reuse practices, thereby contributing to increased food safety and furthering sustainable water reuse processes across the agricultural industry.
Keywords: biological pre-treatment, innovative technology, vegetable processing, water reuse, agriculture, reverse osmosis, MNE biocatalysts
4104 Olive Oils from Algeria: Phenolic Compounds Composition and Antibacterial Activity
Authors: Firdaousse Laincer, Rahima Laribi, Abderazak Tamendjari, Rovellini Venturini
Abstract:
Phenolic compounds present in olive oil have received much attention in recent years due to their beneficial functional and nutritional effects. The phenolic composition and antibacterial activity of phenolic extracts of olive oil varieties from Algeria were investigated. The analysis of polyphenols was performed by the Folin-Ciocalteu assay and HPLC. As a result, many phenolic compounds were identified and quantified by HPLC: derivatives of oleuropein and ligstroside, hydroxytyrosol, tyrosol, flavonoids, and lignans, revealing a unique and characteristic phenolic profile. These phenolic fractions also differentiated the total antibacterial activity. Among the bacteria tested, S. aureus and, to a lesser extent, B. subtilis showed the highest sensitivity; the MIC varied from 0.6 to 1.6 mg·mL-1 and 1.2 to 1.8 mg·mL-1, respectively. The results obtained denote that Algerian olive oils may constitute a good dietary source of healthy phenolic compounds, suggesting that their consumption could be useful in the prevention of diseases.
Keywords: antibacterial activity, olive oil, phenols, HPLC
4103 The Variable Sampling Interval Xbar Chart versus the Double Sampling Xbar Chart
Authors: Michael B. C. Khoo, J. L. Khoo, W. C. Yeong, W. L. Teoh
Abstract:
The Shewhart Xbar control chart is a useful process monitoring tool in manufacturing industries to detect the presence of assignable causes. However, it is insensitive in detecting small process shifts. To circumvent this problem, adaptive control charts are suggested. An adaptive chart enables at least one of the chart's parameters to be adjusted to increase the chart's sensitivity. Two common adaptive charts in the literature are the double sampling (DS) Xbar and variable sampling interval (VSI) Xbar charts. This paper compares the performances of the DS and VSI Xbar charts based on the average time to signal (ATS) criterion. The ATS profiles of the DS Xbar and VSI Xbar charts are obtained using the Mathematica and Statistical Analysis System (SAS) programs, respectively. The results show that the VSI Xbar chart is generally superior to the DS Xbar chart.
Keywords: adaptive charts, average time to signal, double sampling charts, variable sampling interval
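For context, the ATS criterion used in this comparison can be stated in its standard textbook form (general definitions, not taken from this paper). For a chart sampled at a fixed interval d,

```latex
\mathrm{ARL} = \frac{1}{p}, \qquad \mathrm{ATS} = d \cdot \mathrm{ARL},
```

where p is the probability that a single sample produces an out-of-control signal and ARL is the average run length. Adaptive charts such as the VSI Xbar chart lower the out-of-control ATS by switching to a short sampling interval whenever a point falls in a warning region near the control limits, so an expected sampling interval, rather than a constant d, enters the ATS.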
4102 Radiation Dosimetry Using Sintered Pellets of Yellow Beryl (Heliodor) Crystals
Authors: Lucas Sátiro Do Carmo, Betzabel Noemi Silva Carrera, Shigueo Watanabe, J. F. D. Chubaci
Abstract:
Beryl is a silicate with chemical formula Be₃Al₂(SiO₃)₆, commonly found in Brazil. It has a few colored variations used as jewelry, such as aquamarine (bluish), emerald (green), and heliodor (yellow). The color of each variation depends on the dopant that is naturally present in the crystal lattice. In this work, heliodor pellets of 5 mm diameter and 1 mm thickness have been produced and investigated using thermoluminescence (TL) to evaluate their potential for use as gamma-ray dosimeters. The results show that the pellets exhibited a prominent TL peak at 205 °C that grows linearly with dose when irradiated from 1 Gy to 1000 Gy. A comparison has been made between powdered and sintered dosimeters, showing that the sintered pellets have higher sensitivity than the powdered dosimeter. The TL response of this mineral is satisfactory for radiation dosimetry applications in the studied dose range.
Keywords: dosimetry, beryl, gamma rays, sintered pellets, new material
4101 Effectiveness of Cognitive and Supportive-Expressive Group Therapies on Self-Efficiency and Life Style in MS Patients
Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi
Abstract:
Multiple sclerosis is the most common chronic disease of the central nervous system. It is associated with demyelination of neurons; the demyelinated lesions spread throughout the white matter and affect sensory and motor function. This study compared the effectiveness of two methods, cognitive therapy and supportive-expressive therapy, on self-efficacy and quality of life in MS patients. This experimental study used a pretest-posttest and follow-up design with three groups. The population included all patients with multiple sclerosis in 2013 who were members of the MS Society of Iran in Tehran. The sample included 45 patients with MS who volunteered from among members of the MS Society of Iran and were randomly divided into three groups; pretest, posttest, and follow-up (three months) assessments were conducted for all three groups. The multiple sclerosis quality-of-life scale and the general self-efficacy scale of Schwarzer and Jerusalem were used for collecting data. The results showed a significant difference between the mean quality-of-life scores of the experimental groups at pretest, posttest, and follow-up. There was no significant difference between the mean quality-of-life scores of the two experimental groups, which means that both interventions were effective and had the same effect. There was no significant difference between the mean self-efficacy scores of the control and experimental groups at pretest, posttest, and follow-up. Thus, by using cognitive and supportive-expressive group therapy, we can improve quality of life in MS patients and make great strides in their mental health.
Keywords: cognitive group therapy, life style, MS, self-efficiency, supportive-expressive group therapy
4100 Prevalence of Dens Evaginatus in Adolescent Population of Melaka: A Retrospective Study
Authors: Preethy Mary Donald, Renjith George Pallivathukal
Abstract:
Dens evaginatus (DE) is a rare developmental anomaly characterized by a slender enamel-covered tubercle which projects from the occlusal surface of an otherwise normal premolar. DE can often interfere with normal occlusion and can lead to complications like sensitivity, pulpal exposure, and temporomandibular joint problems. The orthopantomographs (OPGs) and dental records of patients under the age of 20 who attended the Faculty of Dentistry, Melaka-Manipal Medical College, were examined for DE. Results: The prevalence of DE was 23% among the study group. Males presented with a higher prevalence of 67% and females with 33%. The prevalence of dens evaginatus was distributed as 28% in the maxillary central incisors, 52% in the maxillary lateral incisors, and 12% in the mandibular second premolars. The prevalence in the permanent dentition appeared to be higher than in the deciduous dentition. The bilateral occurrence of dens evaginatus is an interesting phenomenon: 57% of the DE cases were bilateral.
Keywords: deciduous dentition, dens evaginatus, permanent dentition, prevalence
4099 Design and Implementation of Collaborative Editing System Based on Physical Simulation Engine Running State
Authors: Zhang Songning, Guan Zheng, Ci Yan, Ding Gangyi
Abstract:
Physical simulation engines play an important role in collaborative editing systems. Firstly, physical simulation engines can provide real-world physical simulations, enabling users to interact and collaborate in real time in virtual environments. This provides a more intuitive and immersive experience for collaborative editing systems, allowing users to more accurately perceive and understand the various elements and operations in collaborative editing. Secondly, through physical simulation engines, different users can share a virtual space and perform real-time collaborative editing within it. This real-time sharing and collaborative editing method helps to synchronize information among team members and improves the efficiency of collaborative work. In experiments, the average single-user model transmission speed of the collaborative editing system increased by 141.91%; the average single-user model processing speed increased by 134.2%; the average single-user processing flow rate increased by 175.19%; and the overall single-user efficiency increased by 150.43%. As the number of users grows, the overall efficiency remains stable, and the collaborative editing system based on physical simulation engine running state also scales horizontally. It is not difficult to see that the design and implementation of a collaborative editing system based on physical simulation engines not only enriches the user experience but also optimizes the effectiveness of team collaboration, providing new possibilities for collaborative work.
Keywords: physics engine, simulation technology, collaborative editing, system design, data transmission
4098 Colored Image Classification Using Quantum Convolutional Neural Networks Approach
Authors: Farina Riaz, Shahab Abdulla, Srinjoy Ganguly, Hajime Suzuki, Ravinesh C. Deo, Susan Hopkins
Abstract:
Recently, quantum machine learning has received significant attention. Numerous quantum machine learning (QML) models have been created and are being tested for various types of data, including text and images. Images are exceedingly complex data components that demand more processing power. Despite being mature, classical machine learning still has difficulties with big data applications. Furthermore, quantum technology has revolutionized how machine learning is thought of by employing quantum features to address optimization issues. Since quantum hardware is currently extremely noisy, it is not practicable to run machine learning algorithms on it without risking the production of inaccurate results. To discover the advantages of quantum versus classical approaches, this research has concentrated on colored image data. Deep learning classification models are currently being created on quantum platforms, but they are still in a very early stage. Black-and-white benchmark image datasets like MNIST and Fashion-MNIST have been used in recent research. MNIST and CIFAR-10 were compared for binary classification, and the comparison showed that MNIST performed more accurately than the colored CIFAR-10. This research evaluates the performance of a QML algorithm on the colored benchmark dataset CIFAR-10 to advance QML's real-time applicability. However, deep learning classification models such as the Quantum Convolutional Neural Network (QCNN) have not yet been developed for colored images to determine how much better they are than classical approaches; only a few models, such as quantum variational circuits, take colored images. The methodology adopted in this research is a hybrid approach using PennyLane as a simulator. To process the 10 classes of CIFAR-10, the image data were converted to greyscale as 28 × 28-pixel images, and 10,000 test and 50,000 training images were used. The objective of this work is to determine how much the quantum approach can outperform a classical approach for a comprehensive dataset of color images. After pre-processing the 50,000 images on a classical computer, the QCNN model adopted a hybrid method and encoded the images into a quantum simulator for feature extraction using quantum gate rotations. The measurements were carried out on the classical computer after the rotations were applied. According to the results, the QCNN approach is ~12% more effective than traditional classical CNN approaches, and it is possible that applying data augmentation may increase the accuracy. This study has demonstrated that quantum machine and deep learning models can be relatively superior to classical machine learning approaches in terms of processing speed and accuracy when used to perform classification on colored classes.
Keywords: CIFAR-10, quantum convolutional neural networks, quantum deep learning, quantum machine learning
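As a rough illustration of the hybrid encoding step described above (classical pre-processing, quantum gate rotations on a simulator, then measurement feeding classical hardware), the following PennyLane sketch applies a small quantum circuit to 2 × 2 image patches. The wire count, gate layout, and patch size are illustrative assumptions, not the authors' published architecture.

```python
import pennylane as qml
import numpy as np

n_wires = 4
dev = qml.device("default.qubit", wires=n_wires)

@qml.qnode(dev)
def patch_circuit(pixels):
    # Encode 4 greyscale pixel values (scaled to [0, 1]) as Y-rotations.
    for w in range(n_wires):
        qml.RY(np.pi * pixels[w], wires=w)
    # A fixed entangling layer (assumed; the paper's gate set is not specified).
    for w in range(n_wires - 1):
        qml.CNOT(wires=[w, w + 1])
    # One expectation value per wire -> 4 feature maps per patch.
    return [qml.expval(qml.PauliZ(w)) for w in range(n_wires)]

def quanvolve(image):
    """Map a 28x28 greyscale image to a 14x14x4 quantum feature tensor."""
    out = np.zeros((14, 14, n_wires))
    for i in range(14):
        for j in range(14):
            patch = image[2 * i:2 * i + 2, 2 * j:2 * j + 2].flatten()
            out[i, j] = patch_circuit(patch)
    return out
```

The extracted features would then be passed to a classical classifier, matching the measure-then-classify workflow the abstract outlines.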
4097 Gamifying Content and Language Integrated Learning: A Study Exploring the Use of Game-Based Resources to Teach Primary Mathematics in a Second Language
Authors: Sarah Lister, Pauline Palmer
Abstract:
Research findings presented within this paper form part of a larger-scale collaboration between academics at Manchester Metropolitan University and a technology company. The overarching aims of this project focus on developing a series of game-based resources to promote the teaching of aspects of mathematics through a second language (L2) in primary schools. This study explores the potential of game-based learning (GBL) as a dynamic way to engage and motivate learners, making learning fun and purposeful. The research examines the capacity of GBL resources to provide a meaningful and purposeful context for CLIL. GBL is a powerful learning environment and acts as an effective vehicle to promote the learning of mathematics through an L2. The fun element of GBL can minimise the stress and anxiety associated with mathematics and L2 learning that can create barriers. GBL provides one of the few safe domains where it is acceptable for learners to fail. Games can provide a life-enhancing experience for learners, revolutionizing routinized ways of learning by fusing learning and play. This study argues that playing games requires learners to think creatively to solve mathematical problems, using the L2 in order to progress, which can be associated with the development of higher-order thinking skills and independent learning. GBL requires learners to engage appropriate cognitive processes with increased speed of processing, sensitivity to environmental inputs, or flexibility in allocating cognitive and perceptual resources. At surface level, GBL resources provide opportunities for learners to learn to do things. Games that fuse subject content and appropriate learning objectives have the potential to make learning academic subjects more learner-centered, easier, more enjoyable, more stimulating, and more engaging, and therefore more effective, while promoting learner autonomy. Data include observations of the children playing the games and follow-up group interviews. Given that learning as a cognitive event cannot be directly observed or measured, a Cognitive Discourse Functions (CDF) construct was used to frame the research, to map the development of learners' conceptual understanding in an L2 context, and as a framework to observe the discursive interactions that occur learner to learner and between learner and teacher. Cognitively, the children were required to engage with mathematical content, concepts, and language to make decisions quickly, to engage with the gameplay to reason, solve, and overcome problems, and to learn through experimentation. The visual elements of the games supported the learning of new concepts. Children recognised the value of the games in consolidating their mathematical thinking and developing their understanding of new ideas. The games afforded them time to think and reflect. The teachers affirmed that the games provided meaningful opportunities for the learners to practise the language. The findings of this research support the view that using the game-based resources supported children's grasp of mathematical ideas and their confidence and ability to use the L2. Engaging with the content and language through the games led to deeper learning.
Keywords: CLIL, gaming, language, mathematics
4096 Carboxylic Acid-Functionalized Multi-Walled Carbon Nanotubes-Polyindole/Ti2O3 Nanocomposite: Electrochemical Nanomolar Detection of α-Lipoic Acid in Vegetables
Authors: Ragu Sasikumar, Palraj Ranganathan, Shen-Ming Chen, Syang-Peng Rwei
Abstract:
A highly sensitive and selective α-lipoic acid (ALA) sensor based on a functionalized multi-walled carbon nanotubes-polyindole/Ti2O3 (f-MWCNTs-PIN/Ti2O3) nanocomposite-modified glassy carbon electrode (GCE) was developed. The fabricated f-MWCNTs-PIN/Ti2O3/GCE displayed an enhanced voltammetric response toward ALA oxidation relative to that of the f-MWCNTs/GCE, f-MWCNTs-PIN/GCE, Ti2O3/GCE, and bare GCE. Under optimum conditions, the f-MWCNTs-PIN/Ti2O3/GCE showed a wide linear range at ALA concentrations of 0.39-115.8 µM, with a limit of detection of 12 nM and a sensitivity of about 6.39 µA µM-1 cm-2. The developed sensor showed anti-interference capability, reproducibility, good repeatability, and operational stability. The practical applicability of the sensor has been confirmed in vegetable samples.
Keywords: f-MWCNT, polyindole, Ti2O3, Alzheimer's diseases, ALA sensor
4095 Low-Cost, Portable Optical Sensor with Regression Algorithm Models for Accurate Monitoring of Nitrites in Environments
Authors: David X. Dong, Qingming Zhang, Meng Lu
Abstract:
Nitrites enter waterways as runoff from croplands and are discharged from many industrial sites. Excessive nitrite inputs to water bodies lead to eutrophication. On-site rapid detection of nitrite is of increasing interest for managing fertilizer application and monitoring water source quality. Existing methods for detecting nitrites use spectrophotometry, ion chromatography, electrochemical sensors, ion-selective electrodes, chemiluminescence, and colorimetric methods. However, these methods either suffer from high cost or provide low measurement accuracy due to their poor selectivity to nitrites. Therefore, it is desirable to develop an accurate and economical method to monitor nitrites in environments. We report a low-cost optical sensor, in conjunction with a machine learning (ML) approach, to enable high-accuracy detection of nitrites in water sources. The sensor works on the principle of measuring molecular absorptions of nitrites at three narrowband wavelengths (295 nm, 310 nm, and 357 nm) in the ultraviolet (UV) region. These wavelengths are chosen because they have relatively high sensitivity to nitrites; low-cost light-emitting diodes (LEDs) and photodetectors are also available at these wavelengths. A regression model is built, trained, and utilized to minimize cross-sensitivities of these wavelengths to the same analyte, thus achieving precise and reliable measurements with various interference ions. The measured absorbance data are input to the trained model, which provides a nitrite concentration prediction for the sample. The sensor is built with i) a miniature quartz cuvette as the test cell that contains a liquid sample under test, ii) three low-cost UV LEDs placed on one side of the cell as light sources, with each LED providing a narrowband light, and iii) a photodetector with a built-in amplifier and an analog-to-digital converter placed on the other side of the test cell to measure the power of transmitted light. This simple optical design allows measuring the absorbance data of the sample at the three wavelengths. To train the regression model, absorbances of nitrite ions and their combinations with various interference ions are first obtained at the three UV wavelengths using a conventional spectrophotometer. Then, the spectrophotometric data are input to different regression algorithm models for training and evaluating high-accuracy nitrite concentration prediction. Our experimental results show that the proposed approach enables instantaneous nitrite detection within several seconds. The sensor hardware costs about one hundred dollars, which is much cheaper than a commercial spectrophotometer. The ML algorithm helps to reduce the average relative errors to below 3.5% over a concentration range from 0.1 ppm to 100 ppm of nitrites. The sensor has been validated by measuring nitrites at three sites in Ames, Iowa, USA. This work demonstrates an economical and effective approach to the rapid, reagent-free determination of nitrites with high accuracy. The integration of the low-cost optical sensor and ML data processing can find a wide range of applications in environmental monitoring and management.
Keywords: optical sensor, regression model, nitrites, water quality
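A minimal sketch of the calibration step described above, fitting a regression model that maps three-wavelength absorbance readings to nitrite concentration. The choice of scikit-learn ridge regression and the synthetic training data are assumptions for illustration, not the authors' actual model.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical calibration set: nitrite concentrations (ppm) and the
# corresponding noisy absorbances at 295, 310, and 357 nm.
c_train = rng.uniform(0.1, 100.0, size=200)
eps = np.array([0.031, 0.024, 0.012])            # assumed per-wavelength sensitivities
A_train = np.outer(c_train, eps) + rng.normal(0.0, 0.01, (200, 3))

model = Ridge(alpha=1e-3).fit(A_train, c_train)

# Predict the concentration for a new three-wavelength reading.
A_new = np.array([[1.6, 1.2, 0.6]])
print(model.predict(A_new))                      # estimated nitrite, ppm
```

In practice, the training set would also include samples spiked with interference ions, which is what lets the regression suppress cross-sensitivities.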
4094 Social Media Idea Ontology: A Concept for Semantic Search of Product Ideas in Customer Knowledge through User-Centered Metrics and Natural Language Processing
Authors: Martin Häusl, Maximilian Auch, Johannes Forster, Peter Mandl, Alexander Schill
Abstract:
In order to survive on the market, companies must constantly develop improved and new products. These products are designed to serve the needs of their customers in the best possible way. The creation of new products is also called innovation and is primarily driven by a company's internal research and development department. However, a new approach has been taking hold for some years now, involving external knowledge in the innovation process. This approach is called open innovation and identifies customer knowledge as the most important source in the innovation process. This paper presents a concept for using social media posts as an external source to support the open innovation approach in its initial phase, the ideation phase. For this purpose, the social media posts are semantically structured with the help of an ontology, and the authors are evaluated using graph-theoretical metrics such as density. For the structuring and evaluation of relevant social media posts, we also use findings from Natural Language Processing, e.g., Named Entity Recognition, specific dictionaries, a triple tagger, and a part-of-speech tagger. The selection and evaluation of the tools used are discussed in this paper. Using our ontology and metrics to structure social media posts enables users to semantically search these posts for new product ideas and thus gain an improved insight into external sources such as customer needs.
Keywords: idea ontology, innovation management, semantic search, open information extraction
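A small sketch of the kind of pipeline the abstract outlines: entity extraction from posts, then a graph over authors and mentioned entities evaluated with a density metric. The libraries (spaCy, NetworkX) and the post structure are assumptions for illustration.

```python
import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")   # assumed general-purpose English model

posts = [                             # hypothetical (author, text) pairs
    ("alice", "The battery of the X200 phone should last two days."),
    ("bob", "Agreed, the X200 needs a bigger battery."),
]

graph = nx.Graph()
for author, text in posts:
    graph.add_node(author, kind="author")
    for ent in nlp(text).ents:                  # Named Entity Recognition
        graph.add_node(ent.text, kind="entity", label=ent.label_)
        graph.add_edge(author, ent.text)        # author mentions entity

# Graph-theoretical evaluation, e.g. the density metric named in the abstract.
print("density:", nx.density(graph))
```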
4093 Temporal Estimation of Hydrodynamic Parameter Variability in Constructed Wetlands
Authors: Mohammad Moezzibadi, Isabelle Charpentier, Adrien Wanko, Robert Mosé
Abstract:
The calibration of hydrodynamic parameters for subsurface constructed wetlands (CWs) is a sensitive process since highly non-linear equations are involved in unsaturated flow modeling. CW systems are engineered systems designed to favour natural treatment processes involving wetland vegetation, soil, and their microbial flora. Their significant efficiency at reducing the ecological impact of urban runoff has recently been proved in the field. Numerical flow modeling in a vertical variably saturated CW is here carried out by implementing the Richards model by means of a mixed hybrid finite element method (MHFEM), particularly well adapted to the simulation of heterogeneous media, and the van Genuchten-Mualem parametrization. For validation purposes, MHFEM results were compared to those of HYDRUS (a software package based on a finite element discretization). As the van Genuchten-Mualem soil hydrodynamic parameters depend on water content, their estimation has been the subject of considerable experimental and numerical study. In particular, the sensitivity analysis performed with respect to the van Genuchten-Mualem parameters reveals a predominant influence of the shape parameters α and n and of the saturated conductivity of the filter on the piezometric heads during saturation and desaturation. Modeling issues arise when the soil reaches oven-dry conditions. Particular attention should also be paid to boundary condition modeling (surface ponding or evaporation) to be able to tackle different sequences of rainfall-runoff events. For proper parameter identification, large field datasets would be needed. As these are usually not available, notably due to the randomness of storm events, we thus propose a simple, robust, and low-cost numerical method for the inverse modeling of the soil hydrodynamic properties. Among the available methods, the variational data assimilation technique introduced by Le Dimet and Talagrand is applied. To that end, a variational data assimilation technique is implemented by applying automatic differentiation (AD) to augment computer codes with derivative computations. Note that very little effort is needed to obtain the differentiated code using the online Tapenade AD engine. Field data were collected for a three-layered CW located in Strasbourg (Alsace, France) at the water's edge of the urban stream Ostwaldergraben, over several months. Identification experiments are conducted by comparing measured and computed piezometric heads by means of a least-squares objective function. The temporal variability of the hydrodynamic parameters is then assessed and analyzed.
Keywords: automatic differentiation, constructed wetland, inverse method, mixed hybrid FEM, sensitivity analysis
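As a rough sketch of the parameter-identification step, the following uses SciPy's least-squares solver in place of the Tapenade-differentiated code the authors employ, and fits the standard van Genuchten retention curve to synthetic observations; in the paper, the objective compares measured and computed piezometric heads instead.

```python
import numpy as np
from scipy.optimize import least_squares

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Standard van Genuchten water-retention curve (h = suction head > 0)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

# Hypothetical observations: suction heads (m) and measured water contents.
h_obs = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
theta_obs = np.array([0.42, 0.39, 0.30, 0.21, 0.12])

def residuals(p):
    theta_r, theta_s, alpha, n = p
    return van_genuchten_theta(h_obs, theta_r, theta_s, alpha, n) - theta_obs

# Least-squares identification of (theta_r, theta_s, alpha, n).
fit = least_squares(residuals, x0=[0.05, 0.45, 2.0, 1.5],
                    bounds=([0.0, 0.2, 0.01, 1.01], [0.2, 0.6, 20.0, 5.0]))
print(fit.x)
```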
4092 Automatic LV Segmentation with K-means Clustering and Graph Searching on Cardiac MRI
Authors: Hae-Yeoun Lee
Abstract:
Quantification of cardiac function is performed by calculating blood volume and ejection fraction in routine clinical practice. However, this work has been performed by manual contouring, which requires computational cost and varies with the observer. In this paper, an automatic left ventricle segmentation algorithm for cardiac magnetic resonance images (MRI) is presented. Using knowledge of cardiac MRI, a K-means clustering technique is applied to segment the blood region on a coil-sensitivity-corrected image. Then, a graph searching technique is used to correct segmentation errors from coil distortion and noise. Finally, blood volume and ejection fraction are calculated. Using cardiac MRI from 15 subjects, the presented algorithm is tested and compared with manual contouring by experts, showing outstanding performance.
Keywords: cardiac MRI, graph searching, left ventricle segmentation, K-means clustering
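A minimal sketch of the intensity-clustering step (the cluster count and the use of scikit-learn are assumptions, and the subsequent graph-search correction is not reproduced here):

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in for a coil-sensitivity-corrected short-axis slice.
rng = np.random.default_rng(1)
image = rng.random((128, 128))

# Cluster pixel intensities; the blood pool is bright on cine MRI.
k = 3  # assumed number of intensity classes
labels = KMeans(n_clusters=k, n_init=10, random_state=0) \
    .fit_predict(image.reshape(-1, 1)).reshape(image.shape)

# Take the brightest cluster as the blood-pool candidate mask.
means = [image[labels == c].mean() for c in range(k)]
blood_mask = labels == int(np.argmax(means))
print("candidate blood-pool pixels:", blood_mask.sum())
```

The graph-searching stage would then refine this mask along the myocardial boundary before volumes and ejection fraction are computed.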
4091 Role of Hyperbaric Oxygen Therapy in Management of Diabetic Foot
Authors: Magdy Al Shourbagi
Abstract:
Diabetes mellitus is the commonest cause of neuropathy. The common pattern is a distal symmetrical sensory polyneuropathy associated with autonomic disturbances. Less often, diabetes mellitus is responsible for a focal or multifocal neuropathy. Common causes of non-healing of the diabetic foot are infection and ischemia. Diabetes mellitus is associated with defective cellular and humoral immunity. In particular, decreased phagocytosis, decreased chemotaxis, impaired bacterial killing, and abnormal lymphocytic function result in a reduced inflammatory reaction and defective wound healing. Hyperbaric oxygen therapy is defined by the Undersea and Hyperbaric Medical Society as a treatment in which a patient intermittently breathes 100% oxygen while the treatment chamber is pressurized to a pressure greater than sea level (1 atmosphere absolute). The pressure increase may be applied in mono-place (single-person) or multi-place chambers. Multi-place chambers are pressurized with air, with oxygen given via face mask or endotracheal tube, while mono-place chambers are pressurized with oxygen. Oxygen plays an important role in the physiology of wound healing. Hyperbaric oxygen therapy can raise tissue oxygen tensions to levels where wound healing can be expected. HBOT increases the killing ability of leucocytes; it is also lethal for certain anaerobic bacteria and inhibits toxin formation in many other anaerobes. Multiple anecdotal reports and studies of HBO therapy in diabetic patients report that HBO can be an effective adjunct therapy in the management of diabetic foot wounds and is associated with better functional outcomes.
Keywords: hyperbaric oxygen therapy, diabetic foot, neuropathy, multiplace chambers
4090 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors' notes reflect their impressions, attitudes, clinical sense, and opinions about patients' conditions and progress, as well as other information that is essential for doctors' daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors' decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper makes two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by clinicalBERT or Bio_Discharge_Summary_BERT. The model, which is based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embedding based on a pre-trained clinical model to balance the samples in each class in the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
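A condensed sketch of how such a fine-tuning setup is typically wired with Hugging Face Transformers; the checkpoint name, label count, and hyperparameters are illustrative assumptions, not the authors' released configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed clinical checkpoint; the paper builds on clinicalBERT-style models.
ckpt = "emilyalsentzer/Bio_ClinicalBERT"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSequenceClassification.from_pretrained(ckpt, num_labels=2)

notes = ["Patient remains intubated, worsening hypoxia overnight."]
labels = torch.tensor([0])   # hypothetical label: 0 = negative clinical sentiment

batch = tokenizer(notes, truncation=True, padding=True, return_tensors="pt")
out = model(**batch, labels=labels)

# One optimization step of the fine-tuning loop.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
out.loss.backward()
optimizer.step()
print(float(out.loss))
```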
4089 Optimal Policies in a Two-Level Supply Chain with Defective Product and Price Dependent Demand
Authors: Samira Mohabbatdar, Abbas Ahmadi, Mohsen S. Sajadieh
Abstract:
This paper deals with a two-level supply chain consisting of one manufacturer and one retailer for a single type of product. The demand function of the customers depends on price. We consider an integrated production-inventory system where the manufacturer processes raw materials in order to deliver finished products of imperfect quality to the retailer. The retailer then inspects the products and delivers perfect products to customers. The proposed model is based on the joint total profit of both the manufacturer and the retailer, and it determines the optimal ordering lot size, number of shipments, and selling price of the retailer. A numerical example is provided to analyse and illustrate the behaviour and application of the model. Finally, a sensitivity analysis of the key parameters is presented to test the feasibility of the model.
Keywords: supply chain, pricing policy, defective quality, joint economic lot sizing
4088 Geographic Information System (GIS) for Structural Typology of Buildings
Authors: Néstor Iván Rojas, Wilson Medina Sierra
Abstract:
The management of spatial information through a Geographic Information System (GIS) is described for some neighborhoods in the city of Tunja, in relation to the structural typology of the buildings. The use of GIS provides tools that facilitate the capture, processing, analysis, and dissemination of cartographic information produced by the quality evaluation of the building classification, and allows the development of a method that unifies and standardizes information processes. The project aims to generate a geographic database that is useful to the entities responsible for planning, disaster prevention, and care for vulnerable populations; it also seeks to be a basis for seismic vulnerability studies that can contribute to a study of urban seismic microzonation. The methodology consists of capturing the plat, including road naming, neighborhoods, blocks, and buildings, to which are added as attributes the products of the evaluation of each building, such as the number of inhabitants and classification, the year of construction, the predominant structural system, the type of mezzanine board and its state of favorability, the presence of geotechnical problems, the type of cover, the use of each building, and damage to structural and non-structural elements. The above data are tabulated in a spreadsheet keyed by cadastral number, through which they are systematically linked to the respective building, which also carries that attribute. A geo-referenced database is obtained, from which graphical outputs are generated, producing thematic maps for each evaluated attribute, which clearly show the spatial distribution of the information obtained. Using GIS offers important advantages for spatial information management and facilitates consultation and updating. The usefulness of the project is recognized as a basis for studies on issues of planning and prevention.
Keywords: microzonation, buildings, geo-processing, cadastral number
4087 Text Analysis to Support Structuring and Modelling a Public Policy Problem - Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics, therefore, is key to finding holistic solutions. Analysis of text-based information on the policy problem, using Natural Language Processing (NLP) and text analysis techniques, can support the modelling of public policy problem situations in a more objective way, based on domain experts' knowledge and scientific evidence. The objective of this study is to support the modelling of public policy problem situations using text analysis of verbal descriptions of the problem. We propose a formal methodology for the analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships, and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which has so far been done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
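As a toy illustration of the inference-extraction step the outline describes, the following pulls cause-effect pairs from text with cue patterns and assembles them into a causal graph; the pattern list and libraries are illustrative assumptions, not the authors' algorithm.

```python
import re
import networkx as nx

# Hypothetical cue patterns for cause-effect statements.
PATTERNS = [
    re.compile(r"(?P<cause>[\w ]+?) (?:leads to|causes|increases) (?P<effect>[\w ]+)", re.I),
]

text = "Unemployment leads to poverty. Poverty increases crime rates."

causal = nx.DiGraph()
for sentence in text.split(". "):
    for pat in PATTERNS:
        m = pat.search(sentence)
        if m:
            cause = m.group("cause").strip().lower()
            effect = m.group("effect").strip().lower().rstrip(".")
            causal.add_edge(cause, effect)   # directed cause -> effect link

# Key variables by connectivity; the edges map onto the causal diagram.
print(sorted(causal.degree, key=lambda kv: -kv[1]))
print(list(causal.edges))
```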
4086 Synthesis of Carbon Nanotubes from Coconut Oil and Fabrication of a Non Enzymatic Cholesterol Biosensor
Authors: Mitali Saha, Soma Das
Abstract:
The fabrication of nanoscale materials for use in chemical sensing, biosensing, and biological analyses has proven a promising avenue in the last few years. Cholesterol has aroused considerable interest in recent years on account of its being an important parameter in clinical diagnosis. There is a strong positive correlation between high serum cholesterol levels and arteriosclerosis, hypertension, and myocardial infarction. Enzyme-based electrochemical biosensors have shown high selectivity and excellent sensitivity, but the enzyme is easily denatured during its immobilization procedure, and its activity is also affected by temperature, pH, and toxic chemicals. Besides, the reproducibility of enzyme-based sensors is not very good, which further restricts the application of cholesterol biosensors. It has been demonstrated that carbon nanotubes can promote electron transfer with various redox-active proteins, ranging from cytochrome c to glucose oxidase with its deeply embedded redox center. In continuation of our earlier work on the synthesis and applications of carbon- and metal-based nanoparticles, we report here the synthesis of carbon nanotubes (CCNT) by burning coconut oil under an insufficient flow of air using an oil lamp. The soot was collected from the top portion of the flame, where the temperature was around 650 °C, then purified, functionalized, and characterized by SEM, p-XRD, and Raman spectroscopy. The SEM micrographs showed the formation of tubular structures of CCNT having diameters below 100 nm. The XRD pattern indicated the presence of two predominant peaks at 25.2° and 43.8°, which correspond to the (002) and (100) planes of CCNT, respectively. The Raman spectrum (514 nm excitation) showed a band at 1600 cm-1 (G-band), related to the vibration of sp2-bonded carbon, and at 1350 cm-1 (D-band), responsible for the vibrations of sp3-bonded carbon. A non-enzymatic cholesterol biosensor was then fabricated on an insulating Teflon material containing three silver wires at the surface, covered by CCNT obtained from coconut oil. Here, the CCNTs worked as the working as well as the counter electrode, whereas the reference electrode and electric contacts were made of silver. The dimensions of the electrode were 3.5 cm × 1.0 cm × 0.5 cm (length × width × height), and it is ideal for working with 50 µL volumes like the standard screen-printed electrodes. The voltammetric behavior of cholesterol at the CCNT electrode was investigated by cyclic voltammetry and differential pulse voltammetry using 0.001 M H2SO4 as electrolyte. The influence of experimental parameters such as pH, accumulation time, and scan rate on the peak currents of cholesterol was optimized. Under optimum conditions, the peak current was found to be linear in the cholesterol concentration range from 1 µM to 50 µM, with a sensitivity of ~15.31 μA μM-1 cm-2, a lower detection limit of 0.017 µM, and a response time of about 6 s. The long-term storage stability of the sensor was tested for 30 days, and the current response was found to be ~85% of its initial response after 30 days.
Keywords: coconut oil, CCNT, cholesterol, biosensor
4085 Solvent Extraction and Spectrophotometric Determination of Palladium(II) Using P-Methylphenyl Thiourea as a Complexing Agent
Authors: Shashikant R. Kuchekar, Somnath D. Bhumkar, Haribhau R. Aher, Bhaskar H. Zaware, Ponnadurai Ramasami
Abstract:
A precise, sensitive, rapid, and selective method for the solvent extraction and spectrophotometric determination of palladium(II) using para-methylphenyl thiourea (PMPT) as an extractant is developed. Palladium(II) forms a yellow colored complex with PMPT which shows an absorption maximum at 300 nm. The colored complex obeys Beer's law up to 7.0 µg ml-1 of palladium. The molar absorptivity and Sandell's sensitivity were found to be 8.486 × 10³ l mol-1 cm-1 and 0.0125 μg cm-2, respectively. The optimum conditions for the extraction and determination of palladium have been established by monitoring the various experimental parameters. The precision of the method has been evaluated, and the relative standard deviation has been found to be less than 0.53%. The proposed method is free from interference from a large number of foreign ions. The method has been successfully applied to the determination of palladium from alloys and synthetic mixtures corresponding to alloy samples.
Keywords: solvent extraction, PMPT, Palladium (II), spectrophotometry
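For reference, the spectrophotometric quantities reported above are linked by the Beer-Lambert law and the standard definition of Sandell's sensitivity (the analyte mass per unit area giving an absorbance of 0.001):

```latex
A = \varepsilon\, l\, c, \qquad
S = \frac{M_{\mathrm{Pd}}}{\varepsilon}
  = \frac{106.42\ \mathrm{g\,mol^{-1}}}{8.486 \times 10^{3}\ \mathrm{l\,mol^{-1}\,cm^{-1}}}
  \approx 0.0125\ \mu\mathrm{g\,cm^{-2}},
```

where A is the absorbance at 300 nm, ε the molar absorptivity, l the path length, and c the molar concentration; the computed value matches the reported Sandell's sensitivity.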
4084 Effect of Anisotropy on Steady Creep in a Whisker Reinforced Functionally Graded Composite Disc
Authors: V. K. Gupta, Tejeet Singh
Abstract:
In many whisker-reinforced composites, anisotropy may result from material flow during processing operations such as forging, extrusion, etc. The consequence of anisotropy, introduced during processing of the disc material, has been investigated for the steady state creep deformations of a rotating disc. The disc material is assumed to undergo plastic deformations according to Hill's anisotropic yield criterion. Steady state creep has been analyzed in a constant-thickness rotating disc made of functionally graded 6061Al-SiCw (where the subscript 'w' stands for whisker) using Hill's criterion. The content of reinforcement (SiCw) in the disc is assumed to decrease linearly from the inner to the outer radius. The stresses and strain rates in the disc are estimated by solving the force equilibrium equation along with the constitutive equations describing multi-axial creep. The results obtained for the anisotropic FGM disc have been compared with those estimated for an isotropic FGM disc having the same average whisker content. The anisotropic constants appearing in Hill's yield criterion have been obtained from available experimental results. The results show that the presence of anisotropy reduces the tangential stress in the middle of the disc, but near the inner and outer radii the tangential stress is higher when compared to the isotropic disc. On the other hand, the steady state creep rates in the anisotropic disc are reduced significantly over the entire disc radius, with the maximum reduction observed at the inner radius. Further, in the presence of anisotropy, the distribution of strain rate becomes relatively uniform over the entire disc, which may be responsible for reducing the extent of distortion in the disc.
Keywords: anisotropy, creep, functionally graded composite, rotating disc
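For reference, Hill's anisotropic yield criterion invoked above is conventionally written (textbook form; F, G, H, L, M, N are the anisotropic constants the authors fit from experimental results):

```latex
F(\sigma_y - \sigma_z)^2 + G(\sigma_z - \sigma_x)^2 + H(\sigma_x - \sigma_y)^2
+ 2L\,\tau_{yz}^2 + 2M\,\tau_{zx}^2 + 2N\,\tau_{xy}^2 = 1,
```

which reduces to the von Mises criterion when F = G = H and L = M = N = 3F.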
4083 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques
Authors: Gizem Eser Erdek
Abstract:
This study investigates the prediction of the remaining life of industrial cutting tools used in the production process with deep learning methods. When the life of cutting tools decreases, they damage the raw material they are processing. The aim is to predict the remaining life of the cutting tool based on the damage it causes to the raw material. To this end, hole photos were collected from the hole-drilling machine for 8 months. The photos were labeled into 5 classes according to hole quality. In this way, the problem was transformed into a classification problem. Using the prepared dataset, a model was created with convolutional neural networks, a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the dataset. A hybrid model using convolutional neural networks and support vector machines was also used for comparison. When all models are compared, the model using convolutional neural networks gives successful results with a 74% accuracy rate. In preliminary studies, the dataset was arranged to include only the best and worst classes, and the study gave ~93% accuracy when the binary classification model was applied. The results of this study show that the remaining life of cutting tools can be predicted by deep learning methods based on the damage to the raw material. Experiments have proven that deep learning methods can be used as an alternative for cutting tool life estimation.
Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VggNet
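A compact sketch of a five-class hole-quality classifier of the kind described; the layer sizes, input resolution, and training details are assumptions, since the abstract does not spell out the exact architecture.

```python
import torch
import torch.nn as nn

class HoleQualityCNN(nn.Module):
    """Toy 5-class classifier for hole photos (assumed 64x64 greyscale input)."""
    def __init__(self, n_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = HoleQualityCNN()
x = torch.randn(8, 1, 64, 64)                     # a dummy batch of hole photos
loss = nn.CrossEntropyLoss()(model(x), torch.randint(0, 5, (8,)))
loss.backward()                                   # backward pass of one training step
print(float(loss))
```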
4082 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry
Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood
Abstract:
The flow over a backward-facing step is characterized by the presence of flow separation, recirculation, and reattachment, for a simple geometry. This type of fluid behaviour takes place in many practical engineering applications, hence the reason for being investigated. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry, or hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique to such flows is investigated, at various Reynolds numbers corresponding to different flow regimes. The use of this measuring technique in separated flows is very difficult to find in the literature. Besides, most of the situations where the Reynolds number effect is evaluated in separated flows are in numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important in resolving turbulence. The ADV Nortek Vectrino+ was used to characterize the flow, in a recirculating laboratory flume, at various Reynolds numbers (Reh = 3738, 5452, 7908, and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared to those obtained using other measuring techniques. To compare results with other researchers, the step height, expansion ratio, and the positions upstream and downstream of the step were reproduced. The post-processing of the ADV records was performed using a customized numerical code, which implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density for the stream-wise horizontal velocity component. The normalized mean stream-wise velocity profiles, skin-friction coefficients, and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses, and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness. Besides, the errors obtained in the uncertainty analysis were relatively low, in general. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and a good agreement was found. The ADV technique proved able to characterize the flow properly over a backward-facing step, although additional caution should be taken for measurements very close to the bottom. The ADV measurements showed reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flows. Despite being a relatively inexpensive technique, acoustic Doppler velocimetry can be used with confidence in separated flows and is thus very useful for numerical model validation. However, it is very important to perform adequate post-processing of the acquired data to obtain low noise levels, thus decreasing the uncertainty.
Keywords: ADV, experimental data, multiple Reynolds number, post-processing
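A small sketch of the noise-level check mentioned above: estimating the power spectral density of the stream-wise velocity with Welch's method and reading off the high-frequency noise plateau. The sampling rate, record, and noise band are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical ADV record: 200 Hz stream-wise velocity series (m/s).
fs = 200.0
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(2)
u = 0.30 + 0.05 * np.sin(2 * np.pi * 1.5 * t) + 0.02 * rng.standard_normal(t.size)

# PSD of the velocity fluctuations; ADV noise shows up as a flat
# white-noise plateau at high frequencies.
f, Puu = welch(u - u.mean(), fs=fs, nperseg=4096)
noise_floor = Puu[f > 50.0].mean()   # assumed noise band; instrument-specific
print(f"noise floor ~ {noise_floor:.2e} (m/s)^2/Hz")
```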
4081 How to Talk about It without Talking about It: Cognitive Processing Therapy Offers Trauma Symptom Relief without Violating Cultural Norms
Authors: Anne Giles
Abstract:
Humans naturally wish they could forget traumatic experiences. To help prevent future harm, however, the human brain has evolved to retain data about experiences of threat, alarm, or violation. When given compassionate support and assistance with thinking helpfully and realistically about traumatic events, most people can adjust to experiencing hardships, albeit with residual sad, unfortunate memories. Persistent, recurrent, intrusive memories, difficulty sleeping, emotion dysregulation, and avoidance of reminders, however, may be symptoms of Post-traumatic Stress Disorder (PTSD). Brain scans show that PTSD affects brain functioning. We currently have no physical means of restoring the system of brain structures and functions involved with PTSD. Medications may ease some symptoms but not others. However, forms of "talk therapy" with cognitive components have been found by researchers to reduce, even resolve, a broad spectrum of trauma symptoms. Many cultures have taboos against talking about hardships. Individuals may present themselves to mental health care professionals with severe, disabling trauma symptoms but, because of cultural norms, be unable to speak about them. In China, for example, relationship expectations may include the belief, "Bad things happening in the family should stay in the family (jiāchǒu bùkě wàiyán 家丑不可外扬)." The concept of "family (jiā 家)" may include partnerships, close and extended families, communities, companies, and the nation itself. In contrast to many trauma therapies, Cognitive Processing Therapy (CPT) for Post-traumatic Stress Disorder asks its participants to focus not on "what" happened but on "why" they think the trauma(s) occurred. The question "why" activates and exercises cognitive functioning. Brain scans of individuals with PTSD reveal executive functioning portions of the brain inadequately active, with emotion centers overly active. CPT conceptualizes PTSD as a network of cognitive distortions that keep an individual "stuck" in this under-functioning and over-functioning dynamic. Through asking participants forms of the question "why," plus offering a protocol for examining answers and relinquishing unhelpful beliefs, CPT assists individuals in consciously reactivating the cognitive, executive functions of their brains, thus restoring normal functioning and reducing distressing trauma symptoms. The culturally sensitive components of CPT that allow people to "talk about it without talking about it" may offer the possibility for worldwide relief from symptoms of trauma.
Keywords: cognitive processing therapy (CPT), cultural norms, post-traumatic stress disorder (PTSD), trauma recovery
4080 Geographic Information System and Dynamic Segmentation of Very High Resolution Images for the Semi-Automatic Extraction of Sandy Accumulation
Authors: A. Bensaid, T. Mostephaoui, R. Nedjai
Abstract:
A considerable area of Algerian land is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of increases in the irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has become particularly accentuated. The extent of degradation in the arid region of the Algerian Mecheria department has generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, as well as sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons based on the numerical processing of LANDSAT images (5, 7, and 8) of three scenes, 197/37, 198/36, and 198/37, for the year 2020. As a second step, we explore the use of geospatial techniques to monitor the progression of sand dunes onto developed (urban) land as well as the formation of sandy accumulations (dunes, dune fields, nebkhas, barchans, etc.). For this purpose, this study made use of a semi-automatic processing method for the dynamic segmentation of images with very high spatial resolution (SENTINEL-2 and Google Earth). This study was able to demonstrate that urban lands under current conditions are located in sand transit zones that are mobilized by winds from the northwest and southwest directions.
Keywords: land development, GIS, segmentation, remote sensing
4079 Measuring the Effect of Continuous Performance Test-3 Administration on Regional Cerebral Blood Flow with Single-Photon Emission Computed Tomography in Adult ADHD
Authors: Claire Stafford, Charles Golden, Daniel Amen, Kristen Willeumier
Abstract:
The aim of this study is to investigate the effect of the administration of the Conners Continuous Performance Test (CPT-3) on cerebral blood flow (CBF) in adults with ADHD. The data for this study were derived from a large SPECT database. Participants in the ADHD group (n=81, mean age=37.97) were similar to those in the healthy control group (n=8503, mean age=41.86). All participants were assessed for cerebral blood flow levels before and after CPT-3 administration. Both age and gender were treated as covariates. Multiple 2-by-2 ANCOVAs with repeated measures were conducted with sphericity assumed. The main effects of CPT-3 administration on CBF levels were significant in the left and right sides of the frontal and occipital lobes and in the right temporal lobe. The main effects of ADHD diagnosis were significant in all brain areas assessed. The interaction between CPT-3 administration and ADHD diagnosis was significant in the left and right sides of the limbic system, the basal ganglia, the frontal lobe, and the occipital lobe. Post hoc tests with a Bonferroni adjustment revealed that CBF levels increased following CPT-3 administration, but less so in the ADHD group. Individuals had higher levels of CBF following the administration of the CPT-3. Due to the significant interaction, we can infer that an ADHD diagnosis changes the effect of CPT-3 administration on CBF levels. This is consistent with our hypothesis, considering that the CPT-3 is a test of sustained attention, a common challenge for individuals with ADHD. The aforementioned interaction was not found to be significant in the parietal lobe. This may be due to the nature of the CPT-3, which does not require an integration of sensory information.
Keywords: SPECT, ADHD, conners continuous performance test, cerebral blood flow
4078 Design and Evaluation of a Fully-Automated Fluidized Bed Dryer for Complete Drying of Paddy
Authors: R. J. Pontawe, R. C. Martinez, N. T. Asuncion, R. V. Villacorte
Abstract:
Drying of high-moisture paddy remains a major problem in the Philippines, especially during inclement weather conditions. To alleviate the problem, mechanical dryers such as flat-bed and recirculating batch-type dryers have been used. However, drying to 14% (wet basis) final moisture content takes 10-12 hours and is tedious, which is not ideal for handling high-moisture paddy. A fully-automated pilot-scale fluidized bed drying system with a capacity of 500 kilograms per hour was evaluated using high-moisture paddy. The developed fluidized bed dryer was evaluated using four drying temperatures and two variations in fluidization time at a constant airflow, static pressure, and tempering period. Complete drying of paddy with ≥28% (w.b.) initial moisture content was attained after 2 passes of fluidized-bed drying at 2 minutes of exposure to a 70 °C drying temperature and 4.9 m/s superficial air velocity, followed by a 60-minute ambient-air tempering period (30 min without ventilation and 30 min with air ventilation), for a total drying time of 2.07 h. Around 82% of the normal mechanical drying time was saved at the 70 °C drying temperature. The drying cost was calculated to be P0.63 per kilogram of wet paddy. Specific heat energy consumption was only 2.84 MJ/kg of water removed. The head rice yield recovery of the dried paddy passed the Philippine Agricultural Engineering Standards. Sensory evaluation showed that the color and taste of the samples dried in the fluidized bed dryer were comparable to air-dried paddy. The optimum drying parameters for the fluidized bed dryer are a 70 °C drying temperature, 2 min fluidization time, 4.9 m/s superficial air velocity, 10.16 cm grain depth, and a 60 min ambient-air tempering period.
Keywords: drying, fluidized bed dryer, head rice yield, paddy