Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12481

9961 Limosilactobacillus Fermentum from Buffalo Milk Is Suitable for Potential Biotechnological Process Development

Authors: Sergio D’Ambrosio, Azza Dobous, Chiara Schiraldi, Donatella Cimini

Abstract:

Probiotics are living microorganisms that confer beneficial effects when consumed. Lactic acid bacteria and bifidobacteria are among the most representative strains assessed as probiotics and exploited as food supplements. Numerous studies have demonstrated their potential as therapeutic candidates for a variety of conditions (restoring gut flora, lowering cholesterol, immune response enhancement, anti-inflammatory and antioxidant activities). These beneficial actions are also due to biomolecules produced by probiotics, such as exopolysaccharides (EPSs), which show numerous beneficial properties such as antimicrobial, antitumor, anti-biofilm, antiviral and immunomodulatory activities. Limosilactobacillus fermentum is a widely studied probiotic; however, few data are available on the development of fermentation and downstream processes for the large-scale production of viable biomass for industrial applications, or on purification processes for EPSs at an industrial scale. For this purpose, an L. fermentum strain was isolated from buffalo milk and used as a test example for biotechnological process development. The strain was able to produce up to 10⁹ CFU/mL on a glucose-based semi-defined medium deprived of animal-derived raw materials up to the pilot scale (150 L), demonstrating improved results compared to commonly used, although industrially unsuitable, media rich in casein and beef extract. Biomass concentration via microfiltration on hollow fibers and subsequent spray-drying allowed the recovery of about 5.7 × 10¹⁰ CFU/g of powder of viable cells, indicating strain resistance to harsh processing conditions. Overall, these data demonstrate the possibility of obtaining and maintaining adequate levels of viable L. fermentum cells by using a simple approach that is potentially suitable for industrial development. A downstream EPS purification protocol based on ultrafiltration, precipitation and activated charcoal treatments yielded recovered polysaccharides with a purity of about 70-80%.

Keywords: probiotics, fermentation, exopolysaccharides (EPSs), purification

Procedia PDF Downloads 67
9960 Probabilistic Safety Assessment of Koeberg Spent Fuel Pool

Authors: Sibongiseni Thabethe, Ian Korir

Abstract:

The effective management of spent fuel pool (SFP) safety has been raised as one of the emerging issues for further enhancing nuclear installation safety after the Fukushima accident on March 11, 2011. Before then, SFP safety-related issues had mainly focused on (a) controlling the configuration of the fuel assemblies in the pool with no loss of pool coolant and (b) ensuring adequate pool storage space to prevent fuel criticality owing to chain reactions of the fission products, together with sufficient neutron absorption to keep the fuel cool. A probabilistic safety assessment (PSA) was performed using the Systems Analysis Program for Hands-on Integrated Reliability Evaluations (SAPHIRE) computer code. Event and fault tree analysis was done to develop a PSA model for the Koeberg SFP. We present preliminary PSA results for events that lead to boiling and cause fuel uncovering, resulting in possible fuel damage in the Koeberg SFP.
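
To make the fault-tree step concrete, here is a minimal sketch of the kind of cut-set quantification a PSA model encodes; the basic events, probabilities, and cut sets are invented for illustration and are not Koeberg data or SAPHIRE output.

```python
# Rare-event approximation over minimal cut sets (illustrative numbers only).
basic_events = {"loss_of_cooling": 1e-3, "makeup_pump_fails": 5e-3,
                "operator_misses_alarm": 1e-2, "power_supply_fails": 2e-4}

# Each minimal cut set is a combination of basic events that uncovers the fuel.
cut_sets = [("loss_of_cooling", "makeup_pump_fails"),
            ("loss_of_cooling", "operator_misses_alarm"),
            ("power_supply_fails",)]

def cut_set_probability(cs):
    p = 1.0
    for event in cs:
        p *= basic_events[event]  # basic events assumed independent
    return p

# P(top event) is approximated by the sum of the cut-set probabilities.
top_event = sum(cut_set_probability(cs) for cs in cut_sets)
print(f"P(fuel uncovering) approx {top_event:.2e} per demand")
```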

Keywords: computer code, fuel assemblies, probabilistic risk assessment, spent fuel pool

Procedia PDF Downloads 153
9959 Security Design of Root of Trust Based on RISC-V

Authors: Kang Huang, Wanting Zhou, Shiwei Yuan, Lei Li

Abstract:

As information technology develops rapidly, security has become increasingly critical for computer systems. In particular, as cloud computing and the Internet of Things (IoT) continue to gain widespread adoption, computer systems face new security threats and attacks. The Root of Trust (RoT) is the foundation for providing basic trusted computing, and it is used to verify the security and trustworthiness of other components. Designing a reliable Root of Trust and guaranteeing its own security are essential for improving the overall security and credibility of computer systems. In this paper, we discuss the implementation of self-security technology based on a RISC-V Root of Trust at the hardware level. To effectively safeguard the security of the Root of Trust, security safeguard technologies for the Root of Trust are studied. First, a lightweight and secure boot framework is proposed as a security mechanism. Second, two kinds of memory protection mechanisms are built to defend against memory attacks. Moreover, the hardware implementation of the proposed method is also investigated. A series of experiments and tests has been carried out to verify the effectiveness of the proposed method. The experimental results demonstrate that the proposed approach is effective in verifying the integrity of the Root of Trust’s own boot ROM, user instructions, and data, ensuring authenticity and enabling the secure boot of the Root of Trust’s own system. Additionally, our approach provides memory protection against certain types of memory attacks, such as cache leaks and tampering, and ensures the security of root-of-trust sensitive information, including keys.
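
A minimal sketch of the integrity check at the heart of a secure boot stage follows; the image layout and golden digest are assumptions for illustration (the digest shown is simply the SHA-256 of an empty image), and a real RISC-V RoT would use hardware-held keys and signatures rather than a bare hash compare.

```python
import hashlib
import hmac

# Digest provisioned at manufacturing time (here: SHA-256 of an empty image).
GOLDEN_DIGEST = bytes.fromhex(
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855")

def verify_boot_rom(image: bytes) -> bool:
    """Hash the boot ROM image and compare in constant time before
    transferring control to it."""
    return hmac.compare_digest(hashlib.sha256(image).digest(), GOLDEN_DIGEST)

print("continue boot" if verify_boot_rom(b"") else "integrity failure: halt")
```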

Keywords: root of trust, secure boot, memory protection, hardware security

Procedia PDF Downloads 173
9958 Preparation of Nano-Scaled LiNbO3 by Polyol Method

Authors: Gabriella Dravecz, László Péter, Zsolt Kis

Abstract:

The growth of optical LiNbO3 single crystals and their physical and chemical properties are well known on the macroscopic scale. Nowadays, rare-earth-doped single crystals have become important for coherent quantum optical experiments: electromagnetically induced transparency, slowing down of light pulses, and coherent quantum memory. The expansion of applications increasingly requires the production of nano-scaled LiNbO3 particles. For example, rare-earth-doped nano-scaled particles of lithium niobate can act as single-photon sources, which can form the basis of a coding system for quantum computers, providing complete inaccessibility to strangers. The polyol method is a chemical synthesis in which oxide formation occurs instead of hydroxide formation because of the high temperature. Moreover, the polyol medium limits the growth and agglomeration of the grains, producing particles with diameters of 30-200 nm. In this work, nano-scaled LiNbO3 was prepared by the polyol method. The starting materials (niobium oxalate and LiOH) were diluted in H2O2. The mixture was then suspended in ethylene glycol and heated to about the boiling point of the mixture with intensive stirring. After thermal equilibrium was reached, the mixture was kept at this temperature for 4 hours. The suspension was cooled overnight, centrifuged, and the particles were filtered. A Dynamic Light Scattering (DLS) measurement was carried out, and the size of the particles was found to be 80-100 nm. This was confirmed by Scanning Electron Microscope (SEM) investigations. SEM element analysis showed a large amount of Nb in the sample. The production of LiNbO3 nanoparticles by the polyol method was successful. Agglomeration of the particles was avoided, and a size of 80-100 nm could be reached.

Keywords: lithium-niobate, nanoparticles, polyol, SEM

Procedia PDF Downloads 116
9957 Analysis of Matching Pursuit Features of EEG Signal for Mental Tasks Classification

Authors: Zin Mar Lwin

Abstract:

Brain Computer Interface (BCI) systems have been developed for people who suffer from severe motor disabilities and find it challenging to communicate with their environment. BCI allows them to communicate in a non-muscular way. For communication between human and computer, BCI uses a type of signal called the electroencephalogram (EEG) signal, which is recorded from the human brain by means of electrodes. The EEG signal is an important information source for understanding brain processes in non-invasive BCI. To translate a human's thought, the acquired EEG signal needs to be classified accurately. This paper proposes a typical EEG signal classification system evaluated on a dataset from Purdue University. The Independent Component Analysis (ICA) method, via EEGLAB tools, is used for removing artifacts caused by eye blinks. For feature extraction, the time and frequency features of the non-stationary EEG signals are extracted by the Matching Pursuit (MP) algorithm. The classification of one of five mental tasks is performed by a multi-class Support Vector Machine (SVM). For the SVMs, comparisons have been carried out for both 1-against-1 and 1-against-all methods.
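
As a rough illustration of the pipeline (not the author's code), the sketch below pairs a matching-pursuit-style sparse decomposition, via scikit-learn's OrthogonalMatchingPursuit over a random stand-in dictionary, with both SVM schemes compared in the paper; the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_samples, n_atoms, n_tasks = 100, 256, 64, 5
dictionary = rng.standard_normal((n_samples, n_atoms))  # stand-in for a Gabor dictionary
epochs = rng.standard_normal((n_trials, n_samples))     # stand-in EEG epochs
y = rng.integers(0, n_tasks, n_trials)                  # mental-task labels

# Sparse atom coefficients serve as time-frequency features.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10)
features = np.array([omp.fit(dictionary, x).coef_ for x in epochs])

clf_ovo = SVC().fit(features, y)                       # SVC trains 1-against-1 pairs internally
clf_ovr = OneVsRestClassifier(SVC()).fit(features, y)  # explicit 1-against-all scheme
print(clf_ovo.predict(features[:3]), clf_ovr.predict(features[:3]))
```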

Keywords: BCI, EEG, ICA, SVM

Procedia PDF Downloads 261
9956 Design of Aesthetic Acoustic Metamaterials Window Panel Based on Sierpiński Fractal Triangle for Sound-Silencing with Free Airflow

Authors: Sanjeet Kumar Singh, Shantanu Bhatacharya

Abstract:

The design of a high-efficiency, low-frequency (<1000 Hz) soundproof window or wall absorber that is transparent to airflow is presented. Due to the massive rise in human population and modernization, environmental noise has risen significantly worldwide. Prolonged noise exposure can cause severe physiological and psychological symptoms such as nausea, headaches, fatigue, and insomnia. There has been continuous growth in building construction and infrastructure such as offices, bus stops, and airports due to the urban population. Generally, a ventilated window is used to bring fresh air into the room, but unwanted noise comes along with it. Researchers have used traditional approaches such as noise barrier mats in front of the window or designed the entire window using sound-absorbing materials. However, such solutions are not aesthetically pleasing, and at the same time they are heavy and inadequate for low-frequency noise shielding. To address this challenge, we design a transparent hexagonal panel based on the Sierpiński fractal triangle, which is aesthetically pleasing and demonstrates a normal-incidence sound absorption coefficient of more than 0.96 around 700 Hz and a transmission loss of around 23 dB, while maintaining air circulation through the triangular cutout. Next, we present a fabrication concept for large acoustic panels for large-scale applications, which could help suppress urban noise pollution.

Keywords: acoustic metamaterials, ventilation, urban noise pollution, noise control

Procedia PDF Downloads 99
9955 On the Development of Medical Additive Manufacturing in Egypt

Authors: Khalid Abdelghany

Abstract:

Additive Manufacturing (AM) is the manufacturing technology used to fabricate products directly from CAD models in a very short time and with minimum operation steps. Together with advancements in medical computer modeling, AM has proved to be a very efficient tool to help physicians, orthopedic surgeons and dentists design and fabricate patient-tailored surgical guides, templates and customized implants from the patient’s CT/MRI images. AM, jointly with computer-assisted design/computer-assisted manufacturing (CAD/CAM) technology, has enabled medical practitioners to tailor physical models in a patient- and purpose-specific fashion and has helped in the design and manufacture of templates, appliances and devices with a high range of accuracy using biocompatible materials. In developing countries, there are technical and financial limitations to implementing such advanced tools as an essential portion of medical applications. The CMRDI institute in Egypt has been working in the field of medical additive manufacturing since 2003 and has assisted in the recovery of hundreds of poor patients using these advanced tools. This paper focuses on the surgical and dental use of 3D printing technology in Egypt as a developing country. The presented case studies have been designed and processed using the software tools and additive manufacturing machines at CMRDI through cooperative engineering and medical work. Results showed that the implementation of additive manufacturing tools in a developing country can be successful and economical compared to long treatment plans.

Keywords: additive manufacturing, dental and orthopeadic stents, patient specific surgical tools, titanium implants

Procedia PDF Downloads 296
9954 For Post-traumatic Stress Disorder Counselors in China, the United States, and around the Globe, Cultural Beliefs Offer Challenges and Opportunities

Authors: Anne Giles

Abstract:

Trauma is generally defined as an experience, or multiple experiences, overwhelming a person's ability to cope. Many people recover from the neurobiological, physical, and emotional effects of trauma on their own. For some people, however, troubling symptoms develop over time that can result in distress and disability. This cluster of symptoms is classified as Post-traumatic Stress Disorder (PTSD). People who meet the criteria for PTSD and other trauma-related disorder diagnoses often hold a set of understandable but unfounded beliefs about traumatic events that cause undue suffering. Becoming aware of unhelpful beliefs, termed "cognitive distortions", and challenging them is the realm of Cognitive Behavior Therapy (CBT). A form of CBT found by researchers to be especially effective for PTSD is Cognitive Processing Therapy (CPT). Through the compassionate use of CPT, people identify, examine, challenge, and relinquish unhelpful beliefs, thereby reducing symptoms and suffering. Widely held cultural beliefs can interfere with the progress of recovery from trauma-related disorders. Although highly revered, largely unquestioned, and often stabilizing, cultural beliefs can be founded on simplistic, dichotomous thinking, i.e., things are all right or all wrong, all good or all bad. Reality, however, is nuanced and complex. After studying examples of cultural beliefs from China and the United States and how these might interfere with trauma recovery, trauma counselors can help clients derive criteria for preserving helpful beliefs; discover, examine, and jettison unhelpful beliefs; reduce trauma symptoms; and live their lives more freely and fully.

Keywords: cognitive processing therapy (CPT), cultural beliefs, post-traumatic stress disorder (PTSD), trauma recovery

Procedia PDF Downloads 232
9953 Cross Attention Fusion for Dual-Stream Speech Emotion Recognition

Authors: Shaode Yu, Jiajian Meng, Bing Zhu, Hang Yu, Qiurui Sun

Abstract:

Speech emotion recognition (SER) aims to recognize human subjective emotions through in-depth analysis of audio data. How to comprehensively extract emotional information from speech audio and how to effectively fuse the extracted features remain challenging. This paper presents a dual-stream SER framework that embraces both full training and transfer learning of different networks for thorough feature encoding. Besides, a plug-and-play cross-attention fusion (CAF) module is implemented for valid integration of the dual-stream encoder outputs. The effectiveness of the proposed CAF module is compared to three other fusion modules (feature summation, feature concatenation, and feature-wise linear modulation) on two databases (RAVDESS and IEMOCAP) using different dual-stream encoders (full-training networks, DPCNN or TextRCNN; transfer-learning networks, HuBERT or Wav2Vec2). Experimental results suggest that the CAF module can effectively reconcile conflicts between features from different encoders and outperforms the other three feature fusion modules on the SER task. In the future, the plug-and-play CAF module can be extended for multi-branch feature fusion, and the dual-stream SER framework can be widened to multi-stream data representation to improve recognition performance and generalization capacity.
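
A minimal sketch of what a cross-attention fusion block can look like (dimensions, head count, and pooling are assumptions, not the paper's architecture): each stream attends to the other, and the pooled results are concatenated for the emotion classifier.

```python
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    def __init__(self, dim_a: int, dim_b: int, dim: int = 256, heads: int = 4):
        super().__init__()
        self.proj_a = nn.Linear(dim_a, dim)  # e.g., DPCNN/TextRCNN stream
        self.proj_b = nn.Linear(dim_b, dim)  # e.g., HuBERT/Wav2Vec2 stream
        self.attn_ab = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_ba = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, feats_a: torch.Tensor, feats_b: torch.Tensor) -> torch.Tensor:
        a, b = self.proj_a(feats_a), self.proj_b(feats_b)
        a2b, _ = self.attn_ab(query=a, key=b, value=b)  # stream A attends to B
        b2a, _ = self.attn_ba(query=b, key=a, value=a)  # stream B attends to A
        # Mean-pool over time and concatenate for a downstream classifier head.
        return torch.cat([a2b.mean(dim=1), b2a.mean(dim=1)], dim=-1)

caf = CrossAttentionFusion(dim_a=512, dim_b=768)
fused = caf(torch.randn(8, 10, 512), torch.randn(8, 50, 768))
print(fused.shape)  # torch.Size([8, 512])
```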

Keywords: speech emotion recognition, cross-attention fusion, dual-stream, pre-trained

Procedia PDF Downloads 55
9952 Competition between Verb-Based Implicit Causality and Theme Structure's Influence on Anaphora Bias in Mandarin Chinese Sentences: Evidence from Corpus

Authors: Linnan Zhang

Abstract:

Linguists as well as psychologists have shown great interest in implicit causality in reference processing. However, the most frequently used approaches to this issue are psychological experiments (such as eye tracking or self-paced reading). This research is corpus-based and assisted by the statistical software R. The main focus of the present study is the competition between verb-based implicit causality and the theme structure's influence on anaphora bias in Mandarin Chinese sentences. In Accessibility Theory, it is believed that salience (also known as accessibility) and relevance are two important factors in reference processing. The theme structure, a special syntactic structure in Chinese, determines the salience of an antecedent on the syntactic level, while verb-based implicit causality is a key factor in the relevance between antecedent and anaphora. It is therefore a study of anaphora combining psychology with linguistics. Based on an analysis of sentences from the corpus together with multinomial logistic regression, the major findings of the present study are as follows: 1. When the sentence is stated in a ‘cause-effect’ structure, the theme structure will always be the antecedent regardless of whether forward-biased or backward-biased verbs co-occur; in non-theme structures, the anaphora bias tends to be the opposite of the verb bias. 2. When the sentence is stated in an ‘effect-cause’ structure, the theme structure will not always be the antecedent, and the influence of verb-based implicit causality outweighs that of the theme structure; moreover, the anaphora bias is the same as the verb bias. All the results indicate that implicit causality functions conditionally and that the noun in the theme structure will not be the high-salience antecedent under all circumstances.
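
The statistical step is a standard multinomial logistic regression over coded sentence properties; here is a minimal sketch with synthetic data and a hypothetical coding scheme (the original analysis was run in R):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 300
# Hypothetical predictors: verb bias (0=backward, 1=forward),
# theme structure present (0/1), clause order (0=cause-effect, 1=effect-cause).
X = rng.integers(0, 2, size=(n, 3))
# Antecedent categories, e.g. 0=NP1, 1=NP2, 2=theme-structure noun.
y = rng.integers(0, 3, size=n)

# LogisticRegression fits a multinomial model for multiclass targets.
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.coef_)  # per-predictor log-odds for each antecedent category
```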

Keywords: accessibility theory, anaphora, theme structure, verb-based implicit causality

Procedia PDF Downloads 183
9951 Effect of Extrusion Processing Parameters on Protein in Banana Flour Extrudates: Characterisation Using Fourier-Transform Infrared Spectroscopy

Authors: Surabhi Pandey, Pavuluri Srinivasa Rao

Abstract:

Extrusion processing is a high-temperature short-time (HTST) treatment that can improve protein quality and digestibility while retaining active nutrients. The in-vitro protein digestibility of plant-protein-based foods is generally enhanced by extrusion. The current study aimed to investigate the effect of extrusion cooking on in-vitro protein digestibility (IVPD) and conformational modification of protein in green banana flour extrudates. Green banana flour was extruded through a co-rotating twin-screw extruder, varying the moisture content (10-20%), barrel temperature (60-80 °C) and screw speed (200-300 rpm) at a constant feed rate. Response surface methodology was used to optimise the result for IVPD. Fourier-transform infrared spectroscopy (FTIR) analysis provided a convenient and powerful means to monitor interactions and changes in the functional and conformational properties of the extrudates. Results showed that protein digestibility was highest in the extrudate produced at 80 °C, 250 rpm and 15% feed moisture. FTIR analysis was done for the optimised sample with the highest IVPD. FTIR analysis showed that there were no changes in the primary structure of the protein, while the secondary protein structure changed. In order to explain this behaviour, infrared spectroscopy analysis was carried out, mainly in the amide I and II regions. Moreover, curve-fitting analysis showed the conformational changes produced in the flour due to protein denaturation. The quantitative analysis of the changes in the amide I and II regions provided information about the modifications produced in banana flour extrudates.
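
A minimal sketch of the response-surface step with synthetic data: fit a second-order model of IVPD over the three factors and search a factor grid for the predicted optimum.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
X = np.column_stack([rng.uniform(10, 20, 30),     # moisture content, %
                     rng.uniform(60, 80, 30),     # barrel temperature, °C
                     rng.uniform(200, 300, 30)])  # screw speed, rpm
# Made-up IVPD response: a quadratic surface plus noise.
y = (70 - 0.1 * (X[:, 0] - 15) ** 2 + 0.2 * X[:, 1] + 0.01 * X[:, 2]
     + rng.normal(0, 1, 30))

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

# Predict over a factor grid and report the combination maximising IVPD.
grid = np.column_stack([g.ravel() for g in np.meshgrid(
    np.linspace(10, 20, 11), np.linspace(60, 80, 11), np.linspace(200, 300, 11))])
best = grid[np.argmax(rsm.predict(grid))]
print("predicted optimum (moisture %, temp °C, rpm):", best)
```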

Keywords: extrusion, FTIR, protein conformation, raw banana flour, SDS-PAGE method

Procedia PDF Downloads 143
9950 Lab Support: A Computer Laboratory Class Management Support System

Authors: Eugenia P. Ramirez, Kevin Matthe Caramancion, Mia Eleazar

Abstract:

Getting the attention of students is a constant challenge for instructors/lecturers. Although some networking and entertainment websites are blocked in computer laboratories, these websites have unlimited ways of attracting students. Thus, when an instructor gives a specific set of instructions, some students may not be able to follow the steps sequentially. The instructor has to physically go to the specific remote terminal and show the student the details. Sometimes, during an examination in a laboratory set-up, a proctor may prefer to give detailed, written instructions rather than verbal instructions. Even the mere calling of a specific student at any time will distract the whole class, especially when activities are being performed. What is needed is: application software that is able to lock the student's monitor and at the same time display the instructor's screen; software that is powerful enough to process on its side alone and manipulate a specific user's terminal with free configuration, that is, without restrictions at the server level; and software that is able to send text messages to students, per terminal or in groups. These features are found in LabSupport. This paper outlines the LabSupport application software framework to efficiently manage computer laboratory sessions, including its different modules: screen viewer, demonstration mode, monitor locking system, text messaging, and class management. This paper's ultimate aim is to provide a system that increases instructor productivity.
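
The text-messaging module can be pictured as a one-to-many notice over the lab subnet; a minimal sketch with an assumed port and broadcast address follows (LabSupport's actual transport is not described in the abstract).

```python
import socket

LAB_BROADCAST = ("192.168.1.255", 50007)  # assumed lab subnet and port

def broadcast_notice(text: str) -> None:
    """Instructor side: send one text notice to every listening terminal."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(text.encode("utf-8"), LAB_BROADCAST)

def listen_for_notices(port: int = 50007) -> None:
    """Student side: print every notice received on the lab port."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        while True:
            data, addr = sock.recvfrom(4096)
            print(f"[notice from {addr[0]}] {data.decode('utf-8')}")

# broadcast_notice("Save your work: the exam starts in 5 minutes.")
```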

Keywords: application software, broadcast messaging, class management, locking system

Procedia PDF Downloads 423
9949 The Beneficial Effects of Hydrotherapy for Recovery from Team Sport – A Meta-Analysis

Authors: Trevor R. Higgins

Abstract:

To speed up and enhance recovery from sport, cold water immersion (CWI) and contrast water therapy (CWT) have become common practice within high-level team sport. Initially, research into CWI and CWT protocols and recovery was sparse, and athletes relied solely upon anecdotal support. However, recovery research has since increased, and a number of reviews have been conducted to clarify the scientific evidence. As the nature of physiological stress and the training status of participants impact on results, an opportunity existed to narrow the focus to a more exacting review evaluating hydrotherapy for recovery in team sport. A Boolean logic [AND] keyword search of databases was conducted: SPORTDiscus, AMED, CINAHL and MEDLINE. Data were extracted and standardized mean differences were calculated with 95% CIs. The analysis of pooled data was conducted using a random-effects model, with heterogeneity assessed using I². Twenty-three peer-reviewed papers (n=606) met the criteria. Meta-analysis results indicated CWI was likely beneficial for recovery at 24 h (countermovement jump (CMJ): p=0.05, CI -0.004 to 0.578; all-out sprint: p=0.02, CI -0.056 to 0.801; DOMS: p=0.08, CI -0.092 to 1.936) and at 72 h (accumulated sprinting: p=0.07, CI -0.062 to 1.209; DOMS: p=0.09, CI -0.121 to 1.555) following team sport, whereas CWT was likely beneficial for recovery at 1 h (CMJ: p=0.07, CI -0.004 to 0.863) and at 48 h (fatigue: p=0.04, CI 0.013 to 0.942) following team sport. Athletes' perceptions of muscle soreness and fatigue are enhanced with CWI and/or CWT; however, even though CWI and CWT were beneficial in attenuating decrements in neuromuscular performance 24 hours following team sport, indications are that those benefits were no longer evident 48 hours following team sport.
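
A minimal sketch of the pooling step with made-up effect sizes: DerSimonian-Laird between-study variance, I², and a random-effects pooled SMD with its 95% CI.

```python
import numpy as np

smd = np.array([0.35, 0.10, 0.52, 0.28, 0.41])  # per-study standardized mean differences
var = np.array([0.04, 0.06, 0.05, 0.03, 0.07])  # per-study sampling variances

w = 1 / var                                     # fixed-effect weights
theta_fe = np.sum(w * smd) / w.sum()
q = np.sum(w * (smd - theta_fe) ** 2)           # Cochran's Q
df = len(smd) - 1
tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w**2) / w.sum()))  # DL estimator
i2 = max(0.0, (q - df) / q) * 100               # I² heterogeneity, %

w_re = 1 / (var + tau2)                         # random-effects weights
pooled = np.sum(w_re * smd) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"pooled SMD {pooled:.3f}, 95% CI {pooled - 1.96*se:.3f} "
      f"to {pooled + 1.96*se:.3f}, I2 = {i2:.0f}%")
```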

Keywords: cold water immersion, contrast water therapy, recovery, team sport

Procedia PDF Downloads 492
9948 Microencapsulation of Phenobarbital by Ethyl Cellulose Matrix

Authors: S. Bouameur, S. Chirani

Abstract:

The aim of this study was to evaluate the potential use of ethyl cellulose in the preparation of microspheres as a drug delivery system for sustained release of phenobarbital. The microspheres were prepared by the solvent evaporation technique using ethyl cellulose as the polymer matrix at a ratio of 1:2, dichloromethane as the solvent and polyvinyl alcohol 1% as the processing medium to solidify the microspheres. Size, shape, drug loading capacity and entrapment efficiency were studied.
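
The two reported metrics follow from standard definitions; a minimal sketch with illustrative numbers (not the study's data):

```python
drug_added_mg = 100.0      # phenobarbital weighed into the batch (assumed)
drug_entrapped_mg = 72.5   # drug assayed inside recovered microspheres (assumed)
microspheres_mg = 290.0    # total mass of recovered microspheres (assumed)

drug_loading_pct = 100 * drug_entrapped_mg / microspheres_mg
entrapment_eff_pct = 100 * drug_entrapped_mg / drug_added_mg
print(f"drug loading = {drug_loading_pct:.1f}%, "
      f"entrapment efficiency = {entrapment_eff_pct:.1f}%")
```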

Keywords: phenobarbital, microspheres, ethylcellulose, polyvinyl alcohol

Procedia PDF Downloads 352
9947 Hand Detection and Recognition for Malay Sign Language

Authors: Mohd Noah A. Rahman, Afzaal H. Seyal, Norhafilah Bara

Abstract:

Developing software applications that interface with computers and peripheral devices through human body gestures, such as hand movements, keeps growing in interest. Hand gesture detection and recognition based on computer vision techniques remains a very challenging task. The aim is to provide a more natural, innovative and sophisticated way of non-verbal communication, such as sign language, in human-computer interaction. This paper explores hand detection and hand gesture recognition using a vision-based approach. For hand detection and recognition, skin color spaces such as HSV and YCrCb are applied. However, there are limitations that need to be considered: almost all skin color space models are sensitive to quickly changing or mixed lighting circumstances, and certain conditions are required for the hand recognition to give better results, such as the distance of the user's hand from the webcam and the posture and size of the hand.
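
A minimal sketch of two-color-space skin segmentation with OpenCV; the threshold values are common literature defaults, not the authors' tuned ranges.

```python
import cv2
import numpy as np

def skin_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Keep pixels that look like skin in both HSV and YCrCb spaces."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask_hsv = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask_ycrcb = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
    both = cv2.bitwise_and(mask_hsv, mask_ycrcb)
    return cv2.morphologyEx(both, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # stand-in webcam frame
mask = skin_mask(frame)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    print(f"hand candidate at x={x}, y={y}, w={w}, h={h}")
```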

Keywords: hand detection, hand gesture, hand recognition, sign language

Procedia PDF Downloads 289
9946 Recent Developments in the Application of Deep Learning to Stock Market Prediction

Authors: Shraddha Jain Sharma, Ratnalata Gupta

Abstract:

Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy compared with traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.

Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume

Procedia PDF Downloads 72
9945 Spectral Mapping of Hydrothermal Alteration Minerals for Geothermal Exploration Using Advanced Spaceborne Thermal Emission and Reflection Radiometer Short Wave Infrared Data

Authors: Aliyu J. Abubakar, Mazlan Hashim, Amin B. Pour

Abstract:

Exploiting geothermal resources for power, home heating, spas, greenhouses, industry or tourism requires an initial identification of suitable areas. This can be done cost-effectively using remote sensing satellite imagery, which has the synoptic capability of covering large areas in real time, by identifying possible areas of hydrothermal alteration and minerals related to geothermal systems. Earth features and minerals are known to have unique diagnostic spectral reflectance characteristics that can be used to discriminate them. The focus of this paper is to investigate the applicability of mapping hydrothermal alteration in relation to geothermal systems (thermal springs) at Yankari Park, northeastern Nigeria, using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) satellite data for resource exploration. The ASTER Short Wave Infrared (SWIR) bands are used to highlight and discriminate alteration areas by employing sophisticated digital image processing techniques, including image transformations and spectral mapping methods. Field verification was conducted at Yankari Park using a handheld Monterra Global Positioning System (GPS) unit to identify locations of hydrothermal alteration, and rock samples were obtained in the vicinity and surrounding areas of the ‘Mawulgo’ and ‘Wikki’ thermal springs. X-Ray Diffraction (XRD) results of rock samples obtained from the field validated the hydrothermal alteration through the presence of indicator minerals including dickite, kaolinite, hematite and quartz. The study indicated the applicability of mapping geothermal anomalies for resource exploration in an unmapped, sparsely vegetated savanna environment characterized by subtle surface manifestations such as thermal springs. The results could have implications for geothermal resource exploration, especially at the prefeasibility stage, by narrowing targets for comprehensive surveys and in unexplored savanna regions where expensive airborne surveys are unaffordable.
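
A minimal sketch of one such image transformation, a SWIR band ratio; the band pairing is an assumption for illustration, not the paper's exact transform.

```python
import numpy as np

def band_ratio(num_band: np.ndarray, den_band: np.ndarray) -> np.ndarray:
    """Ratio two reflectance bands, masking divide-by-zero pixels."""
    return np.divide(num_band, den_band,
                     out=np.zeros_like(num_band, dtype=float),
                     where=den_band != 0)

b4 = np.random.rand(512, 512)  # stand-ins for calibrated ASTER SWIR bands
b6 = np.random.rand(512, 512)

ratio = band_ratio(b4, b6)             # higher values suggest Al-OH alteration minerals
threshold = np.percentile(ratio, 98)   # keep the strongest 2% of pixels
alteration_mask = ratio > threshold
print(f"{alteration_mask.sum()} candidate alteration pixels")
```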

Keywords: geothermal exploration, image enhancement, minerals, spectral mapping

Procedia PDF Downloads 346
9944 Efficacy of Cognitive Rehabilitation Therapy on Poststroke Depression among Survivors of Stroke: A Systematic Review

Authors: Zahra Hassani

Abstract:

Background and Purpose: Poststroke depression (PSD) is one of the complications of stroke; it reduces the patient's chance of recovery and causes irritability and personality changes. Cognitive rehabilitation is one of the non-pharmacological methods that improve deficits such as attention and memory as well as symptoms of depression. Therefore, the purpose of the present study is to evaluate the efficacy of cognitive rehabilitation therapy on poststroke depression among survivors of stroke. Method: In this study, a systematic review of the databases Google Scholar, PubMed, Science Direct and Elsevier was carried out for the years 2015 to 2019 with the keywords 'cognitive rehabilitation therapy', 'post-stroke' and 'depression'. In this process, studies that examined the efficacy of cognitive rehabilitation therapy on poststroke depression among survivors of stroke were included. Results: Inclusion criteria were full-text availability, interventional study, and non-review articles. There were significant differences among the articles in terms of the indices studied, sample size, method of implementation, and so on. The review of studies showed that cognitive rehabilitation therapy has a significant role in reducing the symptoms of poststroke depression. The use of these interventions is also effective in improving problem-solving skills, memory, and attention and concentration. Conclusion: This study emphasizes the development of efficient and flexible adaptive skills through cognitive processes and their effect on reducing depression in patients after stroke.

Keywords: cognitive therapy, depression, stroke, rehabilitation

Procedia PDF Downloads 111
9943 Predicting Dose Level and Length of Time for Radiation Exposure Using Gene Expression

Authors: Chao Sima, Shanaz Ghandhi, Sally A. Amundson, Michael L. Bittner, David J. Brenner

Abstract:

In a large-scale radiologic emergency, the potentially affected population needs to be triaged efficiently using various biomarkers, since personal dosimeters are not likely to be worn by individuals. It has long been established that radiation injury can be estimated effectively using panels of genetic biomarkers. Furthermore, the rate of radiation, in addition to the dose of radiation, plays a major role in determining biological responses. Therefore, a better and more accurate triage involves estimating both the dose level of the exposure and the length of time of that exposure. To that end, a large in vivo study was carried out on mice with the internal emitter caesium-137 (¹³⁷Cs). Four different injection doses of ¹³⁷Cs were used: 157.5 μCi, 191 μCi, 214.5 μCi, and 259 μCi. Cohorts of 6-7 mice from the control arm and each of the dose levels were sacrificed, and blood was collected 2, 3, 5, 7 and 14 days after injection for microarray RNA gene expression analysis. Using a generalized linear model with penalized maximum likelihood, a panel of 244 genes was established, and both the injection doses and the number of days after injection were accurately predicted for all 155 subjects using this panel. This demonstrates that microarray gene expression can be used effectively in radiation biodosimetry to predict both dose levels and length of exposure, which provides a more holistic view of radiation exposure and helps improve radiation damage assessment and treatment.
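
A minimal sketch of penalized-likelihood panel selection on a synthetic expression matrix (an L1-penalized multinomial model standing in for the paper's penalized GLM):

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(3)
n_mice, n_genes = 155, 500
X = rng.standard_normal((n_mice, n_genes))  # stand-in expression values
dose = rng.integers(0, 5, n_mice)           # 0 = control, 1-4 = dose levels

# The L1 penalty drives most gene coefficients to zero, leaving a compact panel.
clf = LogisticRegressionCV(penalty="l1", solver="saga", Cs=5,
                           max_iter=2000).fit(X, dose)
panel = np.flatnonzero(np.any(clf.coef_ != 0, axis=0))
print(f"selected {panel.size} genes for dose prediction")
```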

Keywords: caesium-137, gene expression microarray, multivariate responses prediction, radiation biodosimetry

Procedia PDF Downloads 184
9942 On Cloud Computing: A Review of the Features

Authors: Assem Abdel Hamed Mousa

Abstract:

The Internet of Things probably already influences your life, and if it doesn't, it soon will, say computer scientists. Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by many people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives. Alan Kay of Apple calls this "Third Paradigm" computing. Ubiquitous computing is essentially the term for human interaction with computers in virtually everything. Ubiquitous computing is roughly the opposite of virtual reality: where virtual reality puts people inside a computer-generated world, ubiquitous computing forces the computer to live out here in the world with people. Virtual reality is primarily a horsepower problem; ubiquitous computing is a very difficult integration of human factors, computer science, engineering, and social sciences. The approach: activate the world. Provide hundreds of wireless computing devices per person per office, of all scales (from 1" displays to wall-sized). This has required new work in operating systems, user interfaces, networks, wireless, displays, and many other areas. We call this work "ubiquitous computing". This is different from PDAs, Dynabooks, or information at your fingertips; it is invisible, everywhere computing that does not live on a personal device of any sort but is in the woodwork everywhere. The initial incarnation of ubiquitous computing was in the form of "tabs", "pads", and "boards" built at Xerox PARC between 1988 and 1994; several papers describe this work, and there are web pages for the Tabs and for the Boards (which are now a commercial product). Ubiquitous computing will drastically reduce the cost of digital devices and tasks for the average consumer. With labor-intensive components such as processors and hard drives stored in the remote data centers powering the cloud, and with pooled resources giving individual consumers the benefits of economies of scale, consumers may pay monthly fees similar to a cable bill for services that feed into their phones.

Keywords: internet, cloud computing, ubiquitous computing, big data

Procedia PDF Downloads 369
9941 An Ensemble System of Classifiers for Computer-Aided Volcano Monitoring

Authors: Flavio Cannavo

Abstract:

Continuous evaluation of the status of potentially hazardous volcanoes plays a key role for civil protection purposes. The importance of monitoring volcanic activity, especially for energetic paroxysms that usually come with tephra emissions, is crucial not only for exposure of the local population but also for airline traffic. Presently, real-time surveillance of most volcanoes worldwide is essentially delegated to one or more human experts in volcanology, who interpret data coming from different kinds of monitoring networks. Unfortunately, the high nonlinearity of the complex and coupled volcanic dynamics leads to a large variety of different volcanic behaviors. Moreover, continuously measured parameters (e.g. seismic, deformation, infrasonic and geochemical signals) are often unable to fully explain the ongoing phenomenon, thus making fast volcano state assessment a very puzzling task for the personnel on duty in the control rooms. With the aim of aiding the personnel on duty in volcano surveillance, we introduce a system based on an ensemble of data-driven classifiers to infer the ongoing volcano status automatically from all the available kinds of measurements. The system consists of a heterogeneous set of independent classifiers, each one built with its own data and algorithm. Each classifier gives an output about the volcanic status. The ensemble technique weights the single classifier outputs to combine all the classifications into a single status that maximizes performance. We tested the model on the Mt. Etna (Italy) case study by considering a long record of multivariate data from 2011 to 2015 and cross-validated it. Results indicate that the proposed model is effective and of great power for decision-making purposes.
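
A minimal sketch of the weighted-ensemble idea with synthetic monitoring features; the member classifiers and weights are placeholders, not the operational Etna system.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.standard_normal((500, 12))  # stand-ins: seismic, deformation, geochemical features
y = rng.integers(0, 3, 500)         # status labels: 0 quiet, 1 unrest, 2 paroxysm

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier()),
                ("lr", LogisticRegression(max_iter=1000)),
                ("svm", SVC(probability=True))],
    voting="soft",
    weights=[2, 1, 1],  # per-classifier weights, tuned on validation data in practice
).fit(X, y)
print(ensemble.predict(X[:5]))
```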

Keywords: Bayesian networks, expert system, mount Etna, volcano monitoring

Procedia PDF Downloads 231
9940 Preserving Urban Cultural Heritage with Deep Learning: Color Planning for Japanese Merchant Towns

Authors: Dongqi Li, Yunjia Huang, Tomo Inoue, Kohei Inoue

Abstract:

With urbanization, urban cultural heritage is facing the impact and destruction of modernization. Many historical areas are losing their historical information and regional cultural characteristics, so it is necessary to carry out systematic color planning for historical areas in conservation. As an early adopter of urban color planning, Japan has a systematic approach to it. Hence, this paper selects five merchant towns from the category of important traditional building preservation areas in Japan as the subject of this study, to explore the color structure and emotion of this type of historic area. First, an image semantic segmentation method identifies the buildings, roads, and landscape environments. Their color data were extracted for color composition and emotion analysis to summarize their common features. Second, Internet evaluations were collected and processed by natural language processing for keyword extraction. The correlation analysis of the color structure and keywords provides a valuable reference for conservation decisions for these historic town areas. This paper also combines the color structure and Internet evaluation results with generative adversarial networks to generate predicted images of color structure improvements and color improvement schemes. The methods and conclusions of this paper can provide new ideas for the digital management of environmental colors in historic districts and a valuable reference for the inheritance of local traditional culture.
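
A minimal sketch of the color-extraction step that follows segmentation: cluster the masked building pixels into a small palette (image, mask, and cluster count are stand-ins).

```python
import numpy as np
from sklearn.cluster import KMeans

image = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # stand-in street photo
building_mask = np.random.rand(480, 640) > 0.5                    # stand-in segmentation output

pixels = image[building_mask].astype(float)  # (n_pixels, 3) RGB values
kmeans = KMeans(n_clusters=6, n_init=10).fit(pixels)
palette = kmeans.cluster_centers_.astype(int)
shares = np.bincount(kmeans.labels_) / len(kmeans.labels_)
for rgb, share in zip(palette, shares):
    print(f"color {tuple(rgb)} covers {share:.1%} of building pixels")
```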

Keywords: historic districts, color planning, semantic segmentation, natural language processing

Procedia PDF Downloads 65
9939 Long Time Oxidation Behavior of Machined 316 Austenitic Stainless Steel in Primary Water Reactor

Authors: Siyang Wang, Yujin Hu, Xuelin Wang, Wenqian Zhang

Abstract:

Austenitic stainless steels are widely used in the nuclear industry to manufacture critical components, owing to their excellent corrosion resistance at high temperatures. Almost all components used in nuclear power plants are produced with surface finishing (surface cold work) such as milling, grinding and so on. The change of surface state induced by machining has a great influence on corrosion behavior. In the present study, the long-term oxidation behavior of machined 316 austenitic stainless steel exposed to a simulated pressurized water reactor environment was investigated, considering different surface states. Four surface finishes were produced by electro-polishing (P), grinding (G), and two milling processes (M and M1), respectively. Before oxidation, the surface Vickers micro-hardness and surface roughness of each type of sample were measured. The corrosion behavior of the four types of sample was studied using the oxidation weight-gain method over six oxidation periods of 120 h, 216 h, 336 h, 504 h, 672 h and 1344 h, respectively. SEM was used to observe the surface morphology of the oxide film over several periods. The results showed that the oxide film on austenitic stainless steel has a duplex-layer structure: the inner oxide film is continuous and compact, while the outer layer is composed of oxide particles. The oxide particles consisted of large particles (nearly micron size) and small particles (dozens to a few hundred nanometers). The formation of oxide particles could be significantly affected by the machined surface state. The large particles on the cold-worked samples (ground and milled) appeared earlier than on the electro-polished one, and the milled samples had the largest particle size, followed by the ground and electro-polished ones. For the machined samples, the large particles were mostly distributed along the direction of the machining marks. Severe exfoliation was observed on one milled surface (M), which had the most heavily cold-worked layer, while rare local exfoliation occurred on the ground sample (G) and the other milled sample (M1). The electro-polished sample (P) did not exfoliate at all.

Keywords: austenitic stainless steel, oxidation, machining, SEM

Procedia PDF Downloads 272
9938 Deep Reinforcement Learning-Based Computation Offloading for 5G Vehicle-Aware Multi-Access Edge Computing Network

Authors: Ziying Wu, Danfeng Yan

Abstract:

Multi-Access Edge Computing (MEC) is one of the key technologies of the future 5G network. By deploying edge computing centers at the edge of the wireless access network, computation tasks can be offloaded to edge servers rather than a remote cloud server, to meet the requirements of 5G low-latency and high-reliability application scenarios. Meanwhile, with the development of IoV (Internet of Vehicles) technology, various delay-sensitive and compute-intensive in-vehicle applications continue to appear. Compared with traditional internet business, these computation tasks have higher processing priority and lower delay requirements. In this paper, we design a 5G-based Vehicle-Aware Multi-Access Edge Computing Network (VAMECN) and propose a joint optimization problem that minimizes total system cost. In view of this problem, a deep reinforcement learning-based joint computation offloading and task migration optimization (JCOTM) algorithm is proposed, considering the influence of multiple factors such as concurrent computation tasks, the distribution of system computing resources, and network communication bandwidth. The mixed-integer nonlinear programming problem is described as a Markov decision process. Experiments show that our proposed algorithm can effectively reduce task processing delay and equipment energy consumption, optimize the computation offloading and resource allocation schemes, and improve system resource utilization compared with other computation offloading policies.
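
A minimal sketch of the deep Q-network update underlying a JCOTM-style agent, with toy state/action sizes and none of the replay-buffer or exploration machinery a full agent needs.

```python
import torch
import torch.nn as nn

n_state, n_action, gamma = 16, 4, 0.99  # toy sizes; e.g. 4 offloading choices
q_net = nn.Sequential(nn.Linear(n_state, 64), nn.ReLU(), nn.Linear(64, n_action))
target_net = nn.Sequential(nn.Linear(n_state, 64), nn.ReLU(), nn.Linear(64, n_action))
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

def dqn_step(s, a, r, s_next, done):
    """One temporal-difference update; the reward would encode -(delay + energy)."""
    q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r + gamma * target_net(s_next).max(dim=1).values * (1 - done)
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

batch = (torch.randn(32, n_state), torch.randint(0, n_action, (32,)),
         torch.randn(32), torch.randn(32, n_state), torch.zeros(32))
print(dqn_step(*batch))
```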

Keywords: multi-access edge computing, computation offloading, 5th generation, vehicle-aware, deep reinforcement learning, deep q-network

Procedia PDF Downloads 93
9937 Consortium Blockchain-based Model for Data Management Applications in the Healthcare Sector

Authors: Teo Hao Jing, Shane Ho Ken Wae, Lee Jin Yu, Burra Venkata Durga Kumar

Abstract:

Current distributed healthcare systems face the challenge of interoperability of health data. Storing electronic health records (EHRs) in local databases causes them to be fragmented, a problem that is aggravated as patients visit multiple healthcare providers in their lifetime. Existing solutions are unable to solve this issue and have burdened healthcare specialists and patients alike. Blockchain technology was found to be able to increase the interoperability of health data by implementing digital access rules, enabling a unified patient identity, and providing data aggregation. Consortium blockchains were found to have high read throughput, to be more trustworthy and more secure against external disruptions, and to accommodate transactions without fees. Therefore, this paper proposes a blockchain-based model for data management applications. In this model, a consortium blockchain is implemented using delegated proof of stake (DPoS) as its consensus mechanism. This blockchain allows collaboration between users from different organizations such as hospitals and medical bureaus. Patients serve as the owners of their information, and users from other parties require authorization from the patient to view it. Hospitals upload the hash value of patients' generated data to the blockchain, whereas the encrypted information is stored in distributed cloud storage.
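
A minimal sketch of the on-chain/off-chain split described above; the record layout is made up, and the commented-out chain and storage clients are hypothetical names, not a real API.

```python
import hashlib
import json

record = {"patient_id": "P-1024", "visit": "2023-05-14", "diagnosis": "..."}
serialized = json.dumps(record, sort_keys=True).encode("utf-8")

record_hash = hashlib.sha256(serialized).hexdigest()
# chain.submit_transaction(record_hash)        # hypothetical DPoS chain client
# cloud.put(encrypt(serialized, patient_key))  # hypothetical encrypted storage

def verify(fetched_record: bytes, on_chain_hash: str) -> bool:
    """An authorized reader checks a fetched record against the on-chain hash."""
    return hashlib.sha256(fetched_record).hexdigest() == on_chain_hash

print(verify(serialized, record_hash))  # True if the record is untampered
```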

Keywords: blockchain technology, data management applications, healthcare, interoperability, delegated proof of stake

Procedia PDF Downloads 122
9936 EcoTeka, an Open-Source Software for Urban Ecosystem Restoration through Technology

Authors: Manon Frédout, Laëtitia Bucari, Mathias Aloui, Gaëtan Duhamel, Olivier Rovellotti, Javier Blanco

Abstract:

Ecosystems must be resilient to ensure cleaner air, better water and soil quality, and thus healthier citizens. Technology can be an excellent tool to support urban ecosystem restoration projects, especially when based on open source and promoting open data. This is the goal of the ecoTeka application: a single digital tool for tree management that allows decision-makers to improve their urban forestry practices, enabling more responsible urban planning and climate change adaptation. EcoTeka provides city councils with three main functionalities tackling three of their challenges: easier biodiversity inventories, better green space management, and more efficient planning. To answer the cities' need for reliable tree inventories, the application was first built with open data coming from the websites OpenStreetMap and OpenTrees, but it will soon also include the possibility of creating new data. To achieve this, a multi-source algorithm will be elaborated, based on the existing Deep Forest artificial intelligence model, integrating open-source satellite images, 3D representations from LiDAR, and street views from Mapillary. This data processing will permit identifying the position, height, crown diameter, and taxonomic genus of individual trees. To support urban forestry management, ecoTeka offers a dashboard for monitoring the city's tree inventory that triggers alerts about upcoming due interventions. This tool was co-constructed with the green space departments of the French cities of Alès, Marseille, and Rouen. The third functionality of the application is a decision-making tool for urban planning, promoting biodiversity and landscape connectivity metrics to drive ecosystem restoration roadmaps. Based on landscape graph theory, we are currently experimenting with new methodological approaches to scale down regional ecological connectivity principles to local biodiversity conservation and urban planning policies. This methodological framework will couple a graph-theoretic approach with biological data, mainly biodiversity occurrence (presence/absence) data available on international (e.g., GBIF), national (e.g., Système d'Information Nature et Paysage) and local (e.g., Atlas de la Biodiversité Communale) biodiversity data-sharing platforms, in order to support decisions on ecological network conservation and restoration in urban areas. An experiment on this subject is currently ongoing with Montpellier Méditerranée Métropole. These projects and studies have shown that only 26% of tree inventory data is currently geo-localized in France; the rest is still kept on paper or in Excel sheets. Technology is not yet used enough to enrich the knowledge city councils have about biodiversity in their cities, and existing biodiversity open data (e.g., occurrence, telemetry, or genetic data), species distribution models, and landscape-graph connectivity metrics are still underexploited for rational decisions in landscape and urban planning projects. This is the goal of ecoTeka: to support easier inventories of urban biodiversity and better management of urban spaces through rational planning and decisions relying on open databases. Future work will focus on the development of tools for reducing soil artificialization, selecting plant species adapted to climate change, and highlighting the need for ecosystem and biodiversity services in cities.
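
A minimal sketch of a landscape-graph connectivity check of the kind the third functionality builds on; the patches, distances, and dispersal threshold are invented for illustration.

```python
import networkx as nx

# Nodes are habitat patches (area in hectares); edges connect patches whose
# separation is below an assumed dispersal distance for a target species.
patches = {"park_a": 12.0, "park_b": 3.5, "cemetery": 2.0, "riverbank": 8.0}
links = [("park_a", "park_b", 150), ("park_b", "riverbank", 320),
         ("cemetery", "riverbank", 90)]  # pairwise distances in meters (invented)

G = nx.Graph()
G.add_nodes_from((name, {"area": area}) for name, area in patches.items())
G.add_edges_from((u, v) for u, v, d in links if d <= 300)  # 300 m dispersal limit

components = list(nx.connected_components(G))
print(f"{len(components)} habitat clusters:", components)
```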

Keywords: digital software, ecological design of urban landscapes, sustainable urban development, urban ecological corridor, urban forestry, urban planning

Procedia PDF Downloads 57
9935 The Location of Park and Ride Facilities Using the Fuzzy Inference Model

Authors: Anna Lower, Michal Lower, Robert Masztalski, Agnieszka Szumilas

Abstract:

Contemporary cities are facing serious congestion and parking problems. In urban transport policy, the introduction of the park and ride (P&R) system is an increasingly popular way of limiting vehicular traffic. Determining the location of P&R facilities is a key aspect of the system. Criteria for assessing the quality of a selected location are usually formulated generally and descriptively, and research outsourced to specialists is expensive and time-consuming, so the focus is mostly on the examination of a few selected places. Practice has shown that choosing the location of these sites intuitively, without a detailed analysis of all the circumstances, often gives negative results: the facilities built are then not used as expected. Location methods as a research topic are also widely discussed in the scientific literature. The mathematical models built often do not treat the problem comprehensively, e.g., assuming that the city is linear and developed along one important transport corridor. This paper presents a new method in which expert knowledge is applied to a fuzzy inference model. With such a system, even less experienced people, e.g., urban planners and officials, can benefit from it. The analysis result is obtained in a very short time, so a large number of proposed locations can be verified quickly. The proposed method is intended for testing car park locations in a city. The paper shows selected examples of locations of P&R facilities in cities planning to introduce the P&R system. The analysis of existing facilities is also shown and confronted with the opinions of system users, with particular emphasis on unpopular locations. The research was executed using the fuzzy inference model built and described in more detail in the authors' earlier paper. The results of the analyses are compared to documents on P&R facility locations commissioned by the city and to opinions of existing facility users expressed on social networking sites. The study of existing facilities was conducted by means of the fuzzy model, and the results are consistent with actual user feedback. The proposed method proves to be good while not requiring the involvement of a large team of experts or large financial contributions for complicated research. The method also provides an opportunity to examine alternative locations of P&R facilities. The studies performed confirm the method. It can be applied in urban planning of P&R facility locations in relation to the accompanying functions. Although the results of the method are approximate, they are no worse than the results of analyses by employed experts. The advantage of this method is its ease of use, which simplifies professional expert analysis. The ability to analyze a large number of alternative locations gives a broader view of the problem. It is valuable that the arduous analysis by a team of people can be replaced by the model's calculation. According to the authors, the proposed method is also suitable for implementation on a GIS platform.
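
A minimal sketch of Mamdani-style fuzzy scoring of a candidate P&R site; the inputs, membership functions, and rules are invented for illustration and are far simpler than the authors' model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return float(np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                       (c - x) / (c - b + 1e-9)), 0.0))

def score_site(dist_to_transit_m: float, traffic_load: float) -> float:
    near = tri(dist_to_transit_m, 0, 0, 400)    # "close to a transit stop"
    far = tri(dist_to_transit_m, 300, 1000, 1000)
    heavy = tri(traffic_load, 0.5, 1.0, 1.0)    # normalized corridor congestion
    light = tri(traffic_load, 0.0, 0.0, 0.6)

    # Rules: near AND heavy -> good site; far OR light -> poor site.
    good = min(near, heavy)
    poor = max(far, light)

    # Defuzzify by weighting the "good" (1.0) and "poor" (0.0) outputs.
    return good / (good + poor + 1e-9)

print(score_site(dist_to_transit_m=250, traffic_load=0.8))  # high score expected
```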

Keywords: fuzzy logic inference, park and ride system, P&R facilities, P&R location

Procedia PDF Downloads 315
9934 A Brief History of Kampo Extract Formulations for Prescription in Japan

Authors: Kazunari Ozaki, Mitsuru Kageyama, Kenki Miyazawa, Yoshio Nakamura

Abstract:

Background: Kampo (traditional Japanese medicine) is a medicine traditionally practiced in Japan, based on ancient Chinese medicine. Most Kampo doctors have used decoctions of crude drug pieces for treatment. Nowadays, 93% of the Kampo drugs sold in Japan are Kampo products. Of all Kampo products, 81% are Kampo extract formulations for prescription, which are prepared in powdered or granulated form from medicinal crude drug extracts mixed with appropriate excipients. Physicians licensed to practice Western medicine prescribe these Kampo extract formulations for prescription in Japan. Objectives: Our study aims to present a brief history of Kampo extract formulations for prescription in Japan. Methods: Systematic searches for relevant studies were conducted using not only printed journals but also electronic journals from bibliographic databases such as PubMed/Medline and Ichushi-Web, university/institutional websites, and search engines such as Google and Google Scholar. Results: The first commercialization of Kampo extract formulations for general use (OTC (over-the-counter) Kampo extract formulations) was achieved after 1957. The number of drugs subsequently increased, reaching the current 148 Kampo extract formulations for prescription. Conclusion: We provide a history of Kampo extract formulations for prescription in Japan. The originality of this research is that it analyzes the background history of Kampo in parallel with relevant transitions in the government and insurance systems.

Keywords: health insurance system, history, Kampo, Kampo extract formulation for prescription, OTC Kampo extract formulation, pattern corresponding prescription (Ho-sho-so-tai) system

Procedia PDF Downloads 269
9933 The Views of Teachers, Students and Parents on the FATIH Project

Authors: Şemsettin Şahin, Ahmet Oğuz Aktürk, İsmail Çelik

Abstract:

This study investigated the views of teachers, students and students' parents on the FATIH (Movement of Enhancing Opportunities and Improving Technology) Project, which was put into service by the Ministry of National Education in cooperation with the Ministry of Transportation in Turkey in November 2010 for the purpose of increasing students' success and was planned to be completed within 5 years. The study group consisted of teachers employed in a pilot school within the scope of the FATIH Project in the province of Karaman in central Turkey, students attending this school, and parents whose children are students at that school. The research data were collected through forms developed by the researchers to determine the views of teachers, students and students' parents on the FATIH Project. The descriptive analysis method, one of the qualitative research methods, was used in the study. An analysis of the data revealed that a large majority of the teachers and students believed that if computers were used to serve their set purpose, they could make considerable contributions to education. A large majority of the students' parents, on the other hand, regard the use of computers in education as a great opportunity for the students. The views of the teachers, students and students' parents on the FATIH Project usually overlap. Most of the participants pointed out that the FATIH Project was intended to use technology effectively in education. Moreover, each participant described their role in the FATIH Project in accordance with their relative position and stated that they could perform whatever was expected of them for the effective and efficient use and progress of the project. The views of the participants regarding the FATIH Project vary according to the type of participant.

Keywords: education, FATIH project, technology, students

Procedia PDF Downloads 426
9932 The Influence of Knowledge Spillovers on High-Impact Firm Growth: A Comparison of Indigenous and Foreign Firms

Authors: Yazid Abdullahi Abubakar, Jay Mitra

Abstract:

This paper is concerned with entrepreneurial high-impact firms: firms that generate both disproportionate levels of employment and sales growth and have high levels of innovative activity. It investigates differences in the factors influencing high-impact growth between indigenous and foreign firms. The study is based on an analysis of data from the United Kingdom (UK) Innovation Scoreboard on 865 firms, which were divided into high-impact firms (those achieving positive growth in both sales and employment) and low-impact firms (negative or no growth in sales or employment), in order to identify the critical differences in the regional, sectoral and size-related factors that facilitate knowledge spillovers and high-impact growth between indigenous and foreign firms. The findings suggest that: 1) firms' access to regional knowledge spillovers (from businesses and higher education institutions) is more significantly associated with the high-impact growth of UK firms than of foreign firms; 2) because high-tech sectors make greater use of knowledge spillovers than low-tech sectors, high-tech sectors are more associated with high-impact growth, but the relationship is stronger for UK firms than for foreign firms; 3) because small firms have a greater need for knowledge spillovers than large firms, there is a negative relationship between firm size and high-impact growth, but the negative relationship is greater for UK firms than for foreign firms.

Keywords: entrepreneurship, high-growth, indigenous firms, foreign firms, small firms, large firms

Procedia PDF Downloads 414