Search results for: degree-days methods
13576 A Mechanical Diagnosis Method Based on Vibration Fault Signal Down-Sampling and the Improved One-Dimensional Convolutional Neural Network
Authors: Bowei Yuan, Shi Li, Liuyang Song, Huaqing Wang, Lingli Cui
Abstract:
Convolutional neural networks (CNN) have received extensive attention in the field of fault diagnosis, and many fault diagnosis methods use CNNs for fault type identification. However, when the amount of raw data collected by sensors is massive, the neural network must perform a time-consuming classification task. In this paper, a mechanical fault diagnosis method based on vibration signal down-sampling and an improved one-dimensional convolutional neural network is proposed. Through robust principal component analysis, the low-rank feature matrix of a large amount of raw data can be separated, and down-sampling is then applied to reduce the subsequent computational load. In the improved one-dimensional CNN, a smaller convolution kernel is used to reduce the number of parameters and the computational complexity, and regularization is introduced before the fully connected layer to prevent overfitting. In addition, the multi-connected layers generalize classification results better without cumbersome parameter adjustments. The effectiveness of the method is verified by monitoring the signal of a centrifugal pump test bench, and the average test accuracy is above 98%. Compared with the traditional deep belief network (DBN) and support vector machine (SVM) methods, this method performs better.
Keywords: fault diagnosis, vibration signal down-sampling, 1D-CNN
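The two pre-processing ideas in this abstract, down-sampling the raw signal and convolving it with a small kernel, can be sketched in a few lines. This is an illustrative NumPy sketch, not the authors' pipeline: the paper separates a low-rank component via robust PCA before down-sampling, whereas plain decimation stands in for that step here, and the 3-tap averaging kernel is an invented example of a "smaller convolution kernel".

```python
import numpy as np

def downsample(signal, factor):
    """Naive decimation: keep every `factor`-th sample.
    (Stands in for the paper's robust-PCA-based down-sampling.)"""
    return signal[::factor]

def conv1d(signal, kernel, stride=1):
    """Valid-mode 1-D convolution, the core operation of a 1D-CNN layer."""
    k = len(kernel)
    out_len = (len(signal) - k) // stride + 1
    return np.array([np.dot(signal[i * stride:i * stride + k], kernel)
                     for i in range(out_len)])

# toy vibration signal: a 50 Hz tone sampled at 1 kHz
t = np.arange(0, 1, 0.001)
x = np.sin(2 * np.pi * 50 * t)
x_small = downsample(x, 4)             # 1000 -> 250 samples to process
feat = conv1d(x_small, np.ones(3) / 3)  # small 3-tap kernel, few parameters
```

A small kernel keeps the parameter count and the per-layer cost low, which is the trade-off the abstract describes.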
Procedia PDF Downloads 133
13575 On the Absence of BLAD, CVM, DUMPS and BC Autosomal Recessive Mutations in Stud Bulls of the Local Alatau Cattle Breed of the Republic of Kazakhstan
Authors: Yessengali Ussenbekov, Valery Terletskiy, Orik Zhanserkenova, Shynar Kasymbekova, Indira Beyshova, Aitkali Imanbayev, Almas Serikov
Abstract:
Currently, there are 46 hereditary diseases afflicting cattle for which molecular genetic diagnostic methods have been developed. Genetic anomalies frequently occur in Holstein cattle from American and Canadian bloodlines. The data on the incidence of the BLAD, CVM, DUMPS and BC autosomal recessive lethal mutations in pedigree animals are discordant: the detrimental allele incidence rates are high for the Holstein breed, whereas in some breeds these mutations occur at low rates or are completely absent. Data were obtained from the frozen semen of stud bulls. DNA was extracted from the semen with the DNA-Sorb-B extraction kit. Lethal mutations in the CD18, SLC35A3, UMP and ASS genes of Alatau stud bulls (N=124) were detected by polymerase chain reaction and RFLP analysis. It was established that stud bulls of the local Alatau breed were not carriers of the BLAD, CVM, DUMPS, and BC detrimental mutations. However, with a view to preventing the dissemination of hereditary diseases, it is recommended to monitor the pedigree stock using molecular genetic methods.
Keywords: PCR, autosomal recessive point mutation, BLAD, CVM, DUMPS, BC, stud bulls
Procedia PDF Downloads 444
13574 Identification and Classification of Fiber-Fortified Semolina by Near-Infrared Spectroscopy (NIR)
Authors: Amanda T. Badaró, Douglas F. Barbin, Sofia T. Garcia, Maria Teresa P. S. Clerici, Amanda R. Ferreira
Abstract:
Food fortification is the intentional addition of a nutrient to a food matrix and has been widely used to overcome a lack of nutrients in the diet or to increase the nutritional value of food. Fortified food must meet the demands of the population, taking into account their habits and the risks that these foods may pose. Wheat and its by-products, such as semolina, have been strongly indicated as food vehicles since they are widely consumed and used in the production of other foods. These products have been strategically used to add nutrients such as fibers. Methods of analysis and quantification of these components are destructive and require lengthy sample preparation and analysis. Therefore, the industry has searched for faster and less invasive methods, such as near-infrared spectroscopy (NIR). NIR is a rapid and cost-effective method; however, it is based on indirect measurements and yields large amounts of data. NIR spectroscopy therefore requires calibration with mathematical and statistical tools (chemometrics), such as principal component analysis (PCA) and linear discriminant analysis (LDA), to extract analytical information from the corresponding spectra. PCA is well suited to NIR since it can handle many spectra at a time and can be used for non-supervised classification. An advantage of PCA, which is also a data reduction technique, is that it reduces the spectral data to a smaller number of latent variables for further interpretation. On the other hand, LDA is a supervised method that searches for the canonical variables (CV) with the maximum separation among different categories. In LDA, the first CV is the direction of the maximum ratio between inter- and intra-class variances. The present work used a portable infrared spectrometer (NIR) for the identification and classification of pure and fiber-fortified semolina samples.
The fiber was added to semolina in two different concentrations, and after spectra acquisition, the data were used for PCA and LDA to identify and discriminate the samples. The results showed that NIR spectroscopy associated with PCA was very effective in identifying pure and fiber-fortified semolina. Additionally, the classification accuracy of the samples using LDA was between 78.3% and 95% for calibration and between 75% and 95% for cross-validation. Thus, after multivariate analyses such as PCA and LDA, it was possible to verify that NIR combined with chemometric methods is able to identify and classify the different samples in a fast and non-destructive way.
Keywords: chemometrics, fiber, linear discriminant analysis, near-infrared spectroscopy, principal component analysis, semolina
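The PCA step described above can be sketched with nothing more than an SVD on mean-centred data. This is a minimal illustration on synthetic "spectra" (the group sizes, noise level, and fortification direction are invented), not the authors' calibration; a real workflow would add LDA on the PCA scores and proper validation splits.

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD on mean-centred data; returns scores and loadings."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    return scores, Vt[:n_components]

# synthetic "spectra": two groups (pure vs fibre-fortified) differing
# mainly along one spectral direction, plus measurement noise
rng = np.random.default_rng(0)
direction = rng.normal(size=100)
pure = rng.normal(scale=0.1, size=(20, 100))
fortified = rng.normal(scale=0.1, size=(20, 100)) + 0.8 * direction
X = np.vstack([pure, fortified])
scores, loadings = pca(X, 2)

# the first principal component should separate the two classes
pc1 = scores[:, 0]
sep = abs(pc1[:20].mean() - pc1[20:].mean())
```

Because the class difference dominates the variance, the first latent variable alone separates pure from fortified samples, which is why PCA works as an unsupervised screening step before LDA.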
Procedia PDF Downloads 214
13573 An Investigation to Study the Moisture Dependency of Ground Enhancement Compound
Authors: Arunima Shukla, Vikas Almadi, Devesh Jaiswal, Sunil Saini, Bhusan S. Patil
Abstract:
Lightning protection consists of three main parts: the air termination system, the down conductor, and the earth termination system. The earth termination system is the most important part, as the earth is the sink and source of charges. Even when the charges are captured and delivered to the ground, if the earth termination system does not provide them an easy path, problems arise. Soil resistivities differ significantly, ranging from 10 Ωm for wet organic soil to 10000 Ωm for bedrock. Different methods have been discussed and used conventionally, such as the deep-ground-well method and altering the length of the rod; those methods are not considered economical. Therefore, it was a general practice to use charcoal along with salt to reduce the soil resistivity. Bentonite is a worldwide accepted material, which directed our interest towards the study of bentonite at first. It was concluded that bentonite is a clay that is non-corrosive and environmentally friendly. However, bentonite is suitable only when moisture is present in the soil; in the absence of moisture, cracks appear on the surface, providing an open passage to the air and resulting in an increase in resistivity. Furthermore, bentonite without moisture does not have sufficient bonding property, moisture retention, conductivity, and non-leachability. Therefore, bentonite was used along with another backfill material to overcome its dependency on moisture. Different experiments were performed to find the best ratio of bentonite to carbon backfill. It was concluded that the properties highly depend on the quantities of bentonite and carbon-based backfill material.
Keywords: backfill material, bentonite, grounding material, low resistivity
Procedia PDF Downloads 147
13572 Formal Implementation of Routing Information Protocol Using Event-B
Authors: Jawid Ahmad Baktash, Tadashi Shiroma, Tomokazu Nagata, Yuji Taniguchi, Morikazu Nakamura
Abstract:
The goal of this paper is to explore the use of formal methods for dynamic routing. The purpose of network communication with dynamic routing is to send a message from one node to others using specific protocols. In dynamic routing, connections are established based on distance vector protocols (Routing Information Protocol, Border Gateway Protocol), link state protocols (Open Shortest Path First, Intermediate System to Intermediate System), and hybrid protocols (Enhanced Interior Gateway Routing Protocol). The responsibility for proper verification becomes crucial with dynamic routing. Formal methods can play an essential role in routing, in the development of networks, and in the testing of distributed systems. Event-B is a formal technique that consists of describing the problem rigorously, introducing solutions or details in refinement steps to obtain a more concrete specification, and verifying that the proposed solutions are correct. The system is modeled in terms of an abstract state space using variables with set-theoretic types and events that modify the state variables. Event-B is a variant of B that was designed for developing distributed systems. In Event-B, events consist of guarded actions occurring spontaneously rather than being invoked. The invariant state properties must be satisfied by the variables and maintained by the activation of the events.
Keywords: dynamic routing, RIP, formal method, Event-B, ProB
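The distance-vector behaviour that RIP specifies, and that an Event-B model would capture as a guarded event, is a simple Bellman-Ford-style update. The sketch below is an illustrative Python simulation of one update round, not the paper's Event-B model; the node names and link costs are invented.

```python
INF = 16  # RIP treats a metric of 16 as unreachable

def dv_update(my_table, neighbour_tables, link_cost):
    """One round of RIP-style distance-vector updates: for every
    destination, take the minimum over neighbours of
    (cost to neighbour + neighbour's advertised distance)."""
    new = dict(my_table)
    for nbr, table in neighbour_tables.items():
        for dest, dist in table.items():
            cand = min(link_cost[nbr] + dist, INF)
            if cand < new.get(dest, INF):
                new[dest] = cand
    return new

# node A with neighbours B and C, both one hop away
table_a = {'A': 0}
tables = {'B': {'B': 0, 'D': 1}, 'C': {'C': 0, 'D': 4}}
costs = {'B': 1, 'C': 1}
table_a = dv_update(table_a, tables, costs)
# A now reaches D via B at cost 2 rather than via C at cost 5
```

In an Event-B refinement, the body of `dv_update` would become a guarded event whose invariant states, for example, that every stored distance is at most `INF` and never increases spontaneously.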
Procedia PDF Downloads 403
13571 Influence of Thermal Treatments on Ovomucoid as Allergenic Protein
Authors: Nasser A. Al-Shabib
Abstract:
Food allergens are most commonly encountered by the immune system in a non-native form. Most food proteins undergo various treatments (e.g. thermal or proteolytic processing) during food manufacturing. Such treatments have the potential to alter the chemical structure of food allergens, converting them to more denatured or unfolded forms. The conformational changes in the proteins may affect the allergenicity of the treated allergens. However, most allergenic proteins possess high resistance to thermal modification and digestive enzymes. In the present study, ovomucoid (a major allergenic protein of egg white) was heated in phosphate-buffered saline (pH 7.4) at different temperatures, in aqueous solutions and on different surfaces for various times. The results indicated that different antibody-based methods had different sensitivities in detecting the heated ovomucoid. When using one particular immunoassay, the immunoreactivity of ovomucoid increased rapidly after heating in water, whereas immunoreactivity declined after heating in alkaline buffer (pH 10). Ovomucoid appeared more immunoreactive when dissolved in PBS (pH 7.4) and heated on a stainless steel surface. To the best of our knowledge, this is the first time that antibody-based methods have been applied for the detection of ovomucoid adsorbed onto different surfaces under various conditions. The results obtained suggest that the use of antibodies to detect ovomucoid after food processing may be problematic. False assurance will be given by the use of inappropriate, non-validated immunoassays such as those available commercially as 'swab' tests. A greater understanding of antibody-protein interaction after processing of a protein is required.
Keywords: ovomucoid, thermal treatment, solutions, surfaces
Procedia PDF Downloads 449
13570 Towards Developing Social Assessment Tool for Siwan Ecolodge Case Study: Babenshal Ecolodge
Authors: Amr Ali Bayoumi, Ola Ali Bayoumi
Abstract:
The aim of this research is to enhance one of the main aspects (the social aspect) of developing an eco-lodge in the Siwa oasis in the Egyptian Western Desert. According to credible weightings built in this research through formal and informal questionnaires, the social aspect carried the highest priority among the environmental and economic categories. The researcher therefore suggested the use of an ethnographic design approach and space syntax as observational and computational methods for developing future eco-lodges in the Siwa oasis. These methods are used to study the social spaces of the Babenshal eco-lodge as a case study. This hybrid method is considered a beginning for building a social assessment tool (SAT) for ecological tourism buildings located in Siwa, as a case of an Egyptian Western Desert community. Towards livable social spaces, the proposed SAT was planned to provide the optimum measurable weightings for the social priorities of future Siwan eco-lodge(s). Finally, recommendations are proposed for enhancing the SAT so that it is more closely correlated with the sensitive desert biome (Siwa oasis) and adapted to the continuous social and environmental changes of the oasis.
Keywords: ecolodge, social aspect, space syntax, Siwa Oasis
Procedia PDF Downloads 128
13569 Controlled Shock Response Spectrum Test on Spacecraft Subsystem Using Electrodynamic Shaker
Authors: M. Madheswaran, A. R. Prashant, S. Ramakrishna, V. Ramesh Naidu, P. Govindan, P. Aravindakshan
Abstract:
Shock response spectrum (SRS) tests are among the tests conducted on some critical systems of spacecraft as part of environmental testing. SRS tests are conducted to simulate the pyro shocks that occur during launch phases as well as during the deployment of spacecraft appendages. Methods to carry out SRS tests include the pyrotechnic method, the impact hammer method, the drop shock method, and the use of electrodynamic shakers. The pyrotechnic, impact hammer, and drop shock methods are open-loop tests, whereas SRS testing using an electrodynamic shaker is a controlled, closed-loop test. SRS testing using an electrodynamic shaker offers various advantages, such as a simple test set-up, better controllability, and repeatability. However, it is important to devise a proper test methodology so that the safety of neither the electrodynamic shaker nor the test specimen is compromised. This paper discusses the challenges involved in conducting SRS tests, shaker validation, and the necessary precautions to be taken. The approach to choosing various test parameters, such as the synthesis waveform and the spectrum convergence level, is discussed. A case study of an SRS test conducted on an optical payload of an Indian geostationary spacecraft is presented.
Keywords: maxi-max spectrum, SRS (shock response spectrum), SDOF (single degree of freedom), wavelet synthesis
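The maximax SRS named in the keywords is the peak absolute acceleration of a bank of SDOF oscillators driven by the measured base shock. A minimal sketch follows; the half-sine pulse, its 11 ms duration and 100 g amplitude, and the semi-implicit Euler integrator are all illustrative assumptions, not the paper's test setup (industrial SRS software typically uses the Smallwood ramp-invariant filter instead).

```python
import numpy as np

def srs_point(accel, dt, fn, damping=0.05):
    """Peak absolute acceleration of a single-degree-of-freedom
    oscillator of natural frequency `fn` under base excitation
    `accel`: one point of the maximax shock response spectrum.
    Semi-implicit Euler integration; adequate for illustration."""
    wn = 2 * np.pi * fn
    z = zdot = 0.0          # relative displacement and velocity
    peak = 0.0
    for a in accel:
        zddot = -a - 2 * damping * wn * zdot - wn**2 * z
        zdot += zddot * dt
        z += zdot * dt
        peak = max(peak, abs(zddot + a))   # absolute acceleration
    return peak

# 11 ms, 100 g half-sine pulse sampled at 100 kHz, 50 ms record
dt = 1e-5
t = np.arange(0, 0.05, dt)
pulse = np.where(t < 0.011, 100 * np.sin(np.pi * t / 0.011), 0.0)
srs = [srs_point(pulse, dt, fn) for fn in (10, 100, 1000)]
```

The spectrum shows the expected shape: well above the pulse's characteristic frequency the SRS approaches the input peak, near it the response is amplified, and far below it the response is much smaller.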
Procedia PDF Downloads 362
13568 A Tool to Measure Efficiency and Trust Towards eXplainable Artificial Intelligence in Conflict Detection Tasks
Authors: Raphael Tuor, Denis Lalanne
Abstract:
The ATM research community is missing suitable tools to design, test, and validate new UI prototypes. Important stakes underline the implementation of both DSS and XAI methods in current systems. ML-based DSS are gaining relevance as ATFM becomes increasingly complex. However, these systems only prove useful if a human can understand them, and thus new XAI methods are needed. The human-machine dyad should work as a team, with each understanding the other. We present xSky, a configurable benchmark tool that allows us to compare different versions of an ATC interface in conflict detection tasks. Our main contributions to the ATC research community are (1) a conflict detection task simulator (xSky) that allows testing the applicability of visual prototypes on scenarios of varying difficulty while outputting relevant operational metrics, and (2) a theoretical approach to the explanation of AI-driven trajectory predictions. xSky addresses several issues that were identified in available research tools. Researchers can configure the dimensions affecting scenario difficulty with a simple CSV file. Both the content and appearance of the XAI elements can be customized in a few steps. As a proof of concept, we implemented an XAI prototype inspired by the maritime field.
Keywords: air traffic control, air traffic simulation, conflict detection, explainable artificial intelligence, explainability, human-automation collaboration, human factors, information visualization, interpretability, trajectory prediction
Procedia PDF Downloads 160
13567 Schizophrenia in Childhood and Adolescence: Research Topics and Applied Methodology
Authors: Jhonas Geraldo Peixoto Flauzino, Pedro Pompeo Boechat Araujo, Alexia Allis Rocha Lima, Giovanna Biângulo Lacerda Chaves, Victor Ryan Ferrão Chaves
Abstract:
Schizophrenia is characterized as a set of psychiatric signs and symptoms (a syndrome) that commonly erupts in adolescence or early adulthood and is recognized as one of the most serious mental illnesses, as it causes significant problems throughout the patient's life, in mental health, physical health, and social life. Objectives: This is an integrative literature review that aimed to verify the scientific knowledge produced in the field of child and adolescent psychiatry regarding schizophrenia in these stages of life, together with the most discussed themes and the methodologies chosen for the studies. Methods: Articles were selected from the Virtual Health Library and CAPES Journal Portal databases, published in the last five years, and from Google Scholar, published in 2021, totaling 62 works, searched in September 2021. Results: The studies focus mainly on diagnosis through the DSM-V (25.8%), on drug treatment (25.8%), and on psychotherapy (24.2%), most of them in the literature review format: integrative (27.4%) and systematic (24.2%). Conclusion: The themes and study methods are redundant and do not cover in depth the many aspects of schizophrenia in childhood and adolescence, addressing the disease in general terms or focusing on the adult patient.
Keywords: schizophrenia, mental health, childhood, adolescence
Procedia PDF Downloads 187
13566 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System
Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek
Abstract:
This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a 'GIS dominant' approach. The GIS operating tools will be used to operate the SDW. MCDM methods can provide many solutions to a set of problems with various and multiple criteria. When a problem is so complex that it integrates a spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending analyses. We present in this paper an experiment showing a geo-decisional methodology of SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to highlight inductions and information not obvious with traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information. However, its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigidity. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.
Keywords: data warehouse, GIS, MCDM, SOLAP
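The simplest MCDM scheme of the family the abstract refers to is additive weighting: normalise each criterion, invert cost criteria, and rank alternatives by their weighted sum. The sketch below is a generic illustration of that idea, not the paper's method; the candidate sites, criteria, and weights are invented.

```python
import numpy as np

def weighted_sum(scores, weights, benefit):
    """Simple additive-weighting MCDM: min-max normalise each
    criterion, flip cost criteria, then score alternatives by
    the weighted sum."""
    S = np.asarray(scores, dtype=float)
    norm = (S - S.min(axis=0)) / (S.max(axis=0) - S.min(axis=0))
    mask = ~np.asarray(benefit)
    norm[:, mask] = 1 - norm[:, mask]   # lower cost -> higher score
    return norm @ np.asarray(weights)

# three candidate sites; criteria: accessibility (benefit), cost (cost)
scores = [[0.9, 120], [0.6, 80], [0.3, 60]]
totals = weighted_sum(scores, [0.6, 0.4], [True, False])
best = int(np.argmax(totals))
```

In the geo-decisional setting described above, each row would come from spatial attributes queried out of the SDW, and the ranking would feed back into the GIS display layer.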
Procedia PDF Downloads 178
13565 Effect of Heat Treatment on Nutrients, Bioactive Contents and Biological Activities of Red Beet (Beta Vulgaris L.)
Authors: Amessis-Ouchemoukh Nadia, Salhi Rim, Ouchemoukh Salim, Ayad Rabha, Sadou Dyhia, Guenaoui Nawel, Hamouche Sara, Madani Khodir
Abstract:
The cooking method is a key factor influencing the quality of vegetables. In this study, the effect of the most common cooking methods on the nutritional composition, phenolic content, pigment content, and antioxidant activities (evaluated by the DPPH, ABTS, CUPRAC, FRAP, reducing power and phosphomolybdenum methods) of fresh, steamed, and boiled red beet was investigated. The fresh samples showed the highest nutritional and bioactive composition compared to the cooked ones. The boiling method led to a significant reduction (p < 0.05) in the content of phenolics, flavonoids and flavanols and in the DPPH, ABTS, FRAP, CUPRAC, phosphomolybdenum and reducing power capacities. This effect was less pronounced when steam cooking was used, and the losses of bioactive compounds were lower. As a result, steam cooking resulted in greater retention of bioactive compounds and antioxidant activity compared to boiling. Overall, this study suggests that steam cooking is the better method in terms of retention of the pigments, bioactive compounds, and antioxidant activity of beetroot.
Keywords: Beta vulgaris, cooking methods, bioactive compounds, antioxidant activities
Procedia PDF Downloads 63
13564 Intra-miR-ExploreR, a Novel Bioinformatics Platform for Integrated Discovery of MiRNA:mRNA Gene Regulatory Networks
Authors: Surajit Bhattacharya, Daniel Veltri, Atit A. Patel, Daniel N. Cox
Abstract:
miRNAs have emerged as key post-transcriptional regulators of gene expression; however, identification of biologically relevant target genes for this epigenetic regulatory mechanism remains a significant challenge. To address this knowledge gap, we have developed a novel tool in R, Intra-miR-ExploreR, that facilitates integrated discovery of miRNA targets by incorporating target databases and novel target prediction algorithms, using statistical methods including Pearson and distance correlation on microarray data, to arrive at high-confidence intragenic miRNA target predictions. We have explored the efficacy of this tool using Drosophila melanogaster as a model organism for bioinformatics analyses and functional validation. A number of putative targets were obtained and validated using qRT-PCR analysis. Additional features of the tool include downloadable text files containing GO analysis from DAVID and PubMed links to literature related to gene sets. Moreover, we are constructing interaction maps of intragenic miRNAs, using both microarray and RNA-seq data, focusing on neural tissues to uncover regulatory codes via which these molecules regulate gene expression to direct cellular development.
Keywords: miRNA, miRNA:mRNA target prediction, statistical methods, miRNA:mRNA interaction network
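The two statistics the tool relies on, Pearson correlation and Székely's distance correlation, can be written compactly. This is a self-contained NumPy sketch of the statistics only, not the R tool itself; the five-sample expression vectors are invented to mimic a miRNA repressing its target mRNA.

```python
import numpy as np

def pearson(x, y):
    """Classical Pearson correlation coefficient."""
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

def distance_correlation(x, y):
    """Székely's distance correlation: also detects nonlinear
    dependence that Pearson correlation can miss."""
    def centred(v):
        d = np.abs(v[:, None] - v[None, :])
        return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()
    A, B = centred(np.asarray(x, float)), centred(np.asarray(y, float))
    dcov2 = (A * B).mean()
    dvar = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(max(dcov2, 0) / dvar)

# miRNA up, target mRNA down: the strong negative linear association
# expected of a genuine miRNA:mRNA repression pair
mirna = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
mrna = np.array([5.1, 3.9, 3.0, 2.1, 0.9])
r = pearson(mirna, mrna)                 # strongly negative
dc = distance_correlation(mirna, mrna)   # close to 1
```

Note the sign convention: Pearson is negative for a repressive pair, whereas distance correlation is always non-negative and only measures dependence strength.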
Procedia PDF Downloads 513
13563 Mechanism of Veneer Colouring for Production of Multilaminar Veneer from Plantation-Grown Eucalyptus Globulus
Authors: Ngoc Nguyen
Abstract:
A large plantation estate of Eucalyptus globulus has been established and grown to produce pulpwood. This resource is not suitable for the production of decorative products, principally due to low grades of wood and a 'dull' appearance, but many trials have already been undertaken for the production of veneer and veneer-based engineered wood products, such as plywood and laminated veneer lumber (LVL). The manufacture of veneer-based products has recently been identified as an unprecedented opportunity to promote higher-value utilisation of plantation resources. However, many uncertainties remain regarding the impacts of the inferior wood quality of young plantation trees on product recovery and value, and with respect to optimal processing techniques. Moreover, the quality of veneer and veneer-based products is far from optimal, as the trees are young and have small diameters, and the veneers show significant colour variation, which reduces the added value of final products. Developing production methods that enhance the appearance of low-quality veneer would provide great potential for the production of high-value wood products such as furniture, joinery, flooring and other appearance products. One method of enhancing the appearance of low-quality veneer, developed in Italy, involves the production of multilaminar veneer, also named 'reconstructed veneer'. An important stage of multilaminar production is colouring the veneer, which can be achieved by dyeing it with dyes of different colours depending on the type of appearance product, its design, and market demand. Although veneer dyeing technology is well advanced in Italy, it has focused on plantation-grown poplar veneer, whose wood is characterized by low density, even colour, few defects and high permeability. Conversely, the majority of plantation eucalypts have medium to high density, many defects, uneven colour and low permeability.
Therefore, a detailed study is required to develop dyeing methods suitable for colouring eucalypt veneers. A brown reactive dye is used for the veneer colouring process. Veneers from sapwood and heartwood at two moisture content levels are used to conduct the colouring experiments: green veneer and veneer dried to 12% MC. Prior to dyeing, all samples are treated. Both soaking (dipping) and vacuum-pressure methods are used in the study to compare the results and select the most efficient method for veneer dyeing. To date, colour measurements in the CIELAB colour system have shown significant differences in the colour of the undyed veneers produced from the heartwood. The colour became moderately darker with increasing sodium chloride, compared to control samples, according to the colour measurements. It is difficult to recommend a suitable dye solution at this stage, as variables such as dye concentration, dyeing temperature and dyeing time have not yet been examined. The dye will be used with and without UV absorbent once all trials have been completed using optimal parameters.
Keywords: Eucalyptus globulus, veneer colouring/dyeing, multilaminar veneer, reactive dye
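The CIELAB colour measurements mentioned above are typically compared through the CIE76 colour difference, the Euclidean distance in (L*, a*, b*) space. This is a minimal sketch of that formula; the two veneer colours below are invented for illustration, not measured values from the study.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two CIELAB colours:
    Euclidean distance in (L*, a*, b*) space."""
    return math.dist(lab1, lab2)

undyed = (72.0, 5.0, 18.0)   # light, slightly reddish-yellow veneer
dyed = (48.0, 12.0, 25.0)    # darker after dyeing (lower L*)
dE = delta_e_ab(undyed, dyed)
```

A drop in L* is exactly the "moderately darker" shift described above, and a ΔE*ab this large is far beyond the just-noticeable-difference range, so the colour change would be obvious to an observer.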
Procedia PDF Downloads 351
13562 Urban Household Waste Disposal Modes and Their Determinants: Evidence from Bure Town, North-Western Ethiopia
Authors: Mastawal Melese, Yismaw Assefa
Abstract:
This study aims to identify household-level determinants of solid waste disposal (SWD) practices in Bure Town, north-western Ethiopia. Using a cross-sectional design and a mixed-methods approach, data were collected from 238 randomly selected households through structured interviews, focus group discussions, and field observations. Descriptive analysis revealed that 14.7% of households used composting as a primary SWD method, 37.4% practiced open dumping, 25.6% used burning, and 22.3% resorted to burial. Multinomial logistic regression showed that factors such as monthly income, age, family size, length of residence, sex, home ownership, solid waste sorting procedures, and education significantly influenced the choice of disposal method. Households with lower education, income, home ownership, and shorter residence times were more likely to use improper disposal methods. Females were found to be more likely to engage in better waste disposal practices than males. These findings underscore the need for context-specific interventions in newly developing towns to enhance household-level SWM systems by addressing key socio-economic factors.
Keywords: multinomial logistic regression, solid waste management, solid waste disposal, urban household
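The multinomial logistic regression used above assigns each household a linear score per disposal mode and converts the scores to probabilities with a softmax. The sketch below shows only the prediction step with hand-picked coefficients; the weight values, feature scaling, and the chosen predictors (intercept, income, education) are invented for illustration and are not the study's fitted model.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def predict_disposal(features, weights, classes):
    """Multinomial-logit class probabilities: one linear score per
    disposal mode, passed through softmax."""
    probs = softmax(weights @ features)
    return dict(zip(classes, probs))

classes = ['composting', 'open dumping', 'burning', 'burial']
# illustrative coefficients for [intercept, income (scaled), education years (scaled)]
W = np.array([[-1.0, 1.2, 0.9],
              [ 1.0, -1.0, -0.8],
              [ 0.5, -0.4, -0.3],
              [ 0.2, -0.2, -0.1]])
household = np.array([1.0, 0.8, 1.2])   # higher income and education
p = predict_disposal(household, W, classes)
```

With these illustrative coefficients, higher income and education raise the probability of composting and lower that of open dumping, mirroring the direction of the effects the study reports.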
Procedia PDF Downloads 24
13561 Efficiency of Investments, Financed from EU Funds in Small and Medium Enterprises in Poland
Authors: Jolanta Brodowska-Szewczuk
Abstract:
The article presents the results and conclusions of empirical research. The research focuses on the impact of investments made in small and medium-sized enterprises financed from EU funds on the competitiveness of these companies. It covers financial results (sales revenue, net income, expenses) as well as new products and services on offer, higher-quality products and services, more modern production methods, innovation in management processes, increases in the number of customers, increases in market share, and increases in the profitability of production and the provision of services. The main conclusions are that companies that made direct investments under this measure apply modern production methods. The consequence of this is an increase in the quality of their products and services. Furthermore, both small and medium-sized enterprises have introduced new products and services. The investments enabled better work organization in the enterprises. Entrepreneurs could guarantee a higher quality of service, which resulted in better relationships with their customers and, moreover, a rise in the number of clients. More than half of the companies indicated that the investments contributed to an increase in market share. The same applies to the market reach and brand recognition of a given company. An interesting finding is that investments in small enterprises were more effective than those in medium-sized enterprises.
Keywords: competitiveness, efficiency, EU funds, small and medium-sized enterprises
Procedia PDF Downloads 384
13560 Temporal and Spacial Adaptation Strategies in Aerodynamic Simulation of Bluff Bodies Using Vortex Particle Methods
Authors: Dario Milani, Guido Morgenthal
Abstract:
Fluid-dynamic computation of wind-induced forces on bluff bodies, e.g. light, flexible civil structures or airplane wings at high incidence approaching the ground, is one of the major criteria governing their design. For such structures a significant dynamic response may result, requiring the use of small-scale devices, such as guide vanes in bridge design, to control these effects. The focus of this paper is on the numerical simulation of the bluff body problem involving multiscale phenomena induced by small-scale devices. One solution method for CFD simulation that is relatively successful in this class of applications is the vortex particle method (VPM). The method is based on a grid-free Lagrangian formulation of the Navier-Stokes equations, where the velocity field is modeled by particles representing local vorticity. These vortices are convected by the free-stream velocity as well as diffused. This representation yields the main advantages of low numerical diffusion; compact discretization, as the vorticity is strongly localized; implicit accounting for the free-space boundary conditions typical of this class of FSI problems; and a natural representation of the vortex creation process inherent in bluff body flows. When the particle resolution reaches the Kolmogorov dissipation length, the method becomes a direct numerical simulation (DNS). However, it is crucial to note that any solution method aims at balancing the computational cost against the achievable accuracy. In the classical VPM, if the fluid domain is discretized by Np particles, the computational cost is O(Np²). For the coupled FSI problem of interest, for example large structures such as long-span bridges, the aerodynamic behavior may be influenced or even dominated by small structural details such as barriers, handrails or fairings.
For such geometrically complex and dimensionally large structures, resolving the complete domain with the conventional VPM particle discretization might become prohibitively expensive even for moderate numbers of particles. It is possible to reduce this cost either by reducing the number of particles or by controlling their local distribution. It is also possible to increase the accuracy of the solution without substantially increasing the global computational cost by computing a correction of the particle-particle interaction in some regions of interest. In this paper, different strategies are presented to extend the conventional VPM so as to reduce the computational cost whilst resolving the required details of the flow. The methods include temporal sub-stepping to increase the accuracy of particle convection in certain regions, as well as dynamically re-discretizing the particle map to control the global and local numbers of particles. Finally, these methods are applied to a test case, and the improvements in efficiency as well as the accuracy of the proposed extensions to the method are presented, along with their relevant applications, demonstrating the benefits of combining these methods in terms of accuracy and computational cost.
Keywords: adaptation, fluid dynamic, remeshing, substepping, vortex particle method
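The O(Np²) particle-particle interaction described above is the all-pairs Biot-Savart evaluation. The sketch below illustrates it for 2-D point vortices with a small smoothing radius; the pair configuration and the smoothing value are invented, and a production VPM code would add diffusion, boundary treatment, and the fast-summation or sub-stepping strategies the paper proposes.

```python
import numpy as np

def induced_velocity(pos, gamma, delta=0.05):
    """O(N^2) Biot-Savart evaluation for 2-D vortex particles:
    velocity induced at every particle by all others, with a small
    smoothing radius `delta` to regularise close interactions
    (it also zeroes each particle's self-interaction)."""
    dx = pos[:, 0][:, None] - pos[:, 0][None, :]
    dy = pos[:, 1][:, None] - pos[:, 1][None, :]
    r2 = dx**2 + dy**2 + delta**2
    # 2-D point-vortex kernel: u = -G*dy/(2*pi*r^2), v = G*dx/(2*pi*r^2)
    u = (-dy / (2 * np.pi * r2)) @ gamma
    v = (dx / (2 * np.pi * r2)) @ gamma
    return np.column_stack([u, v])

# two counter-rotating vortices: a pair that self-propels in +x
pos = np.array([[0.0, 0.5], [0.0, -0.5]])
gamma = np.array([1.0, -1.0])
vel = induced_velocity(pos, gamma)
```

Both particles receive the same x-velocity and no y-velocity, the classical translating vortex pair; every time step of a VPM simulation repeats this dense evaluation, which is exactly where the quadratic cost, and hence the adaptation strategies above, come from.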
Procedia PDF Downloads 265
13559 The Effectiveness of Intervention Methods for Repetitive Behaviors in Preschool Children with Autism Spectrum Disorder: A Systematic Review
Authors: Akane Uda, Ami Tabata, Mi An, Misa Komaki, Ryotaro Ito, Mayumi Inoue, Takehiro Sasai, Yusuke Kusano, Toshihiro Kato
Abstract:
Early intervention is recommended for children with autism spectrum disorder (ASD), and an increasing number of children have received support and intervention before school age in recent years. In this study, we systematically reviewed preschool interventions focused on repetitive behaviors in children with ASD, which are often observed at younger ages. Inclusion criteria were as follows: (1) child of preschool status (age ≤ 7 years) with a diagnosis of ASD (including autism, Asperger's, and pervasive developmental disorder) or a parent (caregiver) with a preschool child with ASD; (2) physician-confirmed diagnosis of ASD (autism, Asperger's, and pervasive developmental disorder); (3) interventional studies for repetitive behaviors; (4) original articles published within the past 10 years (2012 or later); (5) written in English or Japanese. Exclusion criteria were as follows: (1) systematic reviews or meta-analyses; (2) conference reports or books. We carefully scrutinized databases to remove duplicate references and used a two-step screening process to select papers. The primary screening included close scrutiny of titles and abstracts to exclude articles that did not meet the eligibility criteria. During the secondary screening, we carefully read the complete text to assess eligibility, which was double-checked by six members of the laboratory. Disagreements were resolved through consensus-based discussion. Our search yielded 304 papers, of which nine were included in the study. The level of evidence was as follows: three randomized controlled trials (level 2), four pre-post studies (level 4b), and two case reports (level 5). Seven of the articles selected for this study described effective interventions. Interventions for repetitive behaviors in preschool children with ASD comprised five interventions that directly involved the child and four educational programs for caregivers and parents.
Studies that directly intervened with children used early intensive intervention based on applied behavior analysis (the Early Start Denver Model, Early Intensive Behavioral Intervention, and the Picture Exchange Communication System) and individualized education based on sensory integration. Educational interventions for caregivers included two methods: (a) education on combined methods and practices of applied behavior analysis, in addition to the classification of repetitive behaviors and coping methods for them, and (b) education on evaluation methods and practices based on children's developmental milestones in play. With regard to the neurophysiological basis of repetitive behaviors, environmental factors are implicated as possible contributors. We assume that applied behavior analysis proved effective in reducing repetitive behaviors because the analysis focuses on the interaction between the individual and the environment. Additionally, with regard to educational interventions for caregivers, the interventions promoted behavioral change in children: the caregivers' understanding of the classification of repetitive behaviors and of the children's developmental milestones in play, together with adjustment of the person-environment context, led to a reduction in repetitive behaviors. Keywords: autism spectrum disorder, early intervention, repetitive behaviors, systematic review
Procedia PDF Downloads 141
13558 Efficacy of Hemi-Facetectomy in Treatment of Lumbar Foraminal Stenosis
Authors: Manoj Deepak, N. Mathivanan, K. Venkatachalam
Abstract:
Nerve root stenosis is one of the main causes of back pain. There are many methods, both conservative and surgical, to treat this disease. It is pertinent to decompress the spine to a proper extent so as to avoid the recurrence of symptoms, but too aggressive an approach also has its disadvantages. We present one of the methods to effectively decompress the nerve with better results. Our study was carried out in 52 patients with foraminal stenosis between 2008 and 2011. We carried out the surgical procedure of shaving off the medial part of the facet joint so as to decompress the root. We selected patients who had symptoms of claudication for more than 2 years. They had no signs of instability, and they underwent conservative treatment for a period of 2 months before the procedure. Oswestry scoring was used to record the functional level of the patient before and after the procedure. All patients were followed up for a minimum of 2.5 years. After evaluation for a minimum of 2.5 years, 34 patients had no evidence of recurrence of symptoms, with improvement in the functional level. Seven patients complained of minimal pain, but their functional quality had improved postoperatively. Six patients had symptoms of lumbar canal disease, which reduced with conservative treatment. Five patients required spinal fusion surgeries in the later period. Conclusion: we can conclude that our procedure is safe and effective in reducing the symptoms in patients with neurogenic claudication. Keywords: facetectomy, stenosis, decompression, lumbar foraminal stenosis, hemi-facetectomy
Procedia PDF Downloads 350
13557 Evaluation of Heterogeneity of Paint Coating on Metal Substrate Using Laser Infrared Thermography and Eddy Current
Authors: S. Mezghani, E. Perrin, J. L. Bodnar, J. Marthe, B. Cauwe, V. Vrabie
Abstract:
Non-contact evaluation of the thickness of paint coatings can be attempted by different destructive and nondestructive methods such as cross-section microscopy, gravimetric mass measurement, magnetic gauges, eddy current, ultrasound or terahertz techniques. Infrared thermography is a nondestructive and non-invasive method that can be envisaged as a useful tool to measure surface thickness variations by analyzing the temperature response. In this paper, the thermal quadrupole method for two-layered samples heated by a pulsed excitation is first used. By analyzing the thermal responses as a function of the thermal properties and thicknesses of both layers, optimal parameters for the excitation source can be identified. Simulations show that a pulsed excitation with a duration of ten milliseconds makes it possible to obtain a substrate-independent thermal response. Based on this result, an experimental setup consisting of a near-infrared laser diode and an infrared camera was next used to evaluate the variation of paint coating thickness between 60 µm and 130 µm on two samples. Results show that the parameters extracted from the thermal images are correlated with the thicknesses estimated by the eddy current method. Laser pulsed thermography is thus an interesting alternative nondestructive method that can, moreover, be used for non-conductive substrates. Keywords: non destructive, paint coating, thickness, infrared thermography, laser, heterogeneity
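For orientation, the one-layer (semi-infinite) limit of the pulsed response has a closed form: after a Dirac heat pulse of energy density Q, the surface temperature decays as T(t) = Q/(e√(πt)), with effusivity e = √(kρc). A sketch under that simplifying assumption (the two-layer quadrupole model used in the paper generalizes this; material values below are placeholders):

```python
import numpy as np

def surface_temp_pulse(t, Q, k, rho, c):
    """Surface temperature rise of a semi-infinite medium after a Dirac
    heat pulse of energy density Q (J/m^2):
        T(t) = Q / (e * sqrt(pi * t)),
    where e = sqrt(k * rho * c) is the thermal effusivity."""
    e = np.sqrt(k * rho * c)
    return Q / (e * np.sqrt(np.pi * np.asarray(t, dtype=float)))
```

A coating whose effusivity differs from the substrate's makes the measured decay depart from this t^(-1/2) law once the thermal front reaches the interface; the time of that departure is what carries the thickness information.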
Procedia PDF Downloads 639
13556 Rotorcraft Performance and Environmental Impact Evaluation by Multidisciplinary Modelling
Authors: Pierre-Marie Basset, Gabriel Reboul, Binh DangVu, Sébastien Mercier
Abstract:
Rotorcraft provide invaluable services thanks to their Vertical Take-Off and Landing (VTOL), hover and low-speed capabilities. Yet their use is still often limited by their cost and environmental impact, especially noise and energy consumption. One of the main obstacles to the expanded use of rotorcraft for urban missions is the environmental impact, and the first concern for the population is noise. In order to develop the transversal competency to assess the rotorcraft environmental footprint, a collaboration has been launched between six research departments within ONERA. The progress in terms of models and methods is capitalized in the numerical workshop C.R.E.A.T.I.O.N. "Concepts of Rotorcraft Enhanced Assessment Through Integrated Optimization Network". A typical mission for which the environmental impact issue is of great relevance has been defined. The first milestone is to perform the pre-sizing of a reference helicopter for this mission. In a second milestone, an alternative rotorcraft concept has been defined: a tandem rotorcraft with optional propulsion. The key design trends are given for the pre-sizing of this rotorcraft, aiming at a significant reduction of the global environmental impact while still providing flight performance and safety equivalent to the reference helicopter. The models and methods have been improved to capture, earlier and more globally, the relative variations in environmental impact when changing the rotorcraft architecture, the pre-design variables and the operating parameters. Keywords: environmental impact, flight performance, helicopter, multi objectives multidisciplinary optimization, rotorcraft
Procedia PDF Downloads 271
13555 Enhanced Image Representation for Deep Belief Network Classification of Hyperspectral Images
Authors: Khitem Amiri, Mohamed Farah
Abstract:
Image classification is a challenging task and is gaining a lot of interest since it helps us to understand the content of images. Recently, Deep Learning (DL) based methods have given very interesting results on several benchmarks. For hyperspectral images (HSI), the application of DL techniques is still challenging due to the scarcity of labeled data and to the curse of dimensionality. Among other approaches, Deep Belief Network (DBN) based approaches have given fair classification accuracy. In this paper, we address the curse of dimensionality by reducing the number of bands and replacing the HSI channels with channels representing radiometric indices. Therefore, instead of using all the HSI bands, we compute radiometric indices such as NDVI (Normalized Difference Vegetation Index), NDWI (Normalized Difference Water Index), etc., and use the combination of these indices as input for the Deep Belief Network (DBN) based classification model. Thus, we keep almost all the pertinent spectral information while considerably reducing the size of the image. To test our image representation, we applied our method on several HSI datasets, including the Indian Pines dataset and the Jasper Ridge data, and it gave results comparable to the state-of-the-art methods while considerably reducing the time of training and testing. Keywords: hyperspectral images, deep belief network, radiometric indices, image classification
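As a sketch of the band-reduction step, the indices named above can be computed directly from the relevant channels of the cube. The band positions in `band_ids` are placeholders that depend on the sensor, not values from the paper:

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + eps)

def ndwi(green, nir, eps=1e-8):
    """Normalized Difference Water Index (McFeeters formulation)."""
    return (green - nir) / (green + nir + eps)

def index_stack(cube, band_ids):
    """Replace an HSI cube of shape (rows, cols, bands) by a thin stack
    of radiometric indices; band_ids maps names to band positions."""
    nir = cube[..., band_ids['nir']].astype(float)
    red = cube[..., band_ids['red']].astype(float)
    green = cube[..., band_ids['green']].astype(float)
    return np.stack([ndvi(nir, red), ndwi(green, nir)], axis=-1)
```

Whatever indices are chosen, the point is the same: a cube with hundreds of bands is reduced to a handful of physically meaningful channels before it reaches the DBN.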
Procedia PDF Downloads 280
13554 A Prediction of Electrical Cost for High-Rise Building Construction
Authors: Picha Sriprachan
Abstract:
The increase in electricity prices affects the cost of high-rise building construction. The objectives of this research are to study the electrical cost and its trend, and to forecast the electrical cost of high-rise building construction. The methods of this research are: 1) to study electrical payment formats, cost data collection methods, and the factors affecting the electrical cost of high-rise building construction; 2) to study the quantity and trend of the cumulative percentage of the electrical cost; and 3) to forecast the electrical cost for different types of high-rise buildings. The results of this research show that the average ratio of electrical cost to the value of the construction project is 0.87 percent. The proportions of electrical cost for residential, office and commercial, and hotel buildings are close to one another. As the construction project value increases, the ratio of electrical cost to project value decreases; nevertheless, the amount of electrical cost remains related to the value of the construction project. During the structural construction phase, the electrical cost increases, and during the combined structural and architectural construction phase, it reaches its maximum. The cumulative percentage of the electrical cost tracks the cumulative percentage of the high-rise building construction cost in the same direction. The amount of service space of the building, the number of floors and the duration of the construction affect the electrical cost of construction. The electrical cost forecasted using a linear regression equation is close to the electrical cost forecasted using the ratio of electrical cost to project value. Keywords: high-rise building construction, electrical cost, construction phase, architectural phase
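The linear regression forecast described above can be sketched with ordinary least squares on the three reported factors (service area, number of floors, construction duration). All numbers below are made-up illustrations, not data from the study:

```python
import numpy as np

# Hypothetical training records: [service area (m^2), floors, duration (months)]
X = np.array([[12000, 20, 18],
              [18000, 28, 22],
              [25000, 35, 26],
              [30000, 42, 30]], dtype=float)
y = np.array([1.9, 2.8, 3.9, 4.7])  # electrical cost (made-up units)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(area, floors, months):
    """Forecast electrical cost from the fitted linear model."""
    return float(coef @ np.array([1.0, area, floors, months]))
```

In practice one would compare such a regression forecast with the simpler ratio-based forecast (0.87 percent of project value, per the study) on held-out projects.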
Procedia PDF Downloads 392
13553 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models
Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg
Abstract:
Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts into an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks: MOS, for instance, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; this requires identifying the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles, compare them with several existing models in the literature to forecast storm surge levels, and investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; we therefore develop a statistical platform to compare the performance of various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique. Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction
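A minimal sketch of the correlation-weighted ensemble idea, with the simple average as the benchmark (function names are illustrative; the paper's exact weighting scheme may differ):

```python
import numpy as np

def correlation_weights(forecasts, obs):
    """Weight each member by its Pearson correlation with observations
    over a training window; negative correlations are clipped to zero."""
    w = np.array([max(np.corrcoef(f, obs)[0, 1], 0.0) for f in forecasts])
    return w / w.sum()

def ensemble(forecasts, weights):
    """Weighted combination of the member forecasts."""
    return np.average(forecasts, axis=0, weights=weights)

def rmse(pred, obs):
    """Root mean square error of a forecast against observations."""
    return float(np.sqrt(np.mean((pred - obs) ** 2)))
```

The simple-average benchmark is just `ensemble(forecasts, np.ones(n) / n)`; comparing its RMSE against the weighted version on a testing window mirrors the paper's evaluation.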
Procedia PDF Downloads 309
13552 A Domain Specific Modeling Language Semantic Model for Artefact Orientation
Authors: Bunakiye R. Japheth, Ogude U. Cyril
Abstract:
Since the process of transforming user requirements into modeling constructs is not very well supported by domain-specific frameworks, it became necessary to integrate domain requirements with specific architectures to achieve an integrated, customizable solution space via artifact orientation. Domain-specific modeling language specifications of model-driven engineering technologies focus on requirements within a particular domain, which can be tailored to help the domain expert express domain concepts effectively. Modeling processes based on domain-specific language formalisms are highly volatile due to dependencies on domain concepts or the process models used. A capable solution is given by artifact orientation, which stresses the results rather than imposing a strict dependence on complicated platforms for model creation and development. Based on this premise, domain-specific methods for producing artifacts, without having to take into account the complexity and variability of platforms for model definitions, can be integrated to support customizable development. In this paper, we discuss methods for integrating these capabilities and necessities within a common structure and semantics, contributing a metamodel for artifact orientation that leads to a reusable software layer with a concrete syntax capable of determining design intents from the domain expert. The concepts forming the language formalism are established from models in the oil and gas pipelines industry. Keywords: control process, metrics of engineering, structured abstraction, semantic model
Procedia PDF Downloads 143
13551 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4
Authors: Ryan A. Black, Stacey A. McCaffrey
Abstract:
Over the past few decades, great strides have been made towards improving the science of measuring psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now widely used to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ them and decide which models are the most appropriate for their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated in a given model, the function which links the expected response and the predictor, response option formats, and dimensionality. As a result, inferior methods (i.e., Classical Test Theory methods) continue to be employed to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models, that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover binary IRT models from the most basic, the 1-parameter logistic (1-PL) model dating back over 50 years, up to the most recent and complex, the 4-parameter logistic (4-PL) model.
Binary IRT models will be defined mathematically, and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1) simulated data of N = 500,000 subjects who responded to four dichotomous items, and 2) a pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to emotional consequences of alcohol use. The real-world data were based on responses to items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). The IRT analyses conducted on both the simulated data and the real-world pilot data provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate the simulated data and analyses will be available upon request to allow for replication of results. Keywords: instrument development, item response theory, latent trait theory, psychometrics
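The 4-PL item response function, with the simpler models as special cases, can be sketched as follows (a is the discrimination, b the difficulty, c the lower asymptote or pseudo-guessing parameter, and d the upper asymptote):

```python
import math

def irt_prob(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """Four-parameter logistic (4-PL) probability of a correct or
    endorsed response:
        P(theta) = c + (d - c) / (1 + exp(-a * (theta - b))).
    With d = 1 this reduces to the 3-PL; with c = 0 and d = 1 to the
    2-PL; and with a = 1, c = 0, d = 1 to the 1-PL (Rasch) model."""
    return c + (d - c) / (1.0 + math.exp(-a * (theta - b)))
```

The nesting is what makes model comparison natural: each simpler model is the one above it with a parameter fixed, which is how the likelihood-based comparisons in PROC IRT line up.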
Procedia PDF Downloads 358
13550 [Keynote Talk]: The Challenges and Solutions for Developing Mobile Apps in a Small University
Authors: Greg Turner, Bin Lu, Cheer-Sun Yang
Abstract:
As computing technology advances, smartphone applications can assist student learning in a pervasive way. For example, a mobile app for the PA Common Trees, Pests, and Pathogens used in the field as a reference tool allows middle school students to learn about trees and associated pests/pathogens without carrying a textbook. In the past, some researchers have studied the Mobile Application Development Life Cycle (MADLC), including traditional models such as the waterfall model or more recent agile methods; others have studied issues related to the software development process. Very little research addresses the development of three heterogeneous mobile systems simultaneously in a small university where the availability of developers is an issue. In this paper, we propose to use a hybrid of the waterfall model and the agile model, known in practice as the Relay Race Methodology (RRM), to reflect the concepts of racing and relaying in scheduling. Based on the development project, we observe that the modeling of the transition between any two phases is manifested naturally. Thus, we claim that the RRM model can provide a de facto rather than a de jure basis for the core concept of the MADLC. In this paper, the background of the project is introduced first; then the challenges are pointed out, followed by our solutions. Finally, the lessons learned and future work are presented. Keywords: agile methods, mobile apps, software process model, waterfall model
Procedia PDF Downloads 409
13549 Feasibility Study of Wind Energy Potential in Turkey: Case Study of Catalca District in Istanbul
Authors: Mohammed Wadi, Bedri Kekezoglu, Mustafa Baysal, Mehmet Rida Tur, Abdulfetah Shobole
Abstract:
This paper investigates the technical evaluation of the wind potential for present and future investments in Turkey, taking into account the feasibility of sites, installation, operation, and maintenance. The evaluation is based on hourly wind speed data measured at 30 m height for the Çatalca district over the three years 2008-2010. These data, obtained from the national meteorology station in Istanbul, Republic of Turkey, are analyzed in order to evaluate the feasibility of the wind power potential and to ensure an optimal selection of wind turbines for installation in the area of interest. Furthermore, the data are extrapolated to 60 m and 80 m and analyzed, taking into account the variability of the roughness factor. The Weibull bi-parameter probability function is used to approximate the monthly and annual wind potential and power density based on three calculation methods, namely the approximated method, the graphical method, and the energy pattern factor method. The annual mean wind power densities were found to be 400.31, 540.08 and 611.02 W/m² for the 30, 60, and 80 m heights, respectively. Simulation results show that the analyzed area is an appropriate place for constructing large-scale wind farms. Keywords: wind potential in Turkey, Weibull bi-parameter probability function, the approximated method, the graphical method, the energy pattern factor method, capacity factor
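Of the three estimation methods named, the energy pattern factor method has a particularly compact form: Epf = mean(v³)/mean(v)³, k = 1 + 3.69/Epf², c = v̄/Γ(1 + 1/k), and the mean power density follows as P = ½ρc³Γ(1 + 3/k). A sketch of this method plus the height extrapolation step (the air density ρ and the log-law roughness length z0 are illustrative defaults, not values from the paper):

```python
import math
import numpy as np

def weibull_epf(v, rho=1.225):
    """Weibull shape k and scale c by the energy pattern factor method,
    plus the corresponding mean wind power density (W/m^2)."""
    v = np.asarray(v, dtype=float)
    vbar = v.mean()
    epf = (v ** 3).mean() / vbar ** 3        # energy pattern factor
    k = 1.0 + 3.69 / epf ** 2                # shape parameter
    c = vbar / math.gamma(1.0 + 1.0 / k)     # scale parameter (m/s)
    power_density = 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)
    return k, c, power_density

def extrapolate_log_law(v1, z1, z2, z0=0.1):
    """Extrapolate wind speed from height z1 to z2 with the log law;
    z0 is the surface roughness length (m)."""
    return v1 * math.log(z2 / z0) / math.log(z1 / z0)
```

Applying `extrapolate_log_law` to the 30 m series before calling `weibull_epf` reproduces the 60 m and 80 m analysis pattern described above.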
Procedia PDF Downloads 259
13548 Evolution of Design through Documentation of Architecture Design Processes
Authors: Maniyarasan Rajendran
Abstract:
Every design has a process, and every architect works in the ways best known to them. The translation of a design from concept to completion changes in accordance with their design philosophies, their tools, the availability of resources and, at times, the clients and the context of the design as well. The approach to understanding the design process requires formalisation of the design intents. The design process is characterised by change, with time and with technology. The design flow is only indicative, never exhaustive. The knowledge and experience of stakeholders remain limited to the part they played in the project and to their ability to remember, and what survives does so through photographs. These artefacts, when circulated, can hardly tell what the project is; they can never tell the narrative behind it. In due course, the design processes are lost; the design junctions are lost in the journey. Photographs have acted as major source materials since their importance in architectural revivalism in the 19th century, and from history we understand that it is photographs that have acted as the dominant source of evidence. The idea of recording is accompanied by the idea of drawing inspiration from the records and documents. The design concept, the architectural firm's philosophy, the materials used, the special needs, the numerous trial-and-error methods, the design methodology, the experience of failures and successes, the knowledge acquired, and the various other aspects and methods that every project goes through deserve to be recorded. This knowledge can be preserved and passed through generations by documenting the design processes involved. This paper explores the idea of process documentation as a tool of self-reflection and of creating an architectural firm's repository, and follows these implications through the design evolution of the team. Keywords: architecture, design, documentation, records
Procedia PDF Downloads 369
13547 Women’s Empowerment on Modern Contraceptive Use in Poor-Rich Segment of Population: Evidence From South Asian Countries
Authors: Muhammad Asim, Mehvish Amjad
Abstract:
Background: Less than half of women in South Asia (SA) use any modern contraceptive method, which leads to a huge burden of unintended pregnancies, unsafe abortions, maternal deaths, and socioeconomic loss. Women's empowerment plays a pivotal role in improving various health-seeking behaviours, including contraceptive use. The objective of this study is to explore the association between women's empowerment and modern contraceptive use among the rich and poor segments of the population in SA. Methods: We used the most recent large-scale demographic and health survey data of five South Asian countries, namely Afghanistan, Pakistan, Bangladesh, India, and Nepal. The outcome variable was the current use of modern contraceptive methods. The main exposure variable was a combination (interaction) of socio-economic status (SES) and women's level of empowerment (low, medium, and high), where SES was bifurcated into poor and rich, and women's empowerment was divided into three dimensions: decision making, attitude to violence, and social independence. Moreover, an overall women's empowerment indicator was created from these three dimensions. We applied both descriptive statistics and multivariable logistic regression techniques for the data analyses. Results: Most of the women possessed a 'medium' level of empowerment across the South Asian countries. The lowest attitude-to-violence empowerment was found in Afghanistan, and the lowest social-independence empowerment was observed in Bangladesh. Pakistani women had the lowest decision-making empowerment in the region. The lowest modern contraceptive use (22.1%) was found in Afghanistan and the highest (53.2%) in Bangladesh. The multivariate results show that the overall measure of women's empowerment does not affect modern contraceptive use among poor or rich women in most South Asian countries.
However, decision-making empowerment plays a significant role in the use of modern contraceptive methods among both poor and rich women across South Asian countries. Conclusions: The effect of women's empowerment on modern contraceptive use is not consistent across countries or between the poor and rich segments of the population. Of the three dimensions of women's empowerment, autonomy of decision making in household affairs emerged as a stronger determinant of the modern contraceptive prevalence rate (mCPR) than social independence and attitude towards violence against women. Keywords: women empowerment, modern contraceptive use, South Asia, socio economic status
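The SES × empowerment interaction described above enters a logistic regression as a product column in the design matrix. A sketch with a tiny gradient-descent fit; the observations below are made-up for illustration, not survey data:

```python
import numpy as np

def design_matrix(ses_rich, empowerment):
    """Columns: intercept, SES (0 = poor, 1 = rich), empowerment score
    (0 = low, 1 = medium, 2 = high), and the SES x empowerment interaction."""
    ses = np.asarray(ses_rich, dtype=float)
    emp = np.asarray(empowerment, dtype=float)
    return np.column_stack([np.ones_like(ses), ses, emp, ses * emp])

def fit_logistic(X, y, lr=0.5, iters=3000):
    """Plain gradient-descent fit of a logistic regression model."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# Hypothetical records: contraceptive use (1/0) by SES and empowerment
ses = [0, 0, 0, 1, 1, 1]
emp = [0, 1, 2, 0, 1, 2]
use = np.array([0, 0, 0, 0, 1, 1], dtype=float)
X = design_matrix(ses, emp)
w = fit_logistic(X, use)
```

The interaction coefficient (the last element of `w`) is what tests whether the effect of empowerment on contraceptive use differs between the poor and rich strata.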
Procedia PDF Downloads 82