Search results for: hybrid block methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17092

13042 Reactions of 4-Aryl-1H-1,2,3-Triazoles with Cycloalkenones and Epoxides: Synthesis of 2,4- and 1,4-Disubstituted 1,2,3-Triazoles

Authors: Ujjawal Kumar Bhagat, Kamaluddin, Rama Krishna Peddinti

Abstract:

The Huisgen 1,3-dipolar [3+2] cycloaddition of organic azides and alkynes often gives mixtures of the two regioisomers, 1,4- and 1,5-disubstituted 1,2,3-triazoles. Later, metal-catalyzed variants (‘click chemistry’) such as the copper(I)-catalyzed azide-alkyne cycloaddition (CuAAC) were used to synthesize 1,4-disubstituted 1,2,3-triazoles regioselectively as sole products. Similarly, the ruthenium-catalyzed azide-alkyne cycloaddition (RuAAC) is used for the synthesis of 1,5-disubstituted 1,2,3-triazoles as a single isomer. The synthesis of 1,4- and 1,5-disubstituted 1,2,3-triazoles has become the gold standard of ‘click chemistry’ due to its reliability, specificity, and biocompatibility. The 1,4- and 1,5-disubstituted 1,2,3-triazoles have emerged as some of the most powerful entities across a variety of biological properties, including antibacterial, antitubercular, antitumor, antifungal and antiprotozoal activities. Some 1,4,5-trisubstituted 1,2,3-triazoles exhibit Hsp90-inhibiting properties. The 1,4-disubstituted 1,2,3-triazoles also play a major role in the area of materials science. Triazole-derived oligomeric and polymeric structures are potential materials for the preparation of organic optoelectronics, silicone elastomers and unimolecular block copolymers. By virtue of hydrogen bonding and dipole interactions, the 1,2,3-triazole moiety readily associates with biological targets. Since 4-aryl-1H-1,2,3-triazoles are stable entities, they are chemically robust and only weakly reactive. In this regard, the addition of 4-aryl-1H-1,2,3-triazoles as nucleophiles to α,β-unsaturated carbonyls and their nucleophilic substitution with epoxides constitute a powerful and challenging synthetic approach for the generation of disubstituted 1,2,3-triazoles. 
Herein, we have developed the aza-Michael addition of 4-aryl-1H-1,2,3-triazoles to 2-cycloalken-1-ones in the presence of an organic base (DABCO) in acetonitrile solvent, leading to the formation of disubstituted 1,2,3-triazoles. The reaction provides 1,4-disubstituted triazoles, 3-(4-aryl-1H-1,2,3-triazol-1-yl)cycloalkanones, as the major products, along with 1,5-disubstituted 1,2,3-triazoles as minor regioisomers, with excellent combined chemical yields (up to 99%). The nucleophilic behavior of 4-aryl-1H-1,2,3-triazoles was also tested in the ring opening of meso-epoxides in the presence of organic bases (DABCO/Et3N) in acetonitrile solvent, furnishing the two regioisomers, 1,4- and 1,5-disubstituted 1,2,3-triazoles. Thus, the novelty of this methodology is the synthesis of diversified disubstituted 1,2,3-triazoles under metal-free conditions. The results will be presented in detail.

Keywords: aza-Michael addition, cycloalkenones, epoxides, triazoles

Procedia PDF Downloads 313
13041 On the Absence of BLAD, CVM, DUMPS and BC Autosomal Recessive Mutations in Stud Bulls of the Local Alatau Cattle Breed of the Republic of Kazakhstan

Authors: Yessengali Ussenbekov, Valery Terletskiy, Orik Zhanserkenova, Shynar Kasymbekova, Indira Beyshova, Aitkali Imanbayev, Almas Serikov

Abstract:

Currently, there are 46 hereditary diseases afflicting cattle for which molecular genetic diagnostic methods have been developed. Genetic anomalies frequently occur in Holstein cattle of American and Canadian bloodlines. The data on the incidence of the BLAD, CVM, DUMPS and BC autosomal recessive lethal mutations in pedigree animals are discordant: the detrimental allele incidence rates are high for the Holstein breed, whereas in some breeds these mutations occur at low rates or are completely absent. Data were obtained from the frozen semen of stud bulls. DNA was extracted from the semen with the DNA-Sorb-B extraction kit. Lethal mutations in the CD18, SLC35A3, UMP and ASS genes of Alatau stud bulls (N=124) were detected by polymerase chain reaction and RFLP analysis. It was established that stud bulls of the local Alatau breed were not carriers of the BLAD, CVM, DUMPS, and BC detrimental mutations. However, with a view to preventing the dissemination of hereditary diseases, it is recommended to monitor the pedigree stock using molecular genetic methods.

Keywords: PCR, autosomal recessive point mutation, BLAD, CVM, DUMPS, BC, stud bulls

Procedia PDF Downloads 429
13040 Identification and Classification of Fiber-Fortified Semolina by Near-Infrared Spectroscopy (NIR)

Authors: Amanda T. Badaró, Douglas F. Barbin, Sofia T. Garcia, Maria Teresa P. S. Clerici, Amanda R. Ferreira

Abstract:

Food fortification is the intentional addition of a nutrient to a food matrix and has been widely used to overcome a lack of nutrients in the diet or to increase the nutritional value of food. Fortified food must meet the demand of the population, taking into account their habits and the risks that these foods may pose. Wheat and its by-products, such as semolina, have been strongly indicated for use as a food vehicle, since wheat is widely consumed and used in the production of other foods. These products have been strategically used to add nutrients such as fibers. Methods of analysis and quantification of these kinds of components are destructive and require lengthy sample preparation and analysis. Therefore, the industry has searched for faster and less invasive methods, such as Near-Infrared Spectroscopy (NIR). NIR is a rapid and cost-effective method; however, it is based on indirect measurements, yielding a large amount of data. Therefore, NIR spectroscopy requires calibration with mathematical and statistical tools (chemometrics) to extract analytical information from the corresponding spectra, such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). PCA is well suited for NIR, since it can handle many spectra at a time and be used for non-supervised classification. An advantage of PCA, which is also a data reduction technique, is that it reduces the spectra to a smaller number of latent variables for further interpretation. On the other hand, LDA is a supervised method that searches for the Canonical Variables (CV) with the maximum separation among different categories. In LDA, the first CV is the direction of the maximum ratio between inter- and intra-class variances. The present work used a portable near-infrared (NIR) spectrometer for the identification and classification of pure and fiber-fortified semolina samples. 
The fiber was added to the semolina at two different concentrations, and after spectra acquisition, the data were used for PCA and LDA to identify and discriminate the samples. The results showed that NIR spectroscopy associated with PCA was very effective in identifying pure and fiber-fortified semolina. Additionally, the classification accuracy of the samples using LDA was between 78.3% and 95% for calibration and between 75% and 95% for cross-validation. Thus, after multivariate analyses such as PCA and LDA, it was possible to verify that NIR combined with chemometric methods is able to identify and classify the different samples in a fast and non-destructive way.
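The PCA-then-LDA pipeline described in this abstract can be sketched in a few lines with scikit-learn. The "spectra" below are synthetic stand-ins (class-dependent offsets plus noise), not the study's NIR data, and the component count and split are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_wavelengths = 30, 100
# Three hypothetical classes: pure semolina and two fiber levels,
# simulated as a shared baseline spectrum with class-dependent offsets.
base = np.sin(np.linspace(0, 3, n_wavelengths))
X = np.vstack([base + 0.05 * k
               + 0.02 * rng.standard_normal((n_per_class, n_wavelengths))
               for k in range(3)])
y = np.repeat([0, 1, 2], n_per_class)

# Unsupervised step: compress each spectrum to a few latent variables.
pca = PCA(n_components=5)
scores = pca.fit_transform(X)

# Supervised step: LDA on the PCA scores, with cross-validation
# mirroring the calibration / cross-validation split in the abstract.
lda = LinearDiscriminantAnalysis()
cv_accuracy = cross_val_score(lda, scores, y, cv=5).mean()
print(f"cross-validated accuracy: {cv_accuracy:.2f}")
```

Running LDA on PCA scores rather than raw spectra keeps the supervised model small and stable, which is the usual motivation for this two-stage chemometric design.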

Keywords: Chemometrics, fiber, linear discriminant analysis, near-infrared spectroscopy, principal component analysis, semolina

Procedia PDF Downloads 198
13039 An Investigation to Study the Moisture Dependency of Ground Enhancement Compound

Authors: Arunima Shukla, Vikas Almadi, Devesh Jaiswal, Sunil Saini, Bhusan S. Patil

Abstract:

Lightning protection consists of three main parts: the air termination system, the down conductor, and the earth termination system. The earth termination system is the most important part, as the earth is the sink and source of charges. Therefore, even when the charges are captured and delivered to the ground, if an easy path is not provided to the charges, the earth termination system will lead to problems. Soils have significantly different resistivities, ranging from 10 Ωm for wet organic soil to 10000 Ωm for bedrock. Different methods have been discussed and used conventionally, such as the deep-ground-well method and altering the length of the rod. Those methods are not considered economical. Therefore, it was a general practice to use charcoal along with salt to reduce the soil resistivity. Bentonite is a worldwide accepted material, which led our interest towards the study of bentonite at first. It was concluded that bentonite is a clay which is non-corrosive and environmentally friendly. However, bentonite is suitable only when there is moisture present in the soil, as in the absence of moisture, cracks will appear on the surface which will provide an open passage to the air, resulting in an increase in resistivity. Furthermore, bentonite without moisture does not have enough bonding property, moisture retention, conductivity, and non-leachability. Therefore, bentonite was used along with another backfill material to overcome its dependency on moisture. Different experiments were performed to obtain the best ratio of bentonite and carbon backfill. It was concluded that the properties highly depend on the quantities of bentonite and carbon-based backfill material.

Keywords: backfill material, bentonite, grounding material, low resistivity

Procedia PDF Downloads 137
13038 Influence of Thermal Treatments on Ovomucoid as Allergenic Protein

Authors: Nasser A. Al-Shabib

Abstract:

Food allergens are most commonly in a non-native form when exposed to the immune system. Most food proteins undergo various treatments (e.g. thermal or proteolytic processing) during food manufacturing. Such treatments have the potential to impact the chemical structure of food allergens so as to convert them to more denatured or unfolded forms. The conformational changes in the proteins may affect the allergenicity of the treated allergens. However, most allergenic proteins possess high resistance against thermal modification or digestive enzymes. In the present study, ovomucoid (a major allergenic protein of egg white) was heated in phosphate-buffered saline (pH 7.4), in different aqueous solutions, and on different surfaces, at different temperatures and for various times. The results indicated that different antibody-based methods had different sensitivities in detecting the heated ovomucoid. When using one particular immunoassay‚ the immunoreactivity of ovomucoid increased rapidly after heating in water, whereas immunoreactivity declined after heating in alkaline buffer (pH 10). Ovomucoid appeared more immunoreactive when dissolved in PBS (pH 7.4) and heated on a stainless steel surface. To the best of our knowledge‚ this is the first time that antibody-based methods have been applied for the detection of ovomucoid adsorbed onto different surfaces under various conditions. The results obtained suggest that the use of antibodies to detect ovomucoid after food processing may be problematic. False assurance will be given by the use of inappropriate‚ non-validated immunoassays such as those available commercially as ‘swab’ tests. A greater understanding of antibody-protein interaction after processing of a protein is required.

Keywords: ovomucoid, thermal treatment, solutions, surfaces

Procedia PDF Downloads 435
13037 Controlled Shock Response Spectrum Test on Spacecraft Subsystem Using Electrodynamic Shaker

Authors: M. Madheswaran, A. R. Prashant, S. Ramakrishna, V. Ramesh Naidu, P. Govindan, P. Aravindakshan

Abstract:

Shock response spectrum (SRS) tests are among the tests conducted on some critical systems of a spacecraft as part of environmental testing. SRS tests are conducted to simulate the pyro shocks that occur during launch phases as well as during deployment of spacecraft appendages. Some of the methods to carry out SRS tests are the pyrotechnic method, the impact hammer method, the drop shock method, and the use of electrodynamic shakers. The pyrotechnic, impact hammer and drop shock methods are open-loop tests, whereas SRS testing using an electrodynamic shaker is a controlled, closed-loop test. SRS testing using an electrodynamic shaker offers various advantages such as a simple test setup, better controllability and repeatability. However, it is important to devise a proper test methodology so that the safety of the electrodynamic shaker and that of the test specimen are not compromised. This paper discusses the challenges involved in conducting SRS tests, shaker validation and the necessary precautions to be considered. The approach involved in choosing various test parameters, like the synthesis waveform, spectrum convergence level, etc., is discussed. A case study of an SRS test conducted on an optical payload of an Indian geostationary spacecraft is presented.
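The maximax SRS the abstract refers to can be illustrated numerically: a base-acceleration time history is applied to a bank of single-degree-of-freedom (SDOF) oscillators and the peak absolute acceleration of each is recorded. The half-sine pulse, Q value and frequency range below are illustrative assumptions, not the parameters of the test described in the abstract:

```python
import numpy as np

def srs_maximax(accel, dt, freqs, Q=10.0):
    """Maximax absolute-acceleration SRS via semi-implicit Euler integration."""
    zeta = 1.0 / (2.0 * Q)
    out = []
    for fn in freqs:
        wn = 2.0 * np.pi * fn
        x, v, peak = 0.0, 0.0, 0.0   # relative displacement / velocity
        for a_base in accel:
            # SDOF relative motion: x'' + 2*zeta*wn*x' + wn^2*x = -a_base
            a_rel = -a_base - 2.0 * zeta * wn * v - wn**2 * x
            peak = max(peak, abs(a_rel + a_base))  # absolute acceleration
            v += a_rel * dt
            x += v * dt
        out.append(peak)
    return np.array(out)

# Example input: a 0.5 ms half-sine pulse of amplitude 100 (e.g. in g),
# evaluated over natural frequencies from 10 Hz to 2 kHz.
dt = 1e-5
t = np.arange(0.0, 0.05, dt)
accel = np.zeros_like(t)
n_pulse = int(5e-4 / dt)
accel[:n_pulse] = 100.0 * np.sin(np.pi * np.arange(n_pulse) / n_pulse)
freqs = np.logspace(1, np.log10(2000.0), 30)
srs = srs_maximax(accel, dt, freqs)
```

High-frequency oscillators track the base motion (SRS near the pulse amplitude, with some dynamic amplification), while low-frequency oscillators respond mainly to the velocity change, which is why the spectrum rises with natural frequency for a short pulse.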

Keywords: maxi-max spectrum, SRS (shock response spectrum), SDOF (single degree of freedom), wavelet synthesis

Procedia PDF Downloads 344
13036 Computer-Aided Drug Repurposing for Mycobacterium Tuberculosis by Targeting Tryptophanyl-tRNA Synthetase

Authors: Neslihan Demirci, Serdar Durdağı

Abstract:

Mycobacterium tuberculosis is still a worldwide disease-causing agent that, according to the WHO, led to the death of 1.5 million people from tuberculosis (TB) in 2020. The bacteria reside in macrophages located specifically in the lung. There is a well-known quadruple drug therapy regimen for TB consisting of isoniazid (INH), rifampin (RIF), pyrazinamide (PZA), and ethambutol (EMB). Over the past 60 years, there have been great contributions to treatment options, such as the recently approved delamanid (OPC67683) and bedaquiline (TMC207/R207910), targeting mycolic acid and ATP synthesis, respectively. There are also natural compounds that can block the tryptophanyl-tRNA synthetase (TrpRS) enzyme, chuangxinmycin and indolmycin; yet drug resistance has already been reported for those agents. In this study, the newly released TrpRS enzyme structure is investigated for potential inhibitor drugs among already synthesized molecules, to help treat resistant cases and to propose an alternative drug for the quadruple drug therapy of tuberculosis. Maestro (Schrödinger) was used for docking and molecular dynamics simulations. An in-house library containing ~8000 compounds and, among FDA-approved indole-containing compounds, a total of 57 obtained from ChEMBL were used for docking into both the ATP and tryptophan binding pockets. The best of the 57 indole-containing compounds were subjected to hit expansion and later compared with virtual screening workflow (VSW) results. After docking, VSW was done using the Glide-XP docking algorithm. When compared, VSW alone performed better than the hit expansion module. The best-scoring compounds were kept for 10 ns molecular dynamics simulations with Desmond. Further, 100 ns molecular dynamics simulations were performed for the molecules selected according to Z-score. The top three MM-GBSA-scored compounds were subjected to steered molecular dynamics (SMD) simulations with Gromacs. 
While SMD simulations are still being conducted, ponesimod (for multiple sclerosis), vilanterol (β₂ adrenoreceptor agonist), and silodosin (for benign prostatic hyperplasia) were found to have a significant affinity for tuberculosis TrpRS, which is the propulsive force for the urge to expand the research with in vitro studies. Interestingly, top-scored ponesimod has been reported to have a side effect that makes the patient prone to upper respiratory tract infections.

Keywords: drug repurposing, molecular dynamics, tryptophanyl-tRNA synthetase, tuberculosis

Procedia PDF Downloads 105
13035 A Tool to Measure Efficiency and Trust Towards eXplainable Artificial Intelligence in Conflict Detection Tasks

Authors: Raphael Tuor, Denis Lalanne

Abstract:

The ATM research community lacks suitable tools to design, test, and validate new UI prototypes. High stakes underlie the implementation of both DSS and XAI methods in current systems. ML-based DSS are gaining in relevance as ATFM becomes increasingly complex. However, these systems only prove useful if a human can understand them, and thus new XAI methods are needed. The human-machine dyad should work as a team and should understand each other. We present xSky, a configurable benchmark tool that allows us to compare different versions of an ATC interface in conflict detection tasks. Our main contributions to the ATC research community are (1) a conflict detection task simulator (xSky) that allows testing the applicability of visual prototypes on scenarios of varying difficulty while outputting relevant operational metrics, and (2) a theoretical approach to the explanation of AI-driven trajectory predictions. xSky addresses several issues that were identified within available research tools. Researchers can configure the dimensions affecting scenario difficulty with a simple CSV file. Both the content and appearance of the XAI elements can be customized in a few steps. As a proof of concept, we implemented an XAI prototype inspired by the maritime field.
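CSV-driven scenario configuration of the kind the abstract describes can be handled with the standard library alone. The column names and values below are entirely hypothetical; xSky's actual schema is not given in the abstract:

```python
import csv
import io

# Hypothetical scenario-difficulty configuration: each row defines one
# scenario by traffic count, number of conflict pairs, and time pressure.
config_csv = """scenario,n_aircraft,conflict_pairs,time_pressure_s
easy,6,1,120
medium,12,3,90
hard,20,6,60
"""

# In practice this would be open("scenarios.csv"); StringIO keeps the
# sketch self-contained.
scenarios = list(csv.DictReader(io.StringIO(config_csv)))
hard = next(s for s in scenarios if s["scenario"] == "hard")
n_hard = int(hard["n_aircraft"])
```

Keeping difficulty dimensions in a flat CSV lets researchers vary scenarios without touching simulator code, which matches the configurability the tool advertises.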

Keywords: air traffic control, air traffic simulation, conflict detection, explainable artificial intelligence, explainability, human-automation collaboration, human factors, information visualization, interpretability, trajectory prediction

Procedia PDF Downloads 149
13034 Cloud Computing Impact on e-Government Adoption

Authors: Ali Elshabrawy

Abstract:

Cloud computing is expected to be important for e-Government in the near future. Governments need it to solve some of their e-Government, financial, infrastructure, legacy system and integration problems. It reduces information technology (IT) infrastructure needs and support costs, and offers on-demand infrastructure and computational power and improved collaboration capabilities, which are important for e-Government project start-up and sustainability. Budget pressures will continue to drive more and more government IT to hybrid and even public clouds, and more cooperation between cloud service providers and governmental agencies is expected, or the development of governmental private and community clouds. The motivation to convince governments to use cloud computing services will create pressure on cloud service providers to cope with governments' requirements for interoperability, security standards, open data and integration between their cloud systems. There will be significant legal action arising out of governmental uses of cloud computing, and legislation addressing both IT and business needs and consumer fears and protections. Cloud computing is considered a revolution for IT and e-business in general, and for e-commerce and e-Government in particular, as governments face increasing challenges regarding the IT infrastructure required for e-Government project implementation, owing to the lack of financial resources allocated for e-Government projects in developed and developing countries. Cloud computing can play a major role in solving some e-Government project challenges, such as lack of financial resources, IT infrastructure, human resources trained to manage e-Government applications, interoperability, and cost efficiency, provided that the security issues related to cloud computing usage, which are considered critical for e-Government projects, can be solved. 
It is likely just a matter of time before cloud service providers find solutions to attract governments as major customers for their business.

Keywords: cloud computing, e-government, adoption, supply side barriers, e-government requirements, challenges

Procedia PDF Downloads 337
13033 Schizophrenia in Childhood and Adolescence: Research Topics and Applied Methodology

Authors: Jhonas Geraldo Peixoto Flauzino, Pedro Pompeo Boechat Araujo, Alexia Allis Rocha Lima, Giovanna Biângulo Lacerda Chaves, Victor Ryan Ferrão Chaves

Abstract:

Schizophrenia is characterized as a set of psychiatric signs and symptoms (a syndrome) that commonly erupts in adolescence or early adulthood and is recognized as one of the most serious mental illnesses, as it causes significant problems throughout the patient's life, in mental health, physical health and social life alike. Objectives: This is an integrative literature review that aimed to verify what scientific knowledge has been produced in the field of child and adolescent psychiatry regarding schizophrenia in these stages of life, in relation to the most discussed themes and the methodologies chosen for the preparation of studies. Methods: Articles were selected from the following databases: the Virtual Health Library and the CAPES Journal Portal, published in the last five years, and from Google Scholar, published in 2021, totaling 62 works, searched in September 2021. Results: The studies focus mainly on diagnosis through the DSM-V (25.8%), on drug treatment (25.8%) and on psychotherapy (24.2%), most of them in the literature review format: integrative (27.4%) and systematic (24.2%). Conclusion: The themes and study methods are redundant and do not cover in depth the immense range of aspects that encompass schizophrenia in childhood and adolescence, addressing the disease in a general way or focusing on the adult patient.

Keywords: schizophrenia, mental health, childhood, adolescence

Procedia PDF Downloads 155
13032 Design, Development and Analysis of Combined Darrieus and Savonius Wind Turbine

Authors: Ashish Bhattarai, Bishnu Bhatta, Hem Raj Joshi, Nabin Neupane, Pankaj Yadav

Abstract:

This report concerns the design, development, and analysis of a combined Darrieus and Savonius wind turbine. Vertical-axis wind turbines (VAWTs) are of two types, viz. Darrieus (lift type) and Savonius (drag type). The problem associated with the Darrieus is its lack of self-starting, while the Savonius has low efficiency. There are 3 straight Darrieus blades with a NACA (National Advisory Committee for Aeronautics) 0018 cross-section placed circumferentially, and a helically twisted Savonius blade to obtain an even torque distribution. This unique design allows the Savonius to be used as a method of self-starting the wind turbine, which the Darrieus cannot achieve on its own. All the parts of the wind turbine were designed in CAD software, and simulation data were obtained via a CFD (Computational Fluid Dynamics) approach. Also, the design was imported into a FlashForge Finder to 3D print the wind turbine profile, and finally, testing was carried out. The plastic material used for the Savonius was ABS (Acrylonitrile Butadiene Styrene) and that for the Darrieus was PLA (Polylactic Acid). From the data obtained experimentally, the hybrid VAWT so fabricated has been found to operate at a low cut-in speed of 3 m/s, and the maximum power output was found to be 7.5537 watts at a wind speed of 6 m/s. The maximum rotor speed recorded was 431 rpm (rotations per minute) at a wind velocity of 6 m/s, signifying its potential for wind power production. Besides, the experimental and simulated data, when analyzed through graph plots, showed a similar nature slope-wise, and the difference between the experimental and theoretical data reflects mechanical losses. The objective is to eliminate the need for external motors for self-starting purposes and to study the performance of the model. Testing of the model was carried out at different wind velocities.
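The reported output can be put in context with the standard wind-power relation: the power coefficient is Cp = P / (0.5 ρ A v³). The abstract gives P and v but not the rotor dimensions, so the swept area below is an assumed value for illustration only:

```python
# Back-of-the-envelope power coefficient from the measured values
# in the abstract (7.5537 W at 6 m/s wind speed).
rho = 1.225          # air density, kg/m^3
v = 6.0              # wind speed, m/s
P_measured = 7.5537  # measured power output, W

A = 0.35             # ASSUMED swept area of the hybrid rotor, m^2
P_available = 0.5 * rho * A * v**3   # kinetic power in the wind, W
Cp = P_measured / P_available
print(f"available power: {P_available:.1f} W, Cp = {Cp:.3f}")
```

Any Cp computed this way should fall well below the Betz limit of 16/27 ≈ 0.593; a value above it would indicate a measurement or area error.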

Keywords: VAWT, Darrieus, Savonius, helical blades, CFD, flash forge finder, ABS, PLA

Procedia PDF Downloads 190
13031 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System

Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek

Abstract:

This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a ‘GIS-dominant’ approach. The GIS operating tools will be operational to operate the SDW. MCDM methods can provide many solutions to a set of problems with various and multiple criteria. When the problem is complex, integrating the spatial dimension, it makes sense to combine the MCDM process with other approaches like data mining and ascending analyses. We present in this paper an experiment showing a geo-decisional methodology of SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to highlight inductions and information not obvious to traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information. However, their effectiveness for complex spatial analysis is questionable due to their determinism and their decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.
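One of the simplest MCDM methods that could sit at the core of such a GIS workflow is a weighted sum over min-max-normalised criteria. The candidate sites, criteria and weights below are invented for illustration and are not from the paper:

```python
import numpy as np

# Rows: hypothetical candidate sites; columns: criteria
# (cost, accessibility, flood risk).
scores = np.array([
    [120.0, 0.8, 0.2],
    [ 90.0, 0.5, 0.4],
    [150.0, 0.9, 0.1],
])
weights = np.array([0.5, 0.3, 0.2])       # decision-maker preferences
benefit = np.array([False, True, False])  # cost and risk are minimised

# Min-max normalise each criterion to [0, 1], then flip the
# cost-type criteria so that higher is always better.
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = (scores - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

ranking = norm @ weights   # weighted-sum aggregate score per site
best = int(np.argmax(ranking))
```

In a GIS-dominant design, each row of `scores` would be derived per spatial unit (cell or parcel) from SDW/OLAP queries, and `ranking` would be written back as a suitability layer.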

Keywords: data warehouse, GIS, MCDM, SOLAP

Procedia PDF Downloads 161
13030 Effect of Heat Treatment on Nutrients, Bioactive Contents and Biological Activities of Red Beet (Beta Vulgaris L.)

Authors: Amessis-Ouchemoukh Nadia, Salhi Rim, Ouchemoukh Salim, Ayad Rabha, Sadou Dyhia, Guenaoui Nawel, Hamouche Sara, Madani Khodir

Abstract:

The cooking method is a key factor influencing the quality of vegetables. In this study, the effect of the most common cooking methods on the nutritional composition, phenolic content, pigment content and antioxidant activities (evaluated by the DPPH, ABTS, CUPRAC, FRAP, reducing power and phosphomolybdenum methods) of fresh, steamed, and boiled red beet was investigated. The fresh samples showed the highest nutritional and bioactive composition compared to the cooked ones. The boiling method led to a significant reduction (p < 0.05) in the content of phenolics, flavonoids and flavanols and in the DPPH, ABTS, FRAP, CUPRAC, phosphomolybdenum and reducing power capacities. This effect was less pronounced when steam cooking was used, and the losses of bioactive compounds were lower. As a result, steam cooking resulted in greater retention of bioactive compounds and antioxidant activity compared to boiling. Overall, this study suggests that steam cooking is the better method in terms of retention of pigments, bioactive compounds and antioxidant activity of beetroot.

Keywords: beta vulgaris, cooking methods, bioactive compounds, antioxidant activities

Procedia PDF Downloads 40
13029 Intra-miR-ExploreR, a Novel Bioinformatics Platform for Integrated Discovery of MiRNA:mRNA Gene Regulatory Networks

Authors: Surajit Bhattacharya, Daniel Veltri, Atit A. Patel, Daniel N. Cox

Abstract:

miRNAs have emerged as key post-transcriptional regulators of gene expression; however, the identification of biologically relevant target genes for this epigenetic regulatory mechanism remains a significant challenge. To address this knowledge gap, we have developed a novel tool in R, Intra-miR-ExploreR, that facilitates integrated discovery of miRNA targets by incorporating target databases and novel target prediction algorithms, using statistical methods including Pearson and distance correlation on microarray data, to arrive at high-confidence intragenic miRNA target predictions. We have explored the efficacy of this tool using Drosophila melanogaster as a model organism for bioinformatics analyses and functional validation. A number of putative targets were obtained and validated using qRT-PCR analysis. Additional features of the tool include downloadable text files containing GO analysis from DAVID and PubMed links to literature related to gene sets. Moreover, we are constructing interaction maps of intragenic miRNAs, using both microarray and RNA-seq data, focusing on neural tissues to uncover the regulatory codes via which these molecules regulate gene expression to direct cellular development.
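The two association measures the abstract names can be illustrated on a toy miRNA:mRNA expression pair: Pearson correlation captures linear (here, repressive) relationships, while Székely's distance correlation also detects non-linear dependence. The expression vectors below are synthetic, not real microarray data, and the tool itself is in R; Python is used here only for the sketch:

```python
import numpy as np

def distance_correlation(x, y):
    """Székely distance correlation of two 1-D samples (V-statistic form)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])   # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    # Double-centre each distance matrix.
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(1)
mirna = rng.normal(size=50)
# A repressive miRNA:mRNA pair: target expression drops as miRNA rises.
mrna = -0.8 * mirna + 0.2 * rng.normal(size=50)

pearson = np.corrcoef(mirna, mrna)[0, 1]
dcor = distance_correlation(mirna, mrna)
```

A strongly negative Pearson value and a distance correlation near 1 together flag the pair as a high-confidence repressive candidate; distance correlation is always in [0, 1] and is zero only under independence (in the population).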

Keywords: miRNA, miRNA:mRNA target prediction, statistical methods, miRNA:mRNA interaction network

Procedia PDF Downloads 490
13028 Mechanism of Veneer Colouring for Production of Multilaminar Veneer from Plantation-Grown Eucalyptus Globulus

Authors: Ngoc Nguyen

Abstract:

Large plantations of Eucalyptus globulus have been established and grown to produce pulpwood. This resource is not suitable for the production of decorative products, principally due to low grades of wood and a “dull” appearance, but many trials have already been undertaken for the production of veneer and veneer-based engineered wood products, such as plywood and laminated veneer lumber (LVL). The manufacture of veneer-based products has recently been identified as an unprecedented opportunity to promote higher-value utilisation of plantation resources. However, many uncertainties remain regarding the impacts of the inferior wood quality of young plantation trees on product recovery and value, and with respect to optimal processing techniques. Moreover, the quality of veneer and veneer-based products is far from optimal, as the trees are young and have small diameters, and the veneers show significant colour variation, which affects the added value of the final products. Developing production methods that would enhance the appearance of low-quality veneer would provide great potential for the production of high-value wood products such as furniture, joinery, flooring and other appearance products. One of the methods of enhancing the appearance of low-quality veneer, developed in Italy, involves the production of multilaminar veneer, also named “reconstructed veneer”. An important stage of multilaminar production is colouring the veneer, which can be achieved by dyeing it with dyes of different colours depending on the type of appearance product, its design and market demand. Although veneer dyeing technology has been well advanced in Italy, it has been focused on poplar veneer from plantations, whose wood is characterized by low density, even colour, a small number of defects and high permeability. Conversely, the majority of plantation eucalypts have medium to high density, many defects, uneven colour and low permeability. 
Therefore, a detailed study is required to develop dyeing methods suitable for colouring eucalypt veneers. A brown reactive dye is used for the veneer colouring process. Veneers from sapwood and heartwood at two moisture content levels are used to conduct the colouring experiments: green veneer and veneer dried to 12% MC. Prior to dyeing, all samples are treated. Both soaking (dipping) and vacuum-pressure methods are used in the study to compare the results and select the most efficient method for veneer dyeing. To date, the results of colour measurements by the CIELAB colour system have shown significant differences in the colour of the undyed veneers produced from the heartwood part. According to the colour measurements, the colour became moderately darker with increasing sodium chloride concentration, compared to the control samples. It is difficult to identify the most suitable dye solution at this stage, as variables such as dye concentration, dyeing temperature and dyeing time have not yet been investigated. The dye will be used with and without a UV absorbent after all trials are completed using the optimal parameters for colouring veneers.
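The CIELAB measurements mentioned above are usually summarised by the Euclidean colour difference ΔE*ab (CIE76). The L*, a*, b* triples below are invented example readings, not the study's measurements:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

undyed = (68.0, 9.5, 21.0)   # hypothetical undyed veneer reading
dyed   = (52.0, 14.0, 25.0)  # hypothetical dyed (darker) reading

dE = delta_e_ab(undyed, dyed)
# A lower L* for the dyed sample corresponds to the darkening the
# abstract reports; a* and b* shifts capture the hue change.
```

As a rough rule of thumb, ΔE*ab above about 3 is readily visible, so comparing ΔE across dye concentrations gives a single number for how strongly each treatment changed the veneer colour.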

Keywords: Eucalyptus globulus, veneer colouring/dyeing, multilaminar veneer, reactive dye

Procedia PDF Downloads 339
13027 Interpreting Ecclesiastical Heritage: Meaning Making and Contentious Conversations

Authors: Alexis Thouki

Abstract:

In our post-Christian societies, ecclesiastical heritage has acquired a new, extrovert profile aiming to reach out to an increasingly diverse audience. In this context, the various motivations, interests, personalities and cultural exchanges found in the ‘post-modern pilgrimage’ bequeath a hybrid and multidimensional character to religious tourism education. Consequently, churches have acquired the challenging role of enriching visitors’ cultural and spiritual capital. Despite this promising diversification to relate, reveal and provoke constructive discourses, practitioners, faced with various ‘conflicting interests’, attempt to tame the religious environment, rich in symbolism and meanings, through ‘neutral interpretations’. This paper aims to present the results of an ongoing developing strategy related to the presentation of contentious meanings in English churches. The paper explores some of the underlying issues related to the capacity of ‘neutrality’ to spark, downplay or eliminate contentious conversations relating to the cultural, religious, and social dimensions of Christian cultural heritage thematology. In an effort to understand this issue, the paper examines the concept of neutrality and what it stands for, executing a discourse analysis in the semantic context in which the theological lexicon is interwoven with the cultural and social meanings of sacred sites. Following that, the paper examines whether the preferred interpretive strategies meet the post-modern interpretative framework, which is marked by polysemy and critical active engagement. The ultimate aim of the paper is to investigate the hypothesis that the preferred neutral strategies, managing the ‘conflicting’ demands of worshippers and visitors, result in the uneven treatment of both the religious and the historical spirit of the place.

Keywords: contentious dialogue, interpretation, meaning making, religious tourism

Procedia PDF Downloads 147
13026 Efficiency of Investments, Financed from EU Funds in Small and Medium Enterprises in Poland

Authors: Jolanta Brodowska-Szewczuk

Abstract:

The article presents the results and conclusions of empirical research on the impact of investments made in small and medium-sized enterprises, financed from EU funds, on the competitiveness of these companies. The research covers financial results (sales revenue, net income and expenses) as well as other indicators: new products and services on offer, higher quality of products and services, more modern production methods, innovation in management processes, and increases in the number of customers, in market share and in the profitability of production and service provision. The main conclusions are that companies that made investments under this measure applied more modern production methods and, as a consequence, increased the quality of their products and services. Furthermore, both small and medium-sized enterprises introduced new products and services. The investments carried out also enabled better work organization in the enterprises. Entrepreneurs could guarantee a higher quality of service, which resulted in better relationships with their customers and a noticeable rise in the number of clients. More than half of the companies indicated that the investments contributed to an increase in market share; the same held for market reach and brand recognition. An interesting finding is that investments in small enterprises were more effective than those in medium-sized enterprises.

Keywords: competitiveness, efficiency, EU funds, small and medium-sized enterprises

Procedia PDF Downloads 373
13025 Temporal and Spacial Adaptation Strategies in Aerodynamic Simulation of Bluff Bodies Using Vortex Particle Methods

Authors: Dario Milani, Guido Morgenthal

Abstract:

Fluid dynamic computation of wind-caused forces on bluff bodies, e.g., light, flexible civil structures or airplane wings at high incidence approaching the ground, is one of the major criteria governing their design. For such structures a significant dynamic response may result, requiring the usage of small-scale devices such as guide-vanes in bridge design to control these effects. The focus of this paper is on the numerical simulation of the bluff body problem involving multiscale phenomena induced by small-scale devices. One of the solution methods for the CFD simulation that is relatively successful in this class of applications is the Vortex Particle Method (VPM). The method is based on a grid-free Lagrangian formulation of the Navier-Stokes equations, where the velocity field is modeled by particles representing local vorticity. These vortices are convected by the free-stream velocity as well as diffused. This representation yields the main advantages of low numerical diffusion, compact discretization as the vorticity is strongly localized, implicit handling of the free-space boundary conditions typical for this class of FSI problems, and a natural representation of the vortex creation process inherent in bluff body flows. When the particle resolution reaches the Kolmogorov dissipation length, the method becomes a Direct Numerical Simulation (DNS). However, it is crucial to note that any solution method aims at balancing the computational cost against the achievable accuracy. In the classical VPM, if the fluid domain is discretized by Np particles, the computational cost is O(Np²). For the coupled FSI problem of interest, for example large structures such as long-span bridges, the aerodynamic behavior may be influenced or even dominated by small structural details such as barriers, handrails or fairings.
For such geometrically complex and dimensionally large structures, resolving the complete domain with the conventional VPM particle discretization might become prohibitively expensive even for moderate numbers of particles. It is possible to reduce this cost either by reducing the number of particles or by controlling their local distribution. It is also possible to increase the accuracy of the solution without substantially increasing the global computational cost by computing a correction of the particle-particle interaction in some regions of interest. In this paper different strategies are presented to extend the conventional VPM so as to reduce the computational cost whilst resolving the required details of the flow. The methods include temporal sub-stepping, to increase the accuracy of the particle convection in certain regions, as well as dynamically re-discretizing the particle map to control the global and local numbers of particles. Finally, these methods are applied to a test case, and the improvements in both the efficiency and the accuracy of the proposed extensions are presented, along with their relevant applications.
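The O(Np²) cost mentioned in the abstract comes from the all-pairs particle-particle interaction. A minimal 2D sketch of this Biot-Savart evaluation, with a Krasny-style smoothing core, is given below; the function names and parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def induced_velocities(pos, gamma, delta=0.05):
    """Naive O(N^2) Biot-Savart evaluation for 2D vortex particles.

    pos   : (N, 2) particle positions
    gamma : (N,)   particle circulations
    delta : smoothing-core radius regularizing close interactions (assumed)
    """
    n = pos.shape[0]
    vel = np.zeros_like(pos)
    for i in range(n):
        r = pos[i] - pos                      # (N, 2) separation vectors
        r2 = (r ** 2).sum(axis=1) + delta ** 2
        r2[i] = 1.0                           # avoid division at self-term
        # 2D Biot-Savart kernel: u = Gamma * (-ry, rx) / (2 pi |r|^2)
        k = gamma / (2.0 * np.pi * r2)
        k[i] = 0.0                            # a particle does not advect itself
        vel[i, 0] = -(k * r[:, 1]).sum()
        vel[i, 1] = (k * r[:, 0]).sum()
    return vel
```

Sub-stepping and re-discretization, as proposed in the paper, would wrap such an evaluation, integrating selected particles with smaller time steps and periodically remapping the particle set.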

Keywords: adaptation, fluid dynamic, remeshing, substepping, vortex particle method

Procedia PDF Downloads 251
13024 The Effectiveness of Intervention Methods for Repetitive Behaviors in Preschool Children with Autism Spectrum Disorder: A Systematic Review

Authors: Akane Uda, Ami Tabata, Mi An, Misa Komaki, Ryotaro Ito, Mayumi Inoue, Takehiro Sasai, Yusuke Kusano, Toshihiro Kato

Abstract:

Early intervention is recommended for children with autism spectrum disorder (ASD), and an increasing number of children have received support and intervention before school age in recent years. In this study, we systematically reviewed preschool interventions focused on the repetitive behaviors of children with ASD, which are often observed at younger ages. Inclusion criteria were as follows: (1) child of preschool age (≤ 7 years) with a diagnosis of ASD (including autism, Asperger's, and pervasive developmental disorder), or a parent (caregiver) of a preschool child with ASD; (2) physician-confirmed diagnosis of ASD (autism, Asperger's, and pervasive developmental disorder); (3) interventional studies for repetitive behaviors; (4) original articles published within the past 10 years (2012 or later); (5) written in English or Japanese. Exclusion criteria were as follows: (1) systematic reviews or meta-analyses; (2) conference reports or books. We carefully scrutinized databases to remove duplicate references and used a two-step screening process to select papers. The primary screening included close scrutiny of titles and abstracts to exclude articles that did not meet the eligibility criteria. During the secondary screening, we carefully read the complete text to assess eligibility, which was double-checked by six members of the laboratory. Disagreements were resolved through consensus-based discussion. Our search yielded 304 papers, of which nine were included in the study. The levels of evidence were as follows: three randomized controlled trials (level 2), four pre-post studies (level 4b), and two case reports (level 5). Seven of the articles selected for this study described the effectiveness of interventions. Interventions for repetitive behaviors in preschool children with ASD were categorized into five interventions that directly involved the child and four educational programs for caregivers and parents.
Studies that directly intervened with children used early intensive intervention based on applied behavior analysis (the Early Start Denver Model, Early Intensive Behavioral Intervention, and the Picture Exchange Communication System) and individualized education based on sensory integration. Educational interventions for caregivers included two methods: (a) education regarding combined methods and practices of applied behavior analysis, in addition to classification and coping methods for repetitive behaviors, and (b) education regarding evaluation methods and practices based on children’s developmental milestones in play. With regard to the neurophysiological basis of repetitive behaviors, environmental factors are implicated as possible contributors. We assume that applied behavior analysis was shown to be effective in reducing repetitive behaviors because the analysis focuses on the interaction between the individual and the environment. Additionally, the educational interventions for caregivers were shown to promote behavioral change in children: the caregivers' understanding of the classification of repetitive behaviors and of the children’s developmental milestones in play, together with adjustment of the person-environment context, led to a reduction in repetitive behaviors.

Keywords: autism spectrum disorder, early intervention, repetitive behaviors, systematic review

Procedia PDF Downloads 124
13023 Perceptions and Experiences of Students and Their Instructors on Online versus Face-To-Face Classrooms

Authors: Rahime Filiz Kiremit

Abstract:

This study investigates the differences between online and face-to-face classes from the perspectives of students and instructors. The research project contains information pertaining to two courses, supported with testimony from students and instructors alike. A total of 37 participants from San Jacinto College were involved in the study: 35 students and the two instructors of the respective courses. The online instructor has four years of teaching experience, while the face-to-face instructor has accrued 11 years of instructional experience. Both instructors were interviewed, and the samples were collected from three different classes: TECA 1311-702 (Educating Young Children, 13-week distance learning), TECA 1311-705 (Educating Young Children, 13-week distance learning) and TECA 1354 (Child Growth and Development). Across all three classes, 13 of the 29 students enrolled in the online courses elected to participate in the survey, while 22 of the 28 students enrolled in the face-to-face course did the same. With regard to the students’ prior class enrollment, 25 students had taken online classes previously, 9 students had taken early-childhood courses, 4 students had taken general classes, 11 students had taken both types of classes, 10 students had not yet taken online classes, and only 1 had taken a hybrid course. 10 of the participants professed that they like face-to-face classes because they can interact with their classmates and teachers. They find that online classes involve more work, because they need to read the chapters and instructions on their own time. They said that during face-to-face instruction, they could take notes and discuss concerns with professors and fellow peers. They can have hands-on activities during face-to-face classes and, as a result, improve their ability to retain what they have learned within that particular time.
Some of the students even mentioned that they have to discipline themselves, because the online classes require more work. According to the remaining six students, online classes are easier than face-to-face classes. Most of them believe that the easiness of a course depends on the type of class, the instructor, and the subject taught. Considering all 35 students, almost 63% agreed that they interact more with their classmates in face-to-face classes.

Keywords: distance education, face-to-face education, online classroom, students' perceptions

Procedia PDF Downloads 269
13022 Efficacy of Hemi-Facetectomy in Treatment of Lumbar Foraminal Stenosis

Authors: Manoj Deepak, N. Mathivanan, K. Venkatachalam

Abstract:

Nerve root stenosis is one of the main causes of back pain. There are many methods, both conservative and surgical, to treat this disease. It is pertinent to decompress the spine to a proper extent so as to avoid the recurrence of symptoms, but too aggressive an approach also has its disadvantages. We present one method to effectively decompress the nerve with better results. Our study was carried out in 52 patients with foraminal stenosis between 2008 and 2011. We carried out the surgical procedure of shaving off the medial part of the facet joint so as to decompress the root. We selected those patients who had symptoms of claudication for more than 2 years. They had no signs of instability, and they underwent conservative treatment for a period of 2 months before the procedure. Oswestry scoring was used to record the functional level of the patients before and after the procedure. All patients were followed up for a minimum period of 2.5 years. After evaluation for a minimum of 2.5 years, 34 patients had no evidence of recurrence of symptoms, with improvement in their functional level. Seven patients complained of minimal pain, but their functional quality had improved postoperatively. Six patients had symptoms of lumbar canal disease, which reduced with conservative treatment. Five patients required spinal fusion surgeries in the later period. Conclusion: we can effectively conclude that our procedure is safe and effective in reducing symptoms in patients with neurogenic claudication.

Keywords: facetectomy, stenosis, decompression, lumbar foraminal stenosis, hemi-facetectomy

Procedia PDF Downloads 338
13021 Evaluation of Heterogeneity of Paint Coating on Metal Substrate Using Laser Infrared Thermography and Eddy Current

Authors: S. Mezghani, E. Perrin, J. L. Bodnar, J. Marthe, B. Cauwe, V. Vrabie

Abstract:

Non-contact evaluation of the thickness of paint coatings can be attempted by different destructive and nondestructive methods such as cross-section microscopy, gravimetric mass measurement, magnetic gauges, Eddy current, ultrasound or terahertz imaging. Infrared thermography is a nondestructive and non-invasive method that can be envisaged as a useful tool to measure surface thickness variations by analyzing the temperature response. In this paper, the thermal quadrupole method for two-layered samples heated by a pulsed excitation is first used. By analyzing the thermal responses as a function of the thermal properties and thicknesses of both layers, optimal parameters for the excitation source can be identified. Simulations show that a pulsed excitation with a duration of ten milliseconds allows a substrate-independent thermal response to be obtained. Based on this result, an experimental setup consisting of a near-infrared laser diode and an infrared camera was then used to evaluate the variation of paint coating thickness between 60 µm and 130 µm on two samples. Results show that the parameters extracted from the thermal images are correlated with the thicknesses estimated by the Eddy current method. Laser pulsed thermography is thus an interesting alternative nondestructive method that can moreover be used for nonconductive substrates.
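For intuition only: under the simplifying assumptions of a homogeneous semi-infinite medium and an ideal Dirac heat pulse, the two-layer quadrupole model used in the abstract reduces to the classical front-face response ΔT(t) = Q/(e√(πt)); the sketch below encodes just that limiting case, with illustrative names and values:

```python
import math

def front_face_temperature(t, q_abs, effusivity):
    """Surface temperature rise after a Dirac heat pulse on a
    semi-infinite homogeneous medium (1D conduction):

        dT(t) = Q / (e * sqrt(pi * t))

    t          : time since the pulse, s
    q_abs      : absorbed energy density Q, J/m^2 (illustrative)
    effusivity : e = sqrt(k * rho * c), thermal effusivity of the layer
    """
    return q_abs / (effusivity * math.sqrt(math.pi * t))
```

In the full quadrupole treatment, deviations of the measured decay from this t^(-1/2) law carry the coating-thickness information that the paper extracts.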

Keywords: nondestructive, paint coating, thickness, infrared thermography, laser, heterogeneity

Procedia PDF Downloads 628
13020 Rotorcraft Performance and Environmental Impact Evaluation by Multidisciplinary Modelling

Authors: Pierre-Marie Basset, Gabriel Reboul, Binh DangVu, Sébastien Mercier

Abstract:

Rotorcraft provide invaluable services thanks to their Vertical Take-Off and Landing (VTOL), hover and low-speed capabilities. Yet their use is still often limited by their cost and environmental impact, especially noise and energy consumption. One of the main obstacles to the expansion of the use of rotorcraft for urban missions is the environmental impact, and the first concern for the population is noise. In order to develop the transversal competency to assess the rotorcraft environmental footprint, a collaboration has been launched between six research departments within ONERA. The progress in terms of models and methods is capitalized into the numerical workshop C.R.E.A.T.I.O.N. “Concepts of Rotorcraft Enhanced Assessment Through Integrated Optimization Network”. A typical mission for which the environmental impact issue is of great relevance has been defined. The first milestone was to perform the pre-sizing of a reference helicopter for this mission. In a second milestone, an alternative rotorcraft concept was defined: a tandem rotorcraft with optional propulsion. The key design trends are given for the pre-sizing of this rotorcraft, aiming at a significant reduction of the global environmental impact while still providing flight performance and safety equivalent to the reference helicopter. The models and methods have been improved to capture sooner and more globally the relative variations in environmental impact when changing the rotorcraft architecture, the pre-design variables and the operation parameters.

Keywords: environmental impact, flight performance, helicopter, multi objectives multidisciplinary optimization, rotorcraft

Procedia PDF Downloads 256
13019 Physico-Mechanical Properties of Wood-Plastic Composites Produced from Polyethylene Terephthalate Plastic Bottle Wastes and Sawdust of Three Tropical Hardwood Species

Authors: Amos Olajide Oluyege, Akpanobong Akpan Ekong, Emmanuel Uchechukwu Opara, Sunday Adeniyi Adedutan, Joseph Adeola Fuwape, Olawale John Olukunle

Abstract:

This study was carried out to evaluate the influence of wood species and wood/plastic ratio on the physical and mechanical properties of wood plastic composites (WPCs) produced from polyethylene terephthalate (PET) plastic bottle wastes and sawdust from three hardwood species, namely, Terminalia superba, Gmelina arborea, and Ceiba pentandra. The experimental WPCs were prepared from sawdust particle size classes of ≤ 0.5, 0.5 – 1.0, and 1.0 – 2.0 mm at wood/plastic ratios of 40:60, 50:50 and 60:40 (percentage by weight). The WPCs for each study variable combination were prepared in 3 replicates and laid out in a randomized complete block design (RCBD). The physical properties investigated were water absorption (WA), linear expansion (LE) and thickness swelling (TS), while the mechanical properties evaluated were Modulus of Elasticity (MOE) and Modulus of Rupture (MOR). The mean values for WA, LE and TS ranged from 1.07 to 34.04, 0.11 to 1.76 and 0.11 to 4.05 %, respectively. The mean values of the three physical properties increased with increasing wood/plastic ratio: the ratio of 40:60 at each particle size class generally resulted in the lowest values, while the ratio of 60:40 gave the highest values for each of the three species. For each of the physical properties, T. superba had the least mean values, followed by G. arborea, while the highest values were observed in C. pentandra. The mean values for MOE and MOR ranged from 458.17 to 1875.67 and 2.64 to 18.39 N/mm2, respectively. The mean values of the two mechanical properties decreased with increasing wood/plastic ratio: the ratio of 40:60 at each wood particle size class generally gave the highest values, while the ratio of 60:40 gave the least values for each of the three species. For each of the mechanical properties, C. pentandra had the highest mean values, followed by G. arborea, while the least values were observed in T. superba.
There were improvements in both the physical and mechanical properties with decreasing sawdust particle size class, with the particle size class of ≤ 0.5 mm giving the best results. The results of the analysis of variance revealed significant (P < 0.05) effects of the three study variables – wood species, sawdust particle size class and wood/plastic ratio – on all the physical and mechanical properties of the WPCs. It can be concluded from the results of this study that wood plastic composites with acceptable physical and mechanical properties are better produced from sawdust of particle size ≤ 0.5 mm and PET plastic bottle wastes at a 40:60 wood/plastic ratio, and that at this ratio, all three species are suitable for the production of wood plastic composites.

Keywords: polyethylene terephthalate plastic bottle wastes, wood plastic composite, physical properties, mechanical properties

Procedia PDF Downloads 187
13018 Enhanced Image Representation for Deep Belief Network Classification of Hyperspectral Images

Authors: Khitem Amiri, Mohamed Farah

Abstract:

Image classification is a challenging task and is gaining much interest since it helps us to understand the content of images. Recently, Deep Learning (DL) based methods have given very interesting results on several benchmarks. For hyperspectral images (HSI), the application of DL techniques is still challenging due to the scarcity of labeled data and to the curse of dimensionality. Among other approaches, Deep Belief Network (DBN) based approaches have given fair classification accuracy. In this paper, we address the problem of the curse of dimensionality by reducing the number of bands and replacing the HSI channels by channels representing radiometric indices. Therefore, instead of using all the HSI bands, we compute radiometric indices such as NDVI (Normalized Difference Vegetation Index), NDWI (Normalized Difference Water Index), etc., and use the combination of these indices as input for the Deep Belief Network (DBN) based classification model. Thus, we keep almost all the pertinent spectral information while considerably reducing the size of the image. In order to test our image representation, we applied our method on several HSI datasets, including the Indian Pines and Jasper Ridge datasets, and it gave results comparable to state-of-the-art methods while considerably reducing training and testing time.
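A minimal sketch of the band-reduction step described above, assuming hypothetical band indices for the red, near-infrared and green channels (the real indices depend on the sensor's wavelength calibration, and the paper combines more indices than these two):

```python
import numpy as np

# Hypothetical band indices for an HSI cube of shape (rows, cols, bands);
# the actual indices depend on the sensor's band-to-wavelength mapping.
RED, NIR, GREEN = 29, 51, 19

def radiometric_indices(cube):
    """Replace hundreds of HSI bands by a few radiometric-index channels."""
    red = cube[..., RED].astype(float)
    nir = cube[..., NIR].astype(float)
    green = cube[..., GREEN].astype(float)
    eps = 1e-12  # guard against zero denominators
    ndvi = (nir - red) / (nir + red + eps)      # vegetation index
    ndwi = (green - nir) / (green + nir + eps)  # water index
    return np.stack([ndvi, ndwi], axis=-1)      # (rows, cols, 2) DBN input
```

Each pixel's feature vector thus shrinks from the full band count to the number of indices while retaining the spectral contrasts the indices encode.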

Keywords: hyperspectral images, deep belief network, radiometric indices, image classification

Procedia PDF Downloads 262
13017 A Prediction of Electrical Cost for High-Rise Building Construction

Authors: Picha Sriprachan

Abstract:

The increase in electricity prices affects the cost of high-rise building construction. The objectives of this research are to study the electrical cost and its trend, and to forecast the electrical cost of high-rise building construction. The methods of this research are: 1) to study electrical payment formats, cost data collection methods, and the factors affecting the electrical cost of high-rise building construction, 2) to study the quantity and trend of the cumulative percentage of the electrical cost, and 3) to forecast the electrical cost for different types of high-rise buildings. The results show that the average proportion between the electrical cost and the value of the construction project is 0.87 percent. The proportions of electrical cost for residential, office and commercial, and hotel buildings are closely similar. As the construction project value increases, the ratio of electrical cost to project value decreases; nevertheless, the amount of electrical cost remains related to the project value. During the structural construction phase, the electrical cost increases, reaching its maximum during the combined structural and architectural construction phase. The cumulative percentage of the electrical cost moves in the same direction as the cumulative percentage of the high-rise building construction cost. The amount of service space in the building, the number of floors and the duration of the construction affect the electrical cost of construction. The electrical cost forecast by a linear regression equation is close to the cost forecast using the proportion of electrical cost to project value.
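A minimal sketch of the linear-regression forecast described above; the function and variable names are illustrative, and the study's actual predictors also include service space, number of floors and construction duration:

```python
import numpy as np

def fit_electrical_cost(project_value, electrical_cost):
    """Least-squares fit of: electrical cost = slope * project value + intercept.

    project_value   : (n,) construction project values (illustrative predictor)
    electrical_cost : (n,) observed electrical costs
    """
    a_mat = np.vstack([project_value, np.ones_like(project_value)]).T
    (slope, intercept), *_ = np.linalg.lstsq(a_mat, electrical_cost, rcond=None)
    return slope, intercept
```

The fitted line can then be compared against the simple proportional estimate (0.87 percent of project value) reported in the abstract.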

Keywords: high-rise building construction, electrical cost, construction phase, architectural phase

Procedia PDF Downloads 374
13016 Development of Hybrid Materials Combining Biomass as Fique Fibers with Metal-Organic Frameworks, and Their Potential as Mercury Adsorbents

Authors: Karen G. Bastidas Gomez, Hugo R. Zea Ramirez, Manuel F. Ribeiro Pereira, Cesar A. Sierra Avila, Juan A. Clavijo Morales

Abstract:

The contamination of water sources with heavy metals such as mercury is a serious environmental problem, generating a high impact on the environment and human health. In countries such as Colombia, mercury contamination due to mining has reached levels much higher than the world average. This work proposes the use of fique fibers as an adsorbent for mercury removal. The material was evaluated under five different conditions (raw, pretreated by organosolv, functionalized by TEMPO oxidation, functionalized fiber plus MOF-199, and functionalized fiber plus MOF-199-SH). All the materials were characterized using FTIR, SEM, EDX, XRD, and TGA. Mercury removal was carried out at room pressure and temperature and pH 7 for all material presentations, followed by atomic absorption spectroscopy. The high cellulose content of fique is the main particularity of this lignocellulosic biomass, since the degree of oxidation depends on the number of surface hydroxyl groups capable of being oxidized into carboxylic acids, a functional group that increases ion exchange with mercury in solution. It was expected that impregnation with the MOF would increase mercury removal; however, the functionalized fique achieved the greatest removal, 81.33%, against 44% for the fique with MOF-199 and 72% for the fique with MOF-199-SH. The pretreated and raw fibers showed 74% and 56% removal, respectively, which indicates that fique does not require considerable modification of its structure to achieve good performance. Even so, the functionalized fiber increases the removal percentage considerably compared to the pretreated fique, which suggests that functionalization is a feasible procedure for improving the removal percentage.
In addition, this procedure follows a green approach, since the reagents involved have low environmental impact, and the contribution to the remediation of natural resources is high.

Keywords: biomass, nanotechnology, materials science, wastewater treatment

Procedia PDF Downloads 107
13015 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts into an overall, presumably better forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks: MOS, for instance, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; to do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles, and compare them with several existing models in the literature, to forecast storm surge level. We then investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; we therefore develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
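A minimal sketch of one of the simple weighting schemes described above, using training-period correlations with observations as non-negative, normalized weights; the function name and the clipping of anti-correlated members are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def correlation_weighted_ensemble(forecasts, observed):
    """Combine member forecasts using training-period correlations as weights.

    forecasts : (n_models, n_times) member surge forecasts over a training window
    observed  : (n_times,) observed surge levels over the same window
    Returns normalized weights; the ensemble forecast is then weights @ forecasts.
    """
    corrs = np.array([np.corrcoef(f, observed)[0, 1] for f in forecasts])
    corrs = np.clip(corrs, 0.0, None)  # assumption: ignore anti-correlated members
    weights = corrs / corrs.sum()
    return weights
```

A standard-deviation-based scheme would be analogous, e.g. weighting each member inversely to the spread of its training-period errors; the simple-average benchmark corresponds to equal weights.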

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 300
13014 A Domain Specific Modeling Language Semantic Model for Artefact Orientation

Authors: Bunakiye R. Japheth, Ogude U. Cyril

Abstract:

Since the process of transforming user requirements into modeling constructs is not well supported by domain-specific frameworks, it became necessary to integrate domain requirements with specific architectures to achieve an integrated, customizable solution space via artifact orientation. Domain-specific modeling language specifications of model-driven engineering technologies focus on requirements within a particular domain, which can be tailored to aid the domain expert in expressing domain concepts effectively. Modeling processes based on domain-specific language formalisms are highly volatile due to dependencies on domain concepts or the process models used. A capable solution is given by artifact orientation, which stresses the results rather than a strict dependence on complicated platforms for model creation and development. Based on this premise, domain-specific methods for producing artifacts without having to take into account the complexity and variability of platforms for model definitions can be integrated to support customizable development. In this paper, we discuss methods for integrating these capabilities and necessities within a common structure and semantics, contributing a metamodel for artifact orientation that leads to a reusable software layer with a concrete syntax capable of capturing design intent from the domain expert. The concepts forming the language formalism are drawn from models in the oil and gas pipeline industry.

Keywords: control process, metrics of engineering, structured abstraction, semantic model

Procedia PDF Downloads 130
13013 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4

Authors: Ryan A. Black, Stacey A. McCaffrey

Abstract:

Over the past few decades, great strides have been made towards improving the science of measuring psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now widely used to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ them and decide which models are the most appropriate for their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated in a given model, the function linking the expected response and the predictor, response option formats, and dimensionality. As a result, inferior methods (i.e., Classical Test Theory methods) continue to be employed to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models, that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover binary IRT models from the most basic, the 1-parameter logistic (1-PL) model dating back over 50 years, up to the most recent and complex 4-parameter logistic (4-PL) model.
Binary IRT models will be defined mathematically, and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1. simulated data of N=500,000 subjects who responded to four dichotomous items, and 2. a pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to the emotional consequences of alcohol use. The real-world data were based on responses to items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). The IRT analyses conducted on both the simulated data and the real-world pilot data will provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate the simulated data and analyses will be available upon request to allow for replication of results.
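The family of binary models discussed above can be summarized by the 4-PL item response function, of which the simpler models are special cases; a minimal sketch (parameter defaults and the function name are illustrative, and the paper itself uses the IRT procedure in SAS 9.4 rather than this code):

```python
import math

def irt_probability(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """4-parameter logistic (4-PL) item response function.

    P(correct | theta) = c + (d - c) / (1 + exp(-a * (theta - b)))

    theta : latent trait level of the subject
    a     : item discrimination
    b     : item difficulty (location)
    c     : lower asymptote (pseudo-guessing)
    d     : upper asymptote

    Setting c=0 and d=1 recovers the 2-PL; additionally fixing a
    common discrimination across items gives the 1-PL (Rasch) model.
    """
    return c + (d - c) / (1.0 + math.exp(-a * (theta - b)))
```

For example, at theta = b the probability is exactly midway between the two asymptotes, which is the usual interpretation of item difficulty.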

Keywords: instrument development, item response theory, latent trait theory, psychometrics

Procedia PDF Downloads 338