Search results for: software fault prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7340

5000 Assessing the Effect of the Position of the Cavities on the Inner Plate of the Steel Shear Wall under Time History Dynamic Analysis

Authors: Masoud Mahdavi, Mojtaba Farzaneh Moghadam

Abstract:

Seismic forces, caused by waves generated deep in the earth during an earthquake, strike the structure and cause the building to vibrate. Large seismic forces cause low-strength sections of the structure to suffer extensive damage. The use of steel shear walls in steel structures increases the strength of the building and its main members (columns) by reducing and dissipating seismic forces during earthquakes. In the present study, a type of steel shear wall with regular holes in the inner plate was evaluated using a finite element model built in Abaqus software. The steel plate shear wall, measuring 6000 × 3000 mm (one storey) with a thickness of 3 mm, was modeled with four different hole positions, each with the same cross-sectional area. The shear wall was subjected to 5-second time history dynamic analyses using three accelerograms: El Centro, Imperial Valley, and Kobe. The results showed that increasing the distance between the geometric center of the hole and the geometric center of the inner plate (increasing the RCS index) caused the maximum total acceleration to shift from the perimeter of the hole to the horizontal and vertical beams. The results also show that there is no direct relationship between the RCS index and the total acceleration in the steel shear wall, and that the RCS index is independent of the peak ground acceleration of the earthquake.

Keywords: hollow steel plate shear wall, time history analysis, finite element method, abaqus software

Procedia PDF Downloads 105
4999 Effects of Microbial Biofertilization on Nodulation, Nitrogen Fixation, and Yield of Lablab purpureus

Authors: Benselama Amel, Ounane S. Mohamed, Bekki Abdelkader

Abstract:

A collection of 20 isolates was obtained from fresh nodules of the legume plant Lablab purpureus. These isolates were authenticated by inoculating seedlings grown in jars containing sand. The results obtained after two months of culture revealed that all 20 isolates (100% of the isolates) are able to nodulate their host plant. The results were analyzed statistically by ANOVA using the Statistica software and showed that inoculation significantly improved all the growth parameters (plant height, dry weight of the aerial parts and roots, and number of nodules). We evaluated the tolerance of all strains in the collection to the major stress factors of salinity, pH, and extreme temperature. Osmotolerance reached NaCl concentrations of up to 1710 mM. The strains were also able to grow over a wide range of pH, from 4.5 to 9.5, and of temperature, between 4°C and 40°C. We also tested the effect of acidity, aluminum, and iron deficit on the Lablab-rhizobia symbiosis. Lablab purpureus was not affected by the presence of high concentrations of aluminum. On the other hand, iron deficiency caused a clear decrease in the dry biomass of the aerial part. The results for all phenotypic characters were treated with the Minitab statistical software; the numerical analysis showed that these bacterial strains are divided into two distinct groups at a similarity level of 86%. SDS-PAGE was carried out to determine the total protein profiles of the strains. The similarity coefficients of the polypeptide bands between the isolates and the reference strains (Bradyrhizobium, Mesorhizobium sp.) confirm that our strains belong to the rhizobia group.

Keywords: SDS-PAGE, rhizobia, symbiosis, phenotypic characterization, Lablab purpureus

Procedia PDF Downloads 309
4998 Visual Simulation for the Relationship of Urban Fabric

Authors: Ting-Yu Lin, Han-Liang Lin

Abstract:

This article is about visualizing urban form with CityEngine. A city is composed of different domains, and each domain has its own fabric because of its arrangement. For example, a neighborhood unit contains fabrics such as schools, street networks, and residential and commercial spaces. Therefore, studying urban morphology can help us understand urban form in the planning process. Streets, plots, and buildings are seen as urban fabrics, and together they configure urban form. Traditionally, urban morphology usually discussed a single parameter, the building type, ignoring other parameters such as streets and plots. However, urban space is three-dimensional rather than two-dimensional, and people perceive urban space visually. Therefore, visualization can fill the gap between two dimensions and three dimensions, and the study of urban morphology will strengthen the understanding of the whole appearance of a city. CityEngine is software that can edit, analyze, and monitor data and visualize the result for GIS, a common tool used to analyze data and display maps for urban planning and urban design. CityEngine can parameterize the data of streets, plots, and building types and visualize the result in a three-dimensional way. The research will reproduce the real urban form through visualization, making it possible to test whether the urban form can be parameterized and whether the parameterized result matches the real urban form, and then to analyze the rules of urban form by visualizing the result in three dimensions. There will be three stages in the research. It will start with a field survey of the East District of Tainan, Taiwan, to conclude the relationships between the urban fabrics of street networks, plots, and building types. Second, to visualize these relationships, they will be turned into rule code that CityEngine can read. Last, CityEngine will automatically display the result through visualization.

Keywords: Cityengine, urban fabric, urban morphology, visual simulation

Procedia PDF Downloads 300
4997 Artificial Neural Network and Satellite Derived Chlorophyll Indices for Estimation of Wheat Chlorophyll Content under Rainfed Condition

Authors: Muhammad Naveed Tahir, Wang Yingkuan, Huang Wenjiang, Raheel Osman

Abstract:

Numerous models are used in prediction and decision-making processes, but most of them are linear, and linear models reach their limitations when the data are non-linear, making accurate estimation difficult. Artificial Neural Networks (ANN) have found extensive acceptance for modeling the complex, non-linear real world, as they have more general and flexible functional forms than traditional statistical methods. The link between information technology and agriculture will become firmer in the near future. Monitoring crop biophysical properties non-destructively can provide a rapid and accurate understanding of crop response to various environmental influences. Crop chlorophyll content is an important indicator of crop health and therefore of crop yield. In recent years, remote sensing has been accepted as a robust tool for site-specific management by detecting crop parameters at both local and large scales. The present research combined an ANN model with satellite-derived chlorophyll indices from LANDSAT 8 imagery for real-time estimation of wheat chlorophyll. Cloud-free LANDSAT 8 scenes were acquired (Feb-March 2016-17) at the same time as the ground-truthing campaign, in which chlorophyll was estimated using a SPAD-502 meter. Different vegetation indices were derived from the LANDSAT 8 imagery using ERDAS Imagine (v.2014) software for chlorophyll determination, including the Normalized Difference Vegetation Index (NDVI), Green Normalized Difference Vegetation Index (GNDVI), Chlorophyll Absorption Ratio Index (CARI), Modified Chlorophyll Absorption Ratio Index (MCARI), and Transformed Chlorophyll Absorption Ratio Index (TCARI). For ANN modeling, MATLAB and SPSS (ANN) tools were used; the Multilayer Perceptron (MLP) in MATLAB provided very satisfactory results. Of the data, 61.7% were used for training the MLP, 28.3% for validation, and the remaining 10% to evaluate and validate the ANN model results. For error evaluation, the sum of squares error and the relative error were used. The ANN model summary showed a sum of squares error of 10.786 and an average overall relative error of 0.099. MCARI and NDVI were revealed to be the most sensitive indices for assessing wheat chlorophyll content, with the highest coefficients of determination, R² = 0.93 and 0.90, respectively. The results suggest that using high spatial resolution satellite imagery for the retrieval of crop chlorophyll content with an ANN model provides an accurate, reliable assessment of crop health status at a larger scale, which can help in managing crop nutrition requirements in real time.
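
As an illustrative sketch only, and not the authors' MATLAB/SPSS implementation, the index-plus-MLP workflow can be outlined in Python: spectral indices are computed from band reflectances and regressed against SPAD readings with a multilayer perceptron. The band values, SPAD targets, network size, and split fractions below are placeholder assumptions.

```python
# Hypothetical sketch: NDVI/GNDVI from Landsat 8 surface reflectance bands,
# then an MLP regression against SPAD-measured chlorophyll (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
green, red, nir = rng.uniform(0.03, 0.15, n), rng.uniform(0.02, 0.2, n), rng.uniform(0.2, 0.5, n)

ndvi = (nir - red) / (nir + red)          # Normalized Difference Vegetation Index
gndvi = (nir - green) / (nir + green)     # Green NDVI
X = np.column_stack([ndvi, gndvi])
spad = 30 + 40 * ndvi + rng.normal(0, 1.5, n)   # stand-in for SPAD-502 ground truth

# Roughly mirror the paper's 61.7 / 28.3 / 10 % split: hold out 10 % for testing first.
X_dev, X_test, y_dev, y_test = train_test_split(X, spad, test_size=0.10, random_state=1)
mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1,
                   early_stopping=True, validation_fraction=0.314)  # ~28.3 % of all data
mlp.fit(X_dev, y_dev)
pred = mlp.predict(X_test)
print("average relative error:", float(np.mean(np.abs(pred - y_test) / y_test)))
```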

Keywords: ANN, chlorophyll content, chlorophyll indices, satellite images, wheat

Procedia PDF Downloads 149
4996 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights

Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy

Abstract:

The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches such as public or private law. On the other hand, law is a national phenomenon. The law of one nation and the legal system applied on the territory of another nation may be completely different. People who are experts in a particular field of law in one country may have insufficient expertise in the law of another country. Today, in addition to the local nature of law, international and even supranational rules of law are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools may well serve to produce very meaningful results in terms of human rights. However, the algorithms to be used should not be developed by computer experts alone; they also need the contribution of people who are familiar with law, values, judicial decisions, and even the social and political culture of the society to which they will provide solutions. Otherwise, even if the algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field in the discipline of law. More AI systems are already being applied in the field of law, with examples such as predicting judicial decisions, text summarization, decision support systems, and classification of documents. Algorithms for legal systems employing AI tools, especially in the fields of prediction of judicial decisions and decision support systems, have the capacity to create automatic decisions instead of judges. When the judge is removed from this equation, law made by artificial intelligence, created by an intelligent algorithm on its own, emerges, whether the domain is national or international law. In this work, the aim is to make a general analysis of this new topic. Such an analysis needs both a literature survey and a perspective from computer experts' and lawyers' points of view. In some societies, the use of prediction or decision support systems may be useful to integrate international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and human rights-protective results than written or living law. In non-democratic countries, it may even be thought that direct decisions and artificial intelligence-made law would be more protective than a mere decision "support" system. Since the values of law are directed towards "human happiness or well-being", AI algorithms should always be capable of serving this purpose and be based on the rule of law, the principle of justice and equality, and the protection of human rights.

Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems

Procedia PDF Downloads 78
4995 A Validated Estimation Method to Predict the Interior Wall of Residential Buildings Based on Easy to Collect Variables

Authors: B. Gepts, E. Meex, E. Nuyts, E. Knaepen, G. Verbeeck

Abstract:

The importance of resource efficiency and environmental impact assessment has raised the interest in knowing the amount of materials used in buildings. If no BIM model or energy performance certificate is available, material quantities can be obtained through an estimation or a time-consuming calculation. For the interior wall area, no validated estimation method exists. However, in the case of environmental impact assessment or evaluating the existing building stock as future material banks, knowledge of the material quantities used in interior walls is indispensable. This paper presents a validated method for the estimation of the interior wall area of dwellings based on easy-to-collect building characteristics. A database of 4963 residential buildings spread all over Belgium is used. The data were collected through on-site measurements of the buildings during the construction phase (between mid-2010 and mid-2017). The interior wall area refers to the area of all interior walls in the building, including the inner leaf of exterior (party) walls, minus the area of windows and doors, unless mentioned otherwise. The two predictive modelling techniques used are 1) a (stepwise) linear regression and 2) a decision tree. The best estimation method is selected based on the best R² k-fold (5) fit. The research shows that the building volume is by far the most important variable for estimating the interior wall area. A stepwise regression based on building volume per building, building typology, and type of house provides the best fit, with R² k-fold (5) = 0.88. Although the best R² k-fold value is obtained when the other parameters 'building typology' and 'type of house' are included, the contribution of these variables can be seen as statistically significant but practically irrelevant. Thus, if these parameters are not available, a simplified estimation method based only on the volume of the building can also be applied (R² k-fold = 0.87). The robustness and precision of the method (output) are validated three times. Firstly, the prediction of the interior wall area is checked by means of alternative calculations of the building volume and of the interior wall area; thus, other definitions are applied to the same data. Secondly, the output is tested on an extension of the database, i.e., with the same definitions but on other data. Thirdly, the output is checked on an unrelated database with other definitions and other data. The validation of the estimation methods demonstrates that the methods remain accurate when the underlying data are changed. The method can support environmental as well as economic dimensions of impact assessment, as it can be used in early design. As it allows the prediction of the amount of interior wall materials to be produced in the future or that might become available after demolition, the presented estimation method can be part of material flow analyses on input and on output.
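
As a minimal sketch of the validation idea described above, and not the authors' model or database, a linear regression of interior wall area on building volume can be scored with a 5-fold cross-validated R² in Python; the volumes, wall areas, and their relationship below are synthetic assumptions.

```python
# Hypothetical sketch: 5-fold cross-validated R² for a linear model that
# predicts interior wall area from building volume (synthetic data only).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(42)
volume = rng.uniform(250, 1200, 300)                # m3, assumed range
wall_area = 0.55 * volume + rng.normal(0, 40, 300)  # m2, assumed relationship

X = volume.reshape(-1, 1)
scores = cross_val_score(LinearRegression(), X, wall_area,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0),
                         scoring="r2")
print("R2 k-fold(5):", scores.mean())
```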

Keywords: buildings as material banks, building stock, estimation method, interior wall area

Procedia PDF Downloads 37
4994 The Application of Artificial Neural Network for Bridge Structures Design Optimization

Authors: Angga S. Fajar, A. Aminullah, J. Kiyono, R. A. Safitri

Abstract:

This paper discusses the application of ANN for the optimization of bridge structure design. ANN has been applied in various fields of science for prediction and optimization. Structural optimization has several benefits, including accelerating the structural design process, saving structural material, and minimizing the self-weight and mass of the structure. In this paper, three types of bridge structure are optimized: a PSC I-girder superstructure, a composite steel-concrete girder superstructure, and an RC bridge pier. A different optimization strategy, implementing the back propagation method of ANN, is applied to each bridge structure in this research. An optimal weight and an easier design process for the bridge structures are achieved with satisfactory error.

Keywords: bridge structures, ANN, optimization, back propagation

Procedia PDF Downloads 377
4993 Development of Non-Intrusive Speech Evaluation Measure Using S-Transform and LightGBM

Authors: Tusar Kanti Dash, Ganapati Panda

Abstract:

The evaluation of speech quality and intelligibility is critical to the overall effectiveness of Speech Enhancement Algorithms. Several intrusive and non-intrusive measures are employed to calculate these parameters. Non-intrusive evaluation is the most challenging as, very often, the reference clean speech data is not available. In this paper, a novel non-intrusive speech evaluation measure is proposed using audio features derived from the Stockwell transform. These features are used with the Light Gradient Boosting Machine for the effective prediction of speech quality and intelligibility. The proposed model is analyzed using noisy and reverberant speech from four databases, and the results are compared with the standard intrusive evaluation measures. It is observed from the comparative analysis that the proposed model performs better than the standard non-intrusive models.
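
A rough outline of the prediction stage is sketched below; it is not the proposed measure itself. A short-time Fourier transform stands in for the Stockwell (S-) transform, LightGBM's LGBMRegressor is trained on simple time-frequency statistics, and the speech clips and quality scores are synthetic placeholders.

```python
# Hypothetical sketch: time-frequency features -> LightGBM quality score.
# scipy's STFT is used here as a stand-in for the Stockwell (S-) transform.
import numpy as np
from scipy.signal import stft
from lightgbm import LGBMRegressor

rng = np.random.default_rng(0)
fs = 8000

def tf_features(x):
    _, _, Z = stft(x, fs=fs, nperseg=256)
    mag = np.abs(Z)
    return np.hstack([mag.mean(axis=1), mag.std(axis=1)])  # per-band statistics

# Synthetic "noisy speech" clips and synthetic quality labels (e.g. MOS-like scores).
X = np.array([tf_features(rng.normal(size=fs)) for _ in range(100)])
y = rng.uniform(1.0, 4.5, 100)

model = LGBMRegressor(n_estimators=200, learning_rate=0.05)
model.fit(X[:80], y[:80])
print("predicted quality scores:", model.predict(X[80:85]))
```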

Keywords: non-Intrusive speech evaluation, S-transform, light GBM, speech quality, and intelligibility

Procedia PDF Downloads 264
4992 The Analysis of Personalized Low-Dose Computed Tomography Protocol Based on Cumulative Effective Radiation Dose and Cumulative Organ Dose for Patients with Breast Cancer with Regular Chest Computed Tomography Follow up

Authors: Okhee Woo

Abstract:

Purpose: The aim of this study is to evaluate the 2-year cumulative effective radiation dose and cumulative organ dose of regular follow-up computed tomography (CT) scans in patients with breast cancer and to establish a personalized low-dose CT protocol. Methods and Materials: A retrospective study was performed on patients with breast cancer who were diagnosed and managed consistently on the basis of the routine breast cancer follow-up protocol between 2012-01 and 2016-06. Based on ICRP (International Commission on Radiological Protection) Publication 103, the cumulative effective radiation doses of each patient over the 2-year follow-up were analyzed using commercial radiation management software (Radimetrics, Bayer Healthcare). The personalized effective doses to each organ were analyzed in detail using the Monte Carlo simulation provided by the software. Results: A total of 3822 CT scans on 490 patients was evaluated (age: 52.32±10.69). The mean scan number for each patient was 7.8±4.54. Each patient was exposed to 95.54±63.24 mSv of radiation over 2 years. The cumulative CT radiation dose was significantly higher in patients with lymph node metastasis (p = 0.00). HER-2 positive patients were exposed to more radiation than estrogen or progesterone receptor positive patients (p = 0.00). There was no difference in the cumulative effective radiation dose among different age groups. Conclusion: Knowing how much radiation a patient has been exposed to is the starting point for managing radiation exposure in patients with long-term CT follow-up. A precise and personalized protocol, as well as iterative reconstruction, may reduce the hazard from unnecessary radiation exposure.

Keywords: computed tomography, breast cancer, effective radiation dose, cumulative organ dose

Procedia PDF Downloads 200
4991 Municipal Solid Waste Management Using Life Cycle Assessment Approach: Case Study of Maku City, Iran

Authors: L. Heidari, M. Jalili Ghazizade

Abstract:

This paper aims to determine the best environmental and economic scenario for municipal solid waste (MSW) management of the city of Maku by using the Life Cycle Assessment (LCA) approach. The functional elements of this study are collection, transportation, and disposal of MSW in Maku city. Waste composition and density, as two key parameters of MSW, were determined by field sampling, and then other important specifications of the MSW, such as chemical formula, thermal energy, and water content, were calculated. These data, together with other information related to collection and disposal facilities, are used as a reliable source for assessing the environmental impacts of different waste management options, including landfilling, composting, recycling, and energy recovery. The environmental impact of the MSW management options has been investigated for 15 different scenarios with the Integrated Waste Management (IWM) software. The photochemical smog, greenhouse gases, acid gases, toxic emissions, and energy consumption of each scenario are measured. Then, the environmental index of each scenario is obtained by weighting these parameters. The economic costs of the scenarios have also been compared with each other based on the literature. As a final result, since organic materials make up more than 80% of the waste, composting can be a suitable method. Although a major part of the remaining 20% of the waste can be recycled, the landfill option has been suggested due to the high cost of the necessary equipment. Therefore, the scenario with 80% composting and 20% landfilling is selected as the superior environmental and economic scenario. This study shows that, to select a scenario with practical applications, the environmental and economic aspects of the different scenarios must be considered simultaneously.
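
To illustrate the kind of index weighting implied by "the environmental index of each scenario is obtained by weighting these parameters", a minimal sketch follows; the scenario names, impact values, and weights are invented for illustration and are not outputs of the IWM runs.

```python
# Hypothetical sketch: combine normalized impact categories into a single
# environmental index per scenario using assumed weights.
import numpy as np

scenarios = ["80% compost / 20% landfill", "100% landfill", "50% recycle / 50% landfill"]
# columns: greenhouse gases, acid gases, photochemical smog, energy use (arbitrary units)
impacts = np.array([[120.0, 4.0, 2.5, 300.0],
                    [340.0, 9.0, 6.0, 150.0],
                    [200.0, 6.0, 4.0, 220.0]])
weights = np.array([0.4, 0.2, 0.2, 0.2])        # assumed weighting scheme

normalized = impacts / impacts.max(axis=0)      # scale each category to [0, 1]
index = normalized @ weights                    # lower index = better scenario
best = scenarios[int(np.argmin(index))]
print(dict(zip(scenarios, index.round(3))), "->", best)
```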

Keywords: IWM software, life cycle assessment, Maku, municipal solid waste management

Procedia PDF Downloads 242
4990 Theoretical Prediction of the Structural, Elastic, Electronic, Optical, and Thermal Properties of Cubic Perovskites CsXF3 (X = Ca, Sr, and Hg) under Pressure Effect

Authors: M. A. Ghebouli, A. Bouhemadou, H. Choutri, L. Louaila

Abstract:

Some physical properties of the cubic perovskites CsXF3 (X = Sr, Ca, and Hg) have been investigated using the pseudopotential plane-wave (PP-PW) method based on density functional theory (DFT). The calculated lattice constants within GGA (PBE) and LDA (CA-PZ) agree reasonably with the available experimental data. The elastic constants and their pressure derivatives are predicted using the static finite strain technique. We derived the bulk and shear moduli, Young's modulus, Poisson's ratio, and Lamé's constants for ideal polycrystalline aggregates. The analysis of the B/G ratio indicates that CsXF3 (X = Ca, Sr, and Hg) are ductile materials. The thermal effects on the volume, bulk modulus, heat capacities CV and CP, and Debye temperature were predicted.
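
The polycrystalline moduli and the B/G ductility criterion mentioned above follow the standard Voigt-Reuss-Hill relations for cubic crystals; a short sketch is given below, in which the elastic constants are placeholder values rather than the computed results for CsXF3.

```python
# Sketch: Voigt-Reuss-Hill bulk and shear moduli and Pugh's B/G ratio
# for a cubic crystal, from elastic constants C11, C12, C44 (GPa).
def vrh_cubic(c11, c12, c44):
    B = (c11 + 2 * c12) / 3.0                        # bulk modulus (Voigt = Reuss for cubic)
    Gv = (c11 - c12 + 3 * c44) / 5.0                 # Voigt shear modulus
    Gr = 5 * c44 * (c11 - c12) / (4 * c44 + 3 * (c11 - c12))  # Reuss shear modulus
    G = (Gv + Gr) / 2.0                              # Hill average
    E = 9 * B * G / (3 * B + G)                      # Young's modulus
    nu = (3 * B - 2 * G) / (2 * (3 * B + G))         # Poisson's ratio
    return B, G, E, nu

B, G, E, nu = vrh_cubic(60.0, 15.0, 20.0)            # placeholder constants, not CsXF3 results
print(f"B={B:.1f} GPa, G={G:.1f} GPa, E={E:.1f} GPa, nu={nu:.3f}")
print("ductile" if B / G > 1.75 else "brittle", "(Pugh's criterion B/G > 1.75)")
```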

Keywords: perovskite, PP-PW method, elastic constants, electronic band structure

Procedia PDF Downloads 440
4989 A Systematic Review Investigating the Use of EEG Measures in Neuromarketing

Authors: A. M. Byrne, E. Bonfiglio, C. Rigby, N. Edelstyn

Abstract:

Introduction: Neuromarketing employs numerous methodologies when investigating products and advertisement effectiveness. Electroencephalography (EEG), a non-invasive measure of electrical activity from the brain, is commonly used in neuromarketing. EEG data can be considered using time-frequency (TF) analysis, where changes in the frequency of brainwaves are calculated to infer participants' mental states, or event-related potential (ERP) analysis, where changes in amplitude are observed in direct response to a stimulus. This presentation discusses the findings of a systematic review of EEG measures in neuromarketing. A systematic review summarises evidence on a research question, using explicit measures to identify, select, and critically appraise relevant research papers. This systematic review identifies which EEG measures are the most robust predictors of customer preference and purchase intention. Methods: Search terms identified 174 papers that used EEG in combination with marketing-related stimuli. Publications were excluded if they were written in a language other than English or were not published as journal articles (e.g., book chapters). The review investigated which TF effect (e.g., theta-band power) and ERP component (e.g., N400) most consistently reflected preference and purchase intention. Machine-learning prediction was also investigated, along with the use of EEG combined with physiological measures such as eye-tracking. Results: Frontal alpha asymmetry was the most reliable TF signal, where an increase in activity over the left side of the frontal lobe indexed a positive response to marketing stimuli, while an increase in activity over the right side indexed a negative response. The late positive potential, a positive amplitude increase around 600 ms after stimulus presentation, was the most reliable ERP component, reflecting the conscious emotional evaluation of marketing stimuli. However, each measure showed mixed results when related to preference and purchase behaviour. Predictive accuracy was greatly improved through machine-learning algorithms such as deep neural networks, especially when combined with eye-tracking or facial expression analyses. Discussion: This systematic review provides a novel catalogue of the most effective use of each EEG measure commonly used in neuromarketing. Exciting findings to emerge are the identification of frontal alpha asymmetry and the late positive potential as markers of preferential responses to marketing stimuli. Machine-learning algorithms achieved predictive accuracies as high as 97%, and future research should therefore focus on machine-learning prediction when using EEG measures in neuromarketing.

Keywords: EEG, ERP, neuromarketing, machine-learning, systematic review, time-frequency

Procedia PDF Downloads 120
4988 Effect of 3-Dimensional Knitted Spacer Fabrics Characteristics on Its Thermal and Compression Properties

Authors: Veerakumar Arumugam, Rajesh Mishra, Jiri Militky, Jana Salacova

Abstract:

The thermo-physiological comfort and compression properties of knitted spacer fabrics have been evaluated by varying different spacer fabric parameters. Air permeability and water vapor transmission of the fabrics were measured using the Textest FX-3300 air permeability tester and PERMETEST. The thermal behavior of the fabrics was then obtained with a thermal conductivity analyzer, and the overall moisture management capacity was evaluated with a moisture management tester. The compression properties of the spacer fabrics were also tested using the Kawabata Evaluation System (KES-FB3). In the KES testing, the compression resilience, work of compression, linearity of compression, and other parameters were calculated from the pressure-thickness curves. Analysis of variance (ANOVA) was performed using the statistical software QC Expert Trilobite and Darwin in order to compare the influence of the different fabric parameters on the thermo-physiological and compression behavior of the samples. This study established that the raw material, type of spacer yarn, density, thickness, and tightness of the surface layer have a significant influence on both the thermal conductivity and the work of compression of spacer fabrics. The parameter which mainly influences the water vapor permeability of these fabrics is the raw material, i.e., the wetting and wicking properties of the fibers. The Pearson correlation between the moisture capacity of the fabrics and the water vapour permeability was found using the statistical software QC Expert Trilobite and Darwin. These findings are important requirements for the further design of clothing for extreme environmental conditions.

Keywords: 3D spacer fabrics, thermal conductivity, moisture management, work of compression (WC), resilience of compression (RC)

Procedia PDF Downloads 547
4987 A Novel Rapid Well Control Technique Modelled in Computational Fluid Dynamics Software

Authors: Michael Williams

Abstract:

The ability to control a flowing well is of the utmost importance. During the kill phase, heavy-weight kill mud is circulated around the well. While this increases bottom hole pressure, it also increases damage to the near-wellbore formation. The addition of high density spherical objects has the potential to minimise this near-wellbore damage, increase bottom hole pressure, and reduce the operational time needed to kill the well. This operational time saving comes from the rapid deployment of high density spherical objects instead of building high density drilling fluid. The research aims to model the well kill process using Computational Fluid Dynamics software. A model has been created as a proof of concept to analyse the flow of micron-sized spherical objects in the drilling fluid. Initial results show that this new methodology of spherical objects in drilling fluid agrees with the traditional streamlines seen in particle-free flow. Additional models have been created to demonstrate that areas of higher flow rate around the bit can lead to an increased probability of washout of formations but do not affect the flow of the micron-sized spherical objects. Interestingly, areas that experience dimensional changes, such as tool joints and various BHA components, do not appear at this initial stage to experience increased velocity or to create areas of turbulent flow, which supports borehole stability. In conclusion, the initial models of this novel well control methodology have not demonstrated any adverse flow patterns, which suggests that this method may be viable under field conditions.

Keywords: well control, fluid mechanics, safety, environment

Procedia PDF Downloads 176
4986 Micro-Meso 3D FE Damage Modelling of Woven Carbon Fibre Reinforced Plastic Composite under Quasi-Static Bending

Authors: Aamir Mubashar, Ibrahim Fiaz

Abstract:

This research presents a three-dimensional finite element modelling strategy to simulate damage in a quasi-static three-point bending analysis of a woven twill 2/2 type carbon fibre reinforced plastic (CFRP) composite at the micro-meso level using the cohesive zone modelling technique. A meso-scale finite element model comprised of a number of plies was developed in the commercial finite element code Abaqus/Explicit. The interfaces between the plies were explicitly modelled using cohesive zone elements to allow for debonding by crack initiation and propagation. The load-deflection response of the CFRP within the quasi-static range was obtained and compared with data existing in the literature. This provided validation of the model at the global scale. The outputs resulting from the global model were then used to develop a simulation model capturing the micro-meso scale material features. The sub-model consisted of a refined-mesh representative volume element (RVE) modelled in the TexGen software, which was then embedded with cohesive elements in the finite element software environment. The results obtained from the developed strategy successfully predicted the overall load-deflection response and the damage in the global model and sub-model at the flexural limit of the specimen. A detailed analysis of the effects of the micro-scale features was carried out.

Keywords: woven composites, multi-scale modelling, cohesive zone, finite element model

Procedia PDF Downloads 142
4985 A Prediction Model of Tornado and Its Impact on Architecture Design

Authors: Jialin Wu, Zhiwei Lian, Jieyu Tang, Jingyun Shen

Abstract:

Tornadoes are serious and unpredictable natural disasters, which have an important impact on people's production and life. The probability of being hit by tornadoes in China was analyzed considering the principles of tornado formation. Some suggestions on layout and shape for newly-built buildings were then provided, combined with the characteristics of tornado wind fields. Fuzzy clustering and inverse closeness methods were used to evaluate the probability levels of tornado risk in various provinces based on classification and ranking, and GIS was adopted to display the results. Finally, the wind field of a single-vortex tornado was studied to discuss the optimized design of rural low-rise houses, taking Yancheng, Jiangsu as an example. This paper may provide enough data to support building and urban design in some specific regions.

Keywords: tornado probability, computational fluid dynamics, fuzzy mathematics, optimal design

Procedia PDF Downloads 140
4984 Distributed Automation System Based Remote Monitoring of Power Quality Disturbance on LV Network

Authors: Emmanuel D. Buedi, K. O. Boateng, Griffith S. Klogo

Abstract:

Electrical distribution networks are prone to power quality disturbances originating from the complexity of the distribution network, the mode of distribution (overhead or underground), and the types of loads used by customers. Data on the types of disturbances present and their frequency of occurrence are needed for economic evaluation and hence for finding a solution to the problem. Utility companies have resorted to using secondary power quality devices such as smart meters to help gather the required data. Even though this approach is easier to adopt, data gathered from these devices may not serve the required purpose, since the installation of these devices in the electrical network usually does not conform to available power quality monitor (PQM) placement methods. This paper presents a design of a PQM that is capable of integrating into an existing distributed automation system (DAS) infrastructure to take advantage of available placement methodologies. The monitoring component of the design is implemented and installed to monitor an existing LV network. Data from the monitor are analyzed and presented. A portion of the LV network of the Electricity Company of Ghana is modeled in MATLAB-Simulink and analyzed under various earth fault conditions. The results presented show the ability of the PQM to detect and analyze PQ disturbances such as voltage sag and overvoltage. By adopting a placement methodology and installing these nodes, utilities are assured of accurate and reliable information with respect to the quality of power delivered to consumers.
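
As an illustration of how a monitoring node can flag the sag and overvoltage events mentioned above, the sketch below computes a per-cycle RMS from a sampled voltage waveform and classifies it against the commonly used 0.9 pu and 1.1 pu limits; the waveform, sampling rate, and thresholds are illustrative assumptions rather than the implemented PQM logic.

```python
# Hypothetical sketch: per-cycle RMS tracking of a 50 Hz LV waveform and
# classification of sag (< 0.9 pu) and overvoltage/swell (> 1.1 pu).
import numpy as np

fs, f0, v_nom = 10_000, 50, 230.0          # sampling rate, mains frequency, nominal RMS
t = np.arange(0, 0.5, 1 / fs)
v = np.sqrt(2) * v_nom * np.sin(2 * np.pi * f0 * t)
v[(t > 0.2) & (t < 0.3)] *= 0.6            # inject a sag to 0.6 pu for 100 ms

samples_per_cycle = fs // f0
n_cycles = len(v) // samples_per_cycle
rms = np.sqrt((v[: n_cycles * samples_per_cycle]
               .reshape(n_cycles, samples_per_cycle) ** 2).mean(axis=1))

pu = rms / v_nom
for i, value in enumerate(pu):
    if value < 0.9:
        print(f"cycle {i}: sag, {value:.2f} pu")
    elif value > 1.1:
        print(f"cycle {i}: overvoltage, {value:.2f} pu")
```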

Keywords: power quality, remote monitoring, distributed automation system, economic evaluation, LV network

Procedia PDF Downloads 356
4983 Intelligent Control of Bioprocesses: A Software Application

Authors: Mihai Caramihai, Dan Vasilescu

Abstract:

The main research objective of the experimental bioprocess analyzed in this paper was to obtain large biomass quantities. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract, and sodium chloride. The reactor was equipped with pH, temperature, dissolved oxygen, and agitation controllers. The operating parameters were 37 °C, 1.2 atm, 250 rpm, and an air flow rate of 15 L/min. The main objective of this paper is to present a case study demonstrating that intelligent control, which describes the complexity of the biological process in a qualitative and subjective manner as perceived by a human operator, is an efficient control strategy for this kind of bioprocess. In order to simulate the bioprocess evolution, an intelligent control structure based on fuzzy logic has been designed. The specific objective is to present a fuzzy control approach based on human expert rules versus a modeling approach of cell growth based on bioprocess experimental data. Kinetic modeling may represent only a small number of bioprocesses in terms of overall biosystem behavior, while a fuzzy control system (FCS) can manipulate incomplete and uncertain information about the process, assuring high control performance, and provides an alternative solution to non-linear control as it is closer to the real world. Due to the high degree of non-linearity and time variance of bioprocesses, the need for a control mechanism arises. BIOSIM, an originally developed software package, implements such a control structure. The simulation study showed that the fuzzy technique is quite appropriate for this non-linear, time-varying system compared to the classical control method based on an a priori model.
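
To make the fuzzy control idea concrete, a minimal Mamdani-style sketch with triangular membership functions and centroid defuzzification is shown below; the chosen variables (dissolved-oxygen error driving a change in agitation speed), membership ranges, and rules are illustrative assumptions, not the rule base implemented in BIOSIM.

```python
# Hypothetical sketch of a Mamdani-type fuzzy controller:
# dissolved-oxygen error (% of setpoint) -> change in agitation speed (rpm).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with vertices a <= b <= c."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a + 1e-12),
                                      (c - x) / (c - b + 1e-12)))

out = np.linspace(-50, 50, 501)            # output universe: rpm change
out_sets = {"decrease": tri(out, -50, -30, 0),
            "hold":     tri(out, -20, 0, 20),
            "increase": tri(out, 0, 30, 50)}

def control(do_error):
    # Rule strengths from input membership (error = setpoint - measured DO, in %).
    negative = tri(do_error, -40, -20, 0)   # DO above setpoint -> decrease agitation
    zero     = tri(do_error, -10, 0, 10)    # on target          -> hold
    positive = tri(do_error, 0, 20, 40)     # DO below setpoint  -> increase agitation
    agg = np.maximum.reduce([np.minimum(negative, out_sets["decrease"]),
                             np.minimum(zero, out_sets["hold"]),
                             np.minimum(positive, out_sets["increase"])])
    return float((out * agg).sum() / (agg.sum() + 1e-12))   # centroid defuzzification

print(control(15.0))   # DO 15 % below setpoint -> positive rpm change
```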

Keywords: intelligent, control, fuzzy model, bioprocess optimization

Procedia PDF Downloads 328
4982 Prediction of a Nanostructure Called Porphyrin-Like Buckyball Using Density Functional Theory and Investigating Electrocatalytic Reduction of CO₂ to CO by Cobalt-Porphyrin-Like Buckyball

Authors: Mohammad Asadpour, Maryam Sadeghi, Mahmoud Jafari

Abstract:

The transformation of carbon dioxide into fuels and commodity chemicals is considered one of the most attractive methods to meet energy demands and reduce atmospheric CO₂ levels. Cobalt complexes have previously shown high faradaic efficiency in the reduction of CO₂ to CO. In this study, a nanostructure, referred to as a porphyrin-like buckyball, is simulated and analyzed for its electrical properties. The investigation aims to understand the unique characteristics of this material and its potential applications in electronic devices. Through computational simulations and analysis, the electrocatalytic reduction of CO₂ to CO by Cobalt-porphyrin-like buckyball is explored. The findings of this study offer valuable insights into the electrocatalytic properties of this predicted structure, paving the way for further research and development in the field of nanotechnology.

Keywords: porphyrin-like buckyball, DFT, nanomaterials, CO₂ to CO

Procedia PDF Downloads 57
4981 The Effectiveness of Conflict Management of Factories' Employee in Thailand

Authors: Pacharaporn Lekyan

Abstract:

The purpose of this study is to explore conflict management in the workplace and to analyze the predictive role of the headman's leadership and the methods used to handle conflict in an organization. Quantitative research was conducted, and a questionnaire was developed in order to collect information from 200 respondents who are leaders or managers working in frozen food factories in Thailand. The analysis of the results shows the relationship between conflict management factors, leadership, and conflict in the organization. The emotion of the leader in the organization is not the only factor that can affect conflict management; the emotions of the surrounding people, which can arise at any time, also play a role. The results show that four out of five factors of interpersonal conflict management have an effect on emotional intelligence, and that leadership behaviors have an influence on conflict management.

Keywords: conflict management, emotional intelligence, leadership, factories' employee

Procedia PDF Downloads 369
4980 Forecasting Solid Waste Generation in Turkey

Authors: Yeliz Ekinci, Melis Koyuncu

Abstract:

Successful planning of solid waste management systems requires successful prediction of the amount of solid waste generated in an area. Waste management planning can protect the environment and human health; hence, it is tremendously important for countries. A lack of information on waste generation can cause many environmental and health problems. Turkey is a country that plans to join the European Union; hence, solid waste management is one of the most significant criteria that should be handled in order to be a part of this community. A solid waste management system requires a good forecast of solid waste generation. Thus, this study aims to forecast solid waste generation in Turkey. Artificial Neural Network and Linear Regression models will be used for this aim. Many models will be run, and the best one will be selected based on predetermined performance measures.
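
A bare-bones version of the planned comparison, selecting between a linear regression and a small artificial neural network on a predetermined error measure, can be sketched as follows; the annual waste series and model settings are placeholder assumptions.

```python
# Hypothetical sketch: compare LinearRegression and a small MLP on a
# synthetic annual solid-waste series, selecting the model with lower MAE.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
years = np.arange(1995, 2020)
waste = 20.0 + 0.4 * (years - 1995) + rng.normal(0, 0.6, len(years))   # Mt/year, synthetic

X = (years - 1995).reshape(-1, 1).astype(float)
X_train, X_test, y_train, y_test = X[:-5], X[-5:], waste[:-5], waste[-5:]

models = {"linear regression": LinearRegression(),
          "ANN (MLP)": MLPRegressor(hidden_layer_sizes=(10,), max_iter=20000, random_state=0)}
mae = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    mae[name] = mean_absolute_error(y_test, model.predict(X_test))
print(mae, "-> selected:", min(mae, key=mae.get))
```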

Keywords: forecast, solid waste generation, solid waste management, Turkey

Procedia PDF Downloads 512
4979 The Effect That the Data Assimilation of Qinghai-Tibet Plateau Has on a Precipitation Forecast

Authors: Ruixia Liu

Abstract:

The Qinghai-Tibet Plateau has an important influence on the precipitation of its lower reaches. Remote sensing data have their own advantages, and a numerical prediction model that assimilates RS data will perform better than one that does not. We obtained assimilation data from MHS, terrestrial, and sounding observations using GSI, introduced the result into WRF, and then obtained the relative humidity (RH) and precipitation forecasts. We found that assimilating the MHS, terrestrial, and sounding data made the forecast of precipitation amount, area, and center more accurate, based on a comparison of the 1 h, 6 h, 12 h, and 24 h results. By analyzing the differences in the initial field, we found that data assimilation over the Qinghai-Tibet Plateau influences the forecast for its lower reaches by affecting the initial temperature and RH.

Keywords: Qinghai-Tibet Plateau, precipitation, data assimilation, GSI

Procedia PDF Downloads 237
4978 Tracing Digital Traces of Phatic Communion in #Mooc

Authors: Judith Enriquez-Gibson

Abstract:

This paper meddles with the notion of phatic communion introduced 90 years ago by Malinowski, a Polish-born British anthropologist. It explores the phatic in Twitter within the contents of tweets related to moocs (massive open online courses) as a topic or trend. It is not about moocs, though. It is about practices that could easily be hidden or neglected if we let big or massive topics take the lead or if we simply follow the computational or secret codes behind Twitter itself and third-party analytics software. It draws from media and cultural studies. Though at first it appears data-driven, as I submitted data collection and analytics into the hands of a third-party software, Twitonomy, the aim is to follow how phatic communion might be practised in a social media site such as Twitter. Lurking becomes its research method to analyse mooc-related tweets. A total of 3,000 tweets were collected on 11 October 2013 (UK timezone). The emphasis of lurking is to engage with Twitter as a system of connectivity. One interesting finding is that a click is in fact a phatic practice. A click breaks the silence. A click on one of the mooc websites is actually a tweet. A tweet was posted on behalf of a user who simply chose to click without formulating the text and perhaps without knowing that it contains #mooc. Surely, this mechanism is not about reciprocity. To break the silence, users did not use words. They just clicked the 'tweet button' on a mooc website. A click performs and maintains connectivity, with Twitter as the medium in attendance in our everyday, available when needed to be of service. In conclusion, the phatic culture of breaking silence in Twitter does not have to submit to the power of code and analytics. It is a matter of human code.

Keywords: click, Twitter, phatic communion, social media data, mooc

Procedia PDF Downloads 415
4977 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates potential benefits from employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forest, and Support Vector Machine. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitude 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The main reason for considering this database is the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. Accuracy of the models in predicting intensity measures, generalization capability of the models for future data, as well as usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data is available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data is available.
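
A toy version of one of the examined alternatives, a Random Forest ground-motion model with magnitude, distance, and site condition as predictors, is sketched below; the records and the attenuation form used to generate them are synthetic assumptions, not the Oklahoma/Kansas/Texas dataset.

```python
# Hypothetical sketch: Random Forest prediction of log-PGA from magnitude,
# hypocentral distance, and Vs30, on synthetic ground-motion records.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
mag = rng.uniform(3.0, 5.8, n)
dist = rng.uniform(4.0, 500.0, n)            # hypocentral distance, km
vs30 = rng.uniform(200.0, 800.0, n)          # site stiffness proxy, m/s

# Assumed simple attenuation form, used only to generate synthetic targets.
log_pga = 1.1 * mag - 1.6 * np.log10(dist) - 0.0005 * vs30 + rng.normal(0, 0.3, n)

X = np.column_stack([mag, np.log10(dist), vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_pga, test_size=0.2, random_state=0)

rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X_tr, y_tr)
print("test R^2:", rf.score(X_te, y_te))
print("feature importances (M, logR, Vs30):", rf.feature_importances_.round(3))
```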

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 125
4976 Experimental and Numerical Performance Analysis for Steam Jet Ejectors

Authors: Abdellah Hanafi, G. M. Mostafa, Mohamed Mortada, Ahmed Hamed

Abstract:

Steam ejectors are the heart of most desalination systems that employ vacuum. Systems that employ low-grade thermal energy sources, such as solar energy and geothermal energy, use the ejector to drive the system instead of high-grade electric energy. The jet-ejector is used to create vacuum by employing the flow of steam or air and using the severe pressure drop at the outlet of the main nozzle. The present work involves developing a one-dimensional mathematical model for designing jet-ejectors and transforming it into computer code using the Engineering Equation Solver (EES) software. The model receives the required operating conditions at the inlets and outlet of the ejector as inputs and produces the corresponding dimensions required to reach these conditions. The one-dimensional model has been validated against an existing model working on the Abu Qir power station. A prototype has been designed according to the one-dimensional model and attached to a special test bench to be tested before using it in the solar desalination pilot plant. The tested ejector will be responsible for the start-up evacuation of the system and for adjusting the vacuum of the evaporating effects. The tested prototype has shown good agreement with the results of the code. In addition, a numerical analysis has been applied to one of the designed geometries, on the one hand, to give an image of the pressure and velocity distribution inside the ejector and, on the other hand, to show the difference in results between the two-dimensional ideal gas model and the real prototype. The commercial edition of ANSYS Fluent v.14 software is used to solve the two-dimensional axisymmetric case.
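
One representative relation inside such a one-dimensional design code is the choked flow through the primary nozzle throat, which fixes the throat area for a required motive steam flow. The sketch below applies the ideal-gas choked-flow relation with approximate steam properties (γ ≈ 1.3, R ≈ 461.5 J/kg·K) and placeholder operating conditions; it is not the EES model described in the abstract.

```python
# Hypothetical sketch: size the primary-nozzle throat of a steam ejector from
# the choked ideal-gas relation  m_dot = A* P0 sqrt(g/(R T0)) (2/(g+1))^((g+1)/(2(g-1)))
from math import sqrt, pi

gamma, R = 1.3, 461.5          # approximate ideal-gas properties of steam
P0, T0 = 8.0e5, 445.0          # assumed motive steam stagnation pressure (Pa) and temperature (K)
m_dot = 0.05                   # required motive mass flow rate, kg/s (assumed)

flow_func = sqrt(gamma / (R * T0)) * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
A_throat = m_dot / (P0 * flow_func)       # throat area, m^2
d_throat = sqrt(4.0 * A_throat / pi)      # throat diameter, m

print(f"throat area  = {A_throat * 1e6:.1f} mm^2")
print(f"throat diam. = {d_throat * 1e3:.2f} mm")
```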

Keywords: solar energy, jet ejector, vacuum, evaporating effects

Procedia PDF Downloads 624
4975 Fast Switching Mechanism for Multicasting Failure in OpenFlow Networks

Authors: Alaa Allakany, Koji Okamura

Abstract:

Multicast is an efficient and scalable technology for data distribution that optimizes network resources. However, in IP networks, the responsibility for the management of multicast groups is distributed among network routers, which causes limitations such as delays in processing group events, high bandwidth consumption, and redundant tree calculation. Software Defined Networking (SDN), represented by OpenFlow, has been presented as a solution for many problems; in SDN, the control plane and data plane are separated by shifting control and management to a remote centralized controller, and the routers are used as forwarders only. In this paper, we propose a fast switching mechanism for handling link failure in the multicast tree, based on the Tabu Search heuristic algorithm and on modifying the functions of the OpenFlow switch so that it switches quickly to the backup sub-tree rather than reporting to the controller. In this work, we implement a multicasting OpenFlow controller; this centralized controller is a core part of our multicasting approach and is responsible for (1) constructing the multicast tree and (2) handling multicast group events and multicast state maintenance. Finally, the OpenFlow switch functions are modified for fast switching to backup paths. Forwarders forward the multicast packets based on multicast routing entries generated by the centralized controller. Tabu search is used as the heuristic algorithm to construct a near-optimum multicast tree and to keep the tree near-optimum when members join or leave the multicast group (group events).
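
The fast-switching idea, precomputing backup sub-trees so that a switch can fail over locally instead of consulting the controller, can be illustrated with a small graph sketch; for brevity, plain shortest paths stand in for the Tabu-search tree construction, and the topology is an invented example.

```python
# Hypothetical sketch: build a multicast tree, precompute a backup path for
# every tree link, and fail over locally when a link goes down.
# (Plain shortest paths stand in for the paper's Tabu-search optimisation.)
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([("s", "a", 1), ("s", "b", 2), ("a", "b", 1),
                           ("a", "c", 2), ("b", "d", 1), ("c", "d", 1)])
source, members = "s", ["c", "d"]

# Primary multicast tree: union of shortest paths from the source to each member.
tree_edges = set()
for m in members:
    path = nx.shortest_path(G, source, m, weight="weight")
    tree_edges |= {tuple(sorted(e)) for e in zip(path, path[1:])}

# Backup path for each tree link, computed on the topology without that link.
backup = {}
for u, v in tree_edges:
    H = G.copy()
    H.remove_edge(u, v)
    if nx.has_path(H, u, v):
        backup[(u, v)] = nx.shortest_path(H, u, v, weight="weight")

def on_link_failure(u, v):
    """Switch-local failover: use the precomputed path, no controller round-trip."""
    return backup.get(tuple(sorted((u, v))), None)

print("tree:", sorted(tree_edges))
print("failover for (a, c):", on_link_failure("a", "c"))
```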

Keywords: multicast tree, software define networks, tabu search, OpenFlow

Procedia PDF Downloads 266
4974 Review on Rainfall Prediction Using Machine Learning Technique

Authors: Prachi Desai, Ankita Gandhi, Mitali Acharya

Abstract:

Rainfall forecasting is mainly used to predict rainfall in a specified area and to determine its future rainfall conditions. Rainfall is always a global issue as it affects all major aspects of one's life. Agriculture, fisheries, forestry, the tourism industry, and other industries are widely affected by these conditions. Studies point to insufficient availability of water resources and an increase in water demand in the near future. A new forecast system already exists that uses a deep Convolutional Neural Network (CNN) to forecast monthly rainfall and climate changes, and CNN has also been compared against Artificial Neural Networks (ANN). Machine learning techniques that are used in rainfall prediction include the ARIMA model, ANN, LR, SVM, etc. The dataset on which we are experimenting was gathered online and covers the years 1901 to 2018. Test results have suggested more realistic improvements than conventional rainfall forecasts.

Keywords: ANN, CNN, supervised learning, machine learning, deep learning

Procedia PDF Downloads 206
4973 A Study of Faculty Development Programs in India to Assist Pedagogy and Curriculum Development

Authors: Chhavi Rana, Sanjay K Jain

Abstract:

All sides of every education debate agree that quality learning happens when knowledgeable, caring teachers use sound pedagogy. Many discussions of pedagogy make the mistake of considering it as being principally about teaching. There has been a lot of research about how to build a positive climate for learning, improve student curiosity, and enhance classroom association. However, these things can only be facilitated when teachers are equipped with better teaching techniques that use sound and accurate pedagogy. Pedagogy is the science and art of education. Its aims range from the full development of the human being to skills acquisition. In India, a project named Mission 10X has been started by the esteemed IT corporation Wipro as a faculty development programme (FDP) that particularly focuses on elements that help teachers develop curricula and new pedagogies that can lead to improvement in student engagement. This paper presents a study of these FDPs and examines (1) the parameters that help teachers in building new pedagogies, (2) the extent to which appropriate usage of pedagogy has improved after the Mission 10X FDPs were conducted, and (3) whether institutions differ in terms of their ability to convert usage of improved pedagogy into academic performance via these FDPs. The sample consisted of 2,236 students at 6 four-year engineering colleges and universities that completed several FDPs during 2012-2014. Many measures of the usage of better pedagogy were linked positively with such FDPs, although some of the relationships were weak in strength. The results suggest that the usage of sound pedagogy improved after these FDPs were conducted, along with the application of novel approaches in conducting classes.

Keywords: student engagement, critical thinking; achievement, student learning, pedagogy

Procedia PDF Downloads 423
4972 A Theoretical and Experimental Evaluation of a Solar-Powered Off-Grid Air Conditioning System for Residential Buildings

Authors: Adam Y. Sulaiman, Gerard I.Obasi, Roma Chang, Hussein Sayed Moghaieb, Ming J. Huang, Neil J. Hewitt

Abstract:

Residential air-conditioning units are essential for quality indoor comfort in hot-climate countries. Nevertheless, because of their non-renewable energy sources and their ecologically unfriendly working fluids, these units are a major source of CO2 emissions in these countries. The utilisation of sustainable technologies is nowadays essential to reduce the adverse effects of CO2 emissions by replacing conventional technologies. This paper investigates the feasibility of running an off-grid solar-powered air-conditioning bed unit using three low-GWP refrigerants (R32, R290, and R600a) to supersede conventional refrigerants. A prototype air-conditioning unit was built and designed to distribute cold air to a canopy connected to it. This system is powered by two 400 W photovoltaic panels, with battery storage supplying power to the unit at night-time. Engineering Equation Solver (EES) software is used to mathematically model the vapor compression cycle (VCC) and predict the unit's energetic and exergetic performance. The TRNSYS software was used to simulate the electricity storage performance of the batteries, whereas IES-VE was used to determine the amount of solar energy required to power the unit. The article provides an analytical design guideline as well as a comprehensible process system. Combining a renewable energy source with a VCC-based air-conditioning unit provides an excellent solution to the real problem of high energy consumption in warm-climate countries.
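
A compact way to compare the three candidate refrigerants on an ideal vapor compression cycle is sketched below using the CoolProp property library as a stand-in for the EES model; the evaporating and condensing temperatures are placeholder values.

```python
# Hypothetical sketch: ideal VCC COP for R32, R290 and R600a with CoolProp,
# assuming saturated vapour at the evaporator exit, isentropic compression and
# saturated liquid at the condenser exit.
from CoolProp.CoolProp import PropsSI

def ideal_vcc_cop(fluid, t_evap=280.15, t_cond=318.15):    # 7 degC / 45 degC (assumed)
    h1 = PropsSI("H", "T", t_evap, "Q", 1, fluid)           # compressor inlet: saturated vapour
    s1 = PropsSI("S", "T", t_evap, "Q", 1, fluid)
    p_cond = PropsSI("P", "T", t_cond, "Q", 1, fluid)
    h2 = PropsSI("H", "P", p_cond, "S", s1, fluid)          # isentropic compressor outlet
    h3 = PropsSI("H", "T", t_cond, "Q", 0, fluid)           # condenser exit: saturated liquid
    h4 = h3                                                 # isenthalpic expansion
    return (h1 - h4) / (h2 - h1)                            # cooling effect / compressor work

for fluid in ["R32", "R290", "R600a"]:
    print(fluid, round(ideal_vcc_cop(fluid), 2))
```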

Keywords: air-conditioning, refrigerants, PV panel, energy storages, VCC, exergy

Procedia PDF Downloads 180
4971 Real-Time Working Environment Risk Analysis with Smart Textiles

Authors: Jose A. Diaz-Olivares, Nafise Mahdavian, Farhad Abtahi, Kaj Lindecrantz, Abdelakram Hafid, Fernando Seoane

Abstract:

Despite new recommendations and guidelines for the evaluation of occupational risk assessments and their prevention, work-related musculoskeletal disorders are still one of the biggest causes of work activity disruption, productivity loss, sick leave, and chronic work disability. They affect millions of workers throughout Europe, with a large-scale economic and social burden. Specific efforts have so far failed to produce significant results, probably due to the limited availability and high costs of occupational risk assessment at work, especially when the methods are complex, consume excessive resources, or depend on self-evaluations and observations of poor accuracy. To overcome these limitations, a pervasive system of real-time risk assessment tools has been developed, which has the characteristics of a systematic approach, with good precision, usability, and resource efficiency, essential to facilitate the prevention of musculoskeletal disorders in the long term. The system allows the combination of different wearable sensors, placed on different limbs, to be used for data collection and evaluation by a software solution, according to the needs and requirements of each individual working environment. This is done in a non-disruptive manner for both the occupational health expert and the workers. This solution allows us to address different research activities that require, as an essential starting point, the recording of data with ergonomic value of very diverse origin, especially in real work environments. The software platform is presented here with a complementary smart clothing system for data acquisition, comprised of a T-shirt containing inertial measurement units (IMU), a vest sensorized with textile electronics, a wireless electrocardiogram (ECG) and thoracic electrical bio-impedance (TEB) recorder, and a glove sensorized with variable resistors dependent on the angular position of the wrist. The collected data are processed in real time through a mobile application software solution, implemented on commercially available Android-based smartphone and tablet platforms. Based on the collection of this information and its analysis, real-time risk assessment and feedback about postural improvement are possible, adapted to different contexts. The result is a tool which provides added value to ergonomists and occupational health agents, as in situ analysis of postural behavior can assist in a quantitative manner in the evaluation of work techniques and the occupational environment.
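
One building block of such real-time posture feedback is estimating trunk inclination from the gravity component of the T-shirt IMU's accelerometer and flagging sustained awkward postures; the sketch below does this with invented sample data and an illustrative 45-degree threshold, not the thresholds or algorithms of the described system.

```python
# Hypothetical sketch: trunk inclination from an accelerometer's gravity vector
# and a simple exposure flag for sustained forward bending.
import numpy as np

def inclination_deg(acc):
    """Angle between the sensor z-axis (upright trunk) and gravity, in degrees."""
    acc = acc / np.linalg.norm(acc, axis=1, keepdims=True)
    return np.degrees(np.arccos(np.clip(acc[:, 2], -1.0, 1.0)))

fs = 50                                                  # assumed IMU sampling rate, Hz
upright = np.tile([0.05, 0.0, 0.99], (fs * 20, 1))       # ~20 s upright
bent = np.tile([0.0, 0.77, 0.64], (fs * 15, 1))          # ~15 s bent forward (~50 deg)
acc = np.vstack([upright, bent]) + np.random.default_rng(1).normal(0, 0.01, (fs * 35, 3))

angles = inclination_deg(acc)
sustained = np.mean(angles > 45.0)                       # fraction of time above an assumed limit
print(f"median inclination: {np.median(angles):.1f} deg, time > 45 deg: {sustained:.0%}")
```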

Keywords: ergonomics, mobile technologies, risk assessment, smart textiles

Procedia PDF Downloads 119