Search results for: Standard FEMA.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1670

1670 Seismic Performance Assessment of Pre-70 RC Frame Buildings with FEMA P-58

Authors: D. Cardone

Abstract:

Past earthquakes have shown that seismic events can cause large economic losses in buildings. FEMA P-58 provides engineers with a practical tool for the performance-based seismic assessment of buildings. In this study, FEMA P-58 is applied to two typical Italian pre-1970 reinforced concrete frame buildings, characterized by plain rebars as steel reinforcement and by masonry infills and partitions. Given that suitable tools for these buildings are missing in FEMA P-58, specific fragility curves and loss functions are first developed. Next, building performance is evaluated following a time-based assessment approach. Finally, expected annual losses for the selected buildings are derived and compared with past applications to old RC frame buildings representative of the US building stock.
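
The time-based assessment mentioned above ultimately integrates losses over the site hazard to obtain the expected annual loss (EAL). The sketch below illustrates only that final step with entirely hypothetical hazard and vulnerability curves; it is not the FEMA P-58 toolkit or the data used in the paper.

```python
import numpy as np

# Hypothetical intensity-measure grid, e.g. Sa(T1) in g (illustrative values only).
im = np.linspace(0.05, 2.0, 40)

# Assumed mean annual frequency of exceeding each IM level (hazard curve, made up).
lam = np.clip(0.4e-3 * im ** -2.5, None, 0.5)

# Assumed expected loss ratio given IM (vulnerability curve), saturating at total loss.
expected_loss_ratio = 1.0 - np.exp(-1.5 * im ** 2)

# EAL = integral of E[loss | IM] * |d(lambda)/d(IM)| over the IM range.
eal_ratio = np.trapz(expected_loss_ratio * np.abs(np.gradient(lam, im)), im)

replacement_cost = 2.0e6  # hypothetical building replacement cost
print(f"Expected annual loss ~ {eal_ratio * replacement_cost:,.0f} per year")
```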

Keywords: FEMA P-58, RC frame buildings, plain rebars, masonry infills, fragility functions, loss functions, expected annual loss.

1669 Reliability Analysis of P-I Diagram Formula for RC Column Subjected to Blast Load

Authors: Masoud Abedini, Azrul A. Mutalib, Shahrizan Baharom, Hong Hao

Abstract:

This study was conducted to investigate the reliability of the pressure-impulse (P-I) equations for reinforced concrete columns proposed in previous studies. The equations involve three damage criterion levels, D = 0.2, D = 0.5 and D = 0.8: damage between 0 and 0.2 is considered minor, 0.2-0.5 is moderate damage, 0.5-0.8 is high damage, and 0.8-1 is considered failure of the structure. In this study, two types of reliability analyses were conducted. The first uses the pressure-impulse equation with different parameters; the parameters involved are the concrete strength, the column depth, width, and height, the longitudinal reinforcement ratio and the transverse reinforcement ratio. In this first reliability analysis, a new equation is derived to improve the previous equations. The second reliability analysis involves three types of columns used to derive the P-I curve diagram with the derived equation, which is then compared with the equations derived by other researchers and with the minimum standoff versus weapon yield chart of the Federal Emergency Management Agency (FEMA). The results show that the derived equation agrees more closely with the FEMA standard than those of previous researchers.
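
The damage thresholds quoted above map directly onto discrete damage levels; the small helper below (the function name is illustrative, not from the paper) encodes that classification.

```python
def damage_level(d: float) -> str:
    """Classify the damage index D of an RC column under blast load
    using the thresholds quoted in the abstract (0.2, 0.5, 0.8)."""
    if not 0.0 <= d <= 1.0:
        raise ValueError("damage index must lie in [0, 1]")
    if d <= 0.2:
        return "minor"
    if d <= 0.5:
        return "moderate"
    if d <= 0.8:
        return "high"
    return "failure"

print(damage_level(0.35))  # -> moderate
```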

Keywords: Blast load, RC column, P-I curve, Analytical formulae, Standard FEMA.

1668 A Three-Element Vector-Valued Structure’s Ultimate Strength-Strong Motion-Intensity Measure

Authors: A. Nicknam, N. Eftekhari, A. Mazarei, M. Ganjvar

Abstract:

This article presents an alternative collapse capacity intensity measure in three-element form, which is influenced by spectral ordinates at periods longer than the first-mode period at near- and far-source sites. A parameter, denoted by β, is defined by which the effects of spectral ordinates up to the effective period (2T1) on the intensity measure are taken into account. The methodology permits meeting the hazard-level target extreme event in both probabilistic and deterministic forms. A MATLAB code coupled with OpenSees is developed to calculate the collapse capacities of 8 archetype RC structures of 2 to 20 stories for the regression process. The incremental dynamic analysis (IDA) method is used to calculate the structures' collapse values, accounting for element stiffness and strength deterioration. The general near-field record set presented by FEMA is used in a series of nonlinear analyses. Eight linear relationships are developed for the 8 structures, leading to correlation coefficients of up to 0.93. A collapse capacity near-field prediction equation is developed taking into account the results of the regression processes obtained from the 8 structures. The proposed prediction equation is validated against a set of actual near-field records, showing good agreement. Applying the proposed equation to four archetype RC structures yields collapse capacities at near-field sites that differ from those of FEMA. These differences are believed to result from accounting for spectral shape effects.

Keywords: Collapse capacity, fragility analysis, spectral shape effects, IDA method.

1667 Evaluation of Iranian Standard for Assessment of Liquefaction Potential of Cohesionless Soils Based on Standard Penetration Test

Authors: Reza Ziaie Moayad, Azam Kouhpeyma

Abstract:

In-situ testing is preferred for evaluating the liquefaction potential of cohesionless soils because of the high disturbance of these soils during sampling. Although new in-situ methods with high accuracy have been developed, the standard penetration test, the simplest and oldest in-situ test, is still used due to the profusion of recorded data. This paper reviews the Iranian standard for evaluating liquefaction potential in soils (Code 525) and compares the liquefaction assessment methods based on standard penetration test (SPT) results on cohesionless soil in this standard with international standards. To this end, methods for assessing liquefaction potential are compared with what is presented in Standard 525. It is found that although the procedure used in the Iranian standard for evaluating liquefaction potential has not been updated according to new findings, it is a conservative procedure.
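
SPT-based liquefaction checks of the kind compared in the paper generally reduce to a factor of safety between cyclic resistance and cyclic stress. The sketch below uses the widely known Seed-Idriss simplified expression for the cyclic stress ratio with made-up layer data; it is not the specific formulation of Standard 525.

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Seed-Idriss simplified CSR; rd uses the common linear approximation
    valid for shallow depths (z < 9.15 m)."""
    rd = 1.0 - 0.00765 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def factor_of_safety(crr_7p5, csr, msf=1.0):
    """FS against liquefaction = CRR(M7.5) * magnitude scaling factor / CSR."""
    return crr_7p5 * msf / csr

# Hypothetical layer: 5 m deep, a_max = 0.35 g, total/effective stresses in kPa,
# CRR taken from an assumed SPT-based correlation.
csr = cyclic_stress_ratio(a_max_g=0.35, sigma_v=90.0, sigma_v_eff=55.0, depth_m=5.0)
print(f"CSR = {csr:.3f}, FS = {factor_of_safety(crr_7p5=0.25, csr=csr):.2f}")
```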

Keywords: cohesionless soil, liquefaction, SPT, Iranian liquefaction standard

1666 Seismic Fragility Assessment of Strongback Steel Braced Frames Subjected to Near-Field Earthquakes

Authors: Mohammadreza Salek Faramarzi, Touraj Taghikhany

Abstract:

In this paper, the seismic fragility of a recently developed hybrid structural system, known as the strongback system (SBS), is investigated. In this system, to mitigate the occurrence of a soft-story mechanism and improve the distribution of story drifts over the height of the structure, an elastic vertical truss is formed. The strengthened members of the braced span are designed to remain substantially elastic during levels of excitation where soft-story mechanisms are likely to occur and to impose a nearly uniform story drift distribution. Due to the distinctive characteristics of near-field ground motions, it is necessary to study the effect of these records on the seismic performance of the SBS. To this end, a set of 56 near-field ground motion records suggested by the FEMA P695 methodology is used. For fragility assessment, nonlinear dynamic analyses are carried out in OpenSEES based on the procedure recommended in the HAZUS technical manual. Four damage states, including slight, moderate, extensive, and complete damage (collapse), are considered. To evaluate each damage state, the inter-story drift ratio and floor acceleration are used as engineering demand parameters. Further, to extend the evaluation of the collapse state of the system, a different collapse criterion suggested in FEMA P695 is applied. It is concluded that the SBS can significantly increase the collapse capacity and consequently decrease the collapse risk of the structure during its lifetime. Comparing the observed mean annual frequency (MAF) of exceedance of each damage state against the allowable values presented in performance-based design methods, it is found that using the elastic vertical truss improves the structural response effectively.
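
HAZUS-style fragility assessment expresses each damage state as a lognormal fragility curve and, for a time-based view, integrates it over the site hazard. A minimal sketch with assumed fragility parameters and an assumed hazard curve is given below; the numbers are illustrative, not those of the strongback frames studied.

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """Lognormal fragility: P(damage state exceeded | IM), as in HAZUS-style assessments."""
    return norm.cdf(np.log(im / median) / beta)

# Hypothetical hazard curve lambda(IM) and fragility parameters (illustrative only).
im = np.linspace(0.05, 3.0, 60)
lam = 1e-3 * im ** -2.3                       # assumed mean annual rate of exceeding IM
p_ds = fragility(im, median=1.2, beta=0.45)   # assumed "extensive damage" fragility

# Mean annual frequency of exceeding the damage state: integrate fragility over the hazard.
maf = np.trapz(p_ds * np.abs(np.gradient(lam, im)), im)
print(f"MAF of exceeding the damage state ~ {maf:.2e} per year")
```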

Keywords: Strongback System, Near-fault, Seismic fragility, Uncertainty, IDA, Probabilistic performance assessment.

1665 The Creation of Contemporary Apparel Inspired by the Structural Pattern of the Sofa in Vimanmek Mansion

Authors: Chanoknart Mayusoh

Abstract:

In most apparel creation, the designer uses a standard pattern as the fundamental of pattern making. In the design of each kind of apparel, the standard pattern is the starting point of production. The importance of the standard pattern is that it allows the apparel to fit most people; for this reason the standard pattern is standardized to be the same, and regardless of the type of apparel, its standard pattern is produced in a similar way. However, the author observes that apparel design, for any type of apparel, has to rely on the standard pattern as its fundamental, and this seems to be a limitation of apparel design for which no alternative has been developed. In this research on the creation of contemporary apparel inspired by the sofa's pattern structure in Vimanmek Mansion, the author applied the patterns of the sofa and armchair, instead of the standard pattern, as the basis of the apparel design, creating new forms of structures and shapes that make the contemporary apparel more interesting and different than before, suitable for daily use, and a new alternative for apparel design. Those who are interested in this idea can apply it and develop it further into more varieties.

Keywords: Contemporary Apparel, Sofa’s Pattern, Armchair’s Pattern, Vimanmek Mansion.

1664 Non-Standard Monetary Policy Measures and Their Consequences

Authors: Aleksandra Nocoń (Szunke)

Abstract:

The study is a review of the literature concerning the consequences of non-standard monetary policy measures, which are used by central banks during unconventional periods that threaten banking sector stability. In particular, attention is paid to the effects of non-standard monetary policy tools on financial markets. However, the empirical evidence about their effects and real consequences for financial markets is still not conclusive. The main aim of the study is to survey the consequences of standard and non-standard monetary policy instruments implemented during the global financial crisis in the United States, the United Kingdom and the euro area, with particular attention to the results for the stabilization of global financial markets. The study consists mainly of an empirical review indicating the impact of the implementation of these tools on financial markets. The following research methods were used in the study: literature studies, including domestic and foreign literature, cause-and-effect analysis and statistical analysis.

Keywords: Asset purchase facility, consequences of monetary policy instruments, non-standard monetary policy, Quantitative Easing.

1663 A Non-Standard Finite Difference Scheme for the Solution of Laplace Equation with Dirichlet Boundary Conditions

Authors: Khaled Moaddy

Abstract:

In this paper, we present a fast and accurate numerical scheme for the solution of a Laplace equation with Dirichlet boundary conditions. The non-standard finite difference scheme (NSFD) is applied to construct the numerical solutions of a Laplace equation with two different Dirichlet boundary conditions. The solutions obtained using NSFD are compared with the solutions obtained using the standard finite difference scheme (SFD). The NSFD scheme is demonstrated to be reliable and efficient.
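
For reference, the standard five-point finite difference (SFD) treatment of the Laplace equation with Dirichlet data can be sketched as below with a simple Jacobi iteration; the non-standard (NSFD) scheme of the paper modifies the discretization and is not reproduced here.

```python
import numpy as np

def solve_laplace_sfd(n=51, tol=1e-6, max_iter=20000):
    """Standard five-point finite difference (Jacobi) solution of Laplace's equation
    on the unit square with Dirichlet data u = 1 on the top edge and u = 0 elsewhere."""
    u = np.zeros((n, n))
    u[0, :] = 1.0                     # top boundary (Dirichlet condition)
    for _ in range(max_iter):
        u_new = u.copy()
        # each interior node becomes the average of its four neighbours
        u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                    u[1:-1, :-2] + u[1:-1, 2:])
        if np.max(np.abs(u_new - u)) < tol:
            return u_new
        u = u_new
    return u

u = solve_laplace_sfd()
print(f"value at the centre of the domain: {u[25, 25]:.4f}")
```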

Keywords: Standard finite difference schemes, non-standard schemes, Laplace equation, Dirichlet boundary conditions.

1662 Standard Fuzzy Sets for Aircraft Selection using Multiple Criteria Decision Making Analysis

Authors: C. Ardil

Abstract:

This study uses two-dimensional standard fuzzy sets to enhance multiple criteria decision-making analysis for passenger aircraft selection, allowing decision-makers to express judgments with uncertain and vague information. Using two-dimensional fuzzy numbers, three decision makers evaluated three aircraft alternatives according to seven decision criteria. A validity analysis based on two-dimensional standard fuzzy weighted geometric (SFWG) and two-dimensional standard fuzzy weighted average (SFGA) operators is conducted to test the proposed approach's robustness and effectiveness in the fuzzy multiple criteria decision making (MCDM) evaluation process. 

Keywords: Standard fuzzy sets (SFSs), aircraft selection, multiple criteria decision making, intuitionistic fuzzy sets (IFSs), SFWG, SFGA, MCDM

1661 A Comparative Study of Standard, Casted and Riveted Eye Design of a Mono Leaf Spring Using CAE Tools

Authors: Gian Bhushan, Vinkel Arora, M.L. Aggarwal

Abstract:

The objective of the present study is to determine a better eye-end design for a mono leaf spring used in a light motor vehicle. A conventional 65Si7 spring steel leaf spring model with standard eye, casted eye and riveted eye ends is considered. The CAD models of the leaf springs are prepared in CATIA and analyzed using ANSYS. The standard eye, casted and riveted eye leaf springs are subjected to similar loading conditions. The CAE analysis of the leaf spring is performed for parameters such as deflection and von Mises stress. A mass reduction of 62.9% is achieved in the case of the riveted eye mono leaf spring as compared to the standard eye mono leaf spring for the same loading conditions.

Keywords: CAE, Leaf Spring, 65Si7 spring steel.

1660 Detecting Abnormal ECG Signals Utilising Wavelet Transform and Standard Deviation

Authors: Dejan Stantic, Jun Jo

Abstract:

The ECG contains very important clinical information about the cardiac activity of the heart. Often the ECG signal needs to be captured for a long period of time in order to identify abnormalities in certain situations. Such a signal, apart from its large volume, is often characterised by low quality due to noise and other influences. In order to extract features from an ECG signal with time-varying characteristics, it first needs to be preprocessed with the best parameters. It is also useful to identify specific parts of the long-lasting signal which show certain abnormalities and to direct the practitioner to those parts of the signal. In this work we present a method based on the wavelet transform, the standard deviation and a variable threshold which achieves 100% accuracy in identifying the ECG signal peaks and heartbeats, as well as identifying the standard deviation, providing a quick reference to abnormalities.
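
A minimal sketch of the general pipeline described above, wavelet-based denoising followed by a variable threshold of mean plus a multiple of the standard deviation, is shown below using PyWavelets and SciPy on a synthetic trace; the parameter choices are assumptions, not the authors' settings.

```python
import numpy as np
import pywt
from scipy.signal import find_peaks

def detect_beats(ecg, fs, wavelet="db4", level=4, k=2.0):
    """Denoise an ECG trace with a discrete wavelet transform, then flag peaks
    whose amplitude exceeds mean + k * standard deviation (a variable threshold)."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    coeffs[-1] = np.zeros_like(coeffs[-1])        # crude denoising: drop finest details
    clean = pywt.waverec(coeffs, wavelet)[: len(ecg)]
    threshold = clean.mean() + k * clean.std()
    peaks, _ = find_peaks(clean, height=threshold, distance=int(0.25 * fs))
    return peaks, clean

# Synthetic example: 10 s of a 1 Hz spiky "heartbeat" plus noise, sampled at 250 Hz.
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.0 * t) ** 63 + 0.05 * np.random.randn(t.size)
peaks, _ = detect_beats(ecg, fs)
print(f"detected {len(peaks)} beats, heart rate ~ {len(peaks) * 6} bpm")
```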

Keywords: Electrocardiogram-ECG, Arrhythmia, Signal Processing, Wavelet Transform, Standard Deviation

1659 Dosimetric Comparison of aSi1000 EPID and ImatriXX 2-D Array System for Volumetric Modulated Arc and Intensity Modulated Radiotherapy Patient Specific Quality Assurance

Authors: Jayesh K., Ganesh T., Suganthi D., Mohan R., Rakesh C. J., Sarojkumar D. M., Jacob S. J.

Abstract:

Prior to the use of the detectors, a characteristics comparison study was performed and a baseline established. In patient-specific QA, the portal dosimetry mean values of area gamma, average gamma and maximum gamma were 1.02, 0.31 and 1.31, with standard deviations of 0.33, 0.03 and 0.14, for IMRT, and the corresponding values were 1.58, 0.48 and 1.73, with standard deviations of 0.31, 0.06 and 0.66, for VMAT. With the ImatriXX 2-D array system, on average 99.35% of the pixels passed the 3%-3 mm gamma criterion, with a standard deviation of 0.24, for dynamic IMRT. For VMAT, the average value was 98.16% with a standard deviation of 0.86. The results showed that both systems can be used in patient-specific QA measurements for IMRT and VMAT. The values obtained with the portal dosimetry system were found to be relatively more consistent than those obtained with the ImatriXX 2-D array system.
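
The 3%-3 mm criterion quoted above is the gamma index. The toy sketch below computes a simplified one-dimensional global gamma on hypothetical planned and measured profiles; clinical QA software evaluates the same quantity on 2-D or 3-D dose grids.

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Simplified 1-D global gamma index: for each reference point, take the minimum
    over all evaluated points of sqrt((dose diff / 3%)^2 + (distance / 3 mm)^2)."""
    norm_dose = ref_dose.max()                      # global normalisation
    gammas = []
    for xr, dr in zip(ref_pos, ref_dose):
        dose_term = (eval_dose - dr) / (dd * norm_dose)
        dist_term = (eval_pos - xr) / dta
        gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
    return np.array(gammas)

# Hypothetical planned vs measured dose profiles along one axis (positions in mm).
x = np.linspace(-50, 50, 201)
planned = 2.0 * np.exp(-(x / 30.0) ** 2)            # TPS-calculated profile (Gy)
measured = planned * 1.01 + 0.005                   # detector-measured profile (Gy)
g = gamma_1d(x, planned, x, measured)
print(f"gamma pass rate (3%/3 mm): {100 * np.mean(g <= 1.0):.1f}%")
```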

Keywords: Gamma, IMRT, QA, TPS, VMAT.

1658 Fairness and Quality of Service Issues and Analysis of IEEE 802.11e Wireless LAN

Authors: Ammar Abbas, Ibrahim M. Hussain, Osama M. Hussain

Abstract:

The IEEE 802.11e standard, an enhanced version of the 802.11 WLAN standards, incorporates Quality of Service (QoS), which makes it a better choice for multimedia and real-time applications. In this paper we study various aspects of the 802.11e standard. Further, the analysis results for this standard are compared with those of the legacy 802.11 standard. Simulation results show that IEEE 802.11e outperforms legacy IEEE 802.11 in terms of quality of service due to its flow-differentiated channel allocation and better queue management architecture. We also propose a method to improve the unfair allocation of bandwidth between downlink and uplink channels by varying the medium access priority level.

Keywords: Wireless, IEEE 802.11e, EDCA, Throughput, QoS, MAC.

1657 Hardware Implementations for the ISO/IEC 18033-4:2005 Standard for Stream Ciphers

Authors: Paris Kitsos

Abstract:

In this paper the FPGA implementations of four stream ciphers are presented. Two of the stream ciphers, MUGI and SNOW 2.0, have recently been adopted by the International Organization for Standardization in the ISO/IEC 18033-4:2005 standard. The other two stream ciphers, MICKEY 128 and TRIVIUM, have been submitted and are under consideration for eSTREAM, the ECRYPT (European Network of Excellence for Cryptology) Stream Cipher project. All ciphers were coded in the VHDL language. For the hardware implementation, an FPGA device was used. The proposed implementations achieve throughputs ranging from 166 Mbps for MICKEY 128 to 6080 Mbps for MUGI.

Keywords: Cryptography, ISO/IEC 18033-4:2005 standard, Hardware implementation, Stream ciphers

1656 Non-Destructive Evaluation of Launch Tube Welds with Radiography

Authors: Tosapolporn Pornpibunsompop

Abstract:

The non-destructive testing of launch tube welds with radiography was investigated and evaluated against the AWS D1.1 standard. The work started with the preparation of the launch tube and the radiographic inspection. X-ray inspection was then carried out and the results obtained. The inspection results were judged by a certified person and, finally, the evaluation against the AWS D1.1 standard was conducted. The results show that weld position P1 did not conform to AWS D1.1, which allows incomplete penetration only up to 4 mm in size; the other welds conformed to the standard. Additionally, corrective actions for the incomplete penetration are provided for future work.

Keywords: Non-destructive evaluation, Weld, Launch tube, Radiography

1655 Developing an Online Library for Faster Retrieval of Mold Base and Standard Parts of Injection Molding

Authors: Alan C. Lin, Ricky N. Joevan

Abstract:

This paper focuses on developing a system to transfer mold base plates and standard parts faster during the injection mold design stage. The system not only provides a way to compare file versions, but also utilizes Siemens NX 10 to isolate the updated information into a single executable file (.dll), so that this file can be transferred without transferring the whole model. In this way, the system helps the user download only the necessary mold base plates and standard parts, and only the updated portions of those parts.

Keywords: CAD, injection molding, mold base, data retrieval.

1654 Quality Service Standard of Food and Beverage Service Staff in Hotel

Authors: Thanasit Suksutdhi

Abstract:

This survey research aims to study the standard of service quality of food and beverage service staff in the hotel business by studying the service standards of three sample hotels: Siam Kempinski Hotel Bangkok, Four Seasons Resort Chiang Mai, and Banyan Tree Phuket. In order to identify the international service standard of food and beverage service, a triangulation of quantitative, qualitative, and survey methods was employed. In this research, questionnaires and in-depth interviews were used to obtain information on the sequences and methods of service. The questionnaire had three parts, modified to measure service quality and guest satisfaction, covering service facilities, attentiveness, responsibility, reliability, and circumspection. The study used simple random sampling to derive subjects; the return rate of the questionnaires was 70%, or 280 questionnaires. Data were analyzed with SPSS to find arithmetic means, standard deviations and percentages, and compared by t-test and one-way ANOVA. The results revealed that the service quality of the three hotels was at an international level that created high satisfaction among international customers. Recommendations from the research were to maintain the areas of good service quality and to improve some dimensions of service quality, such as reliability. Training in service standards, product knowledge, and new technology should be provided for employees. Furthermore, in order to develop the service quality of the industry, training collaboration between hotel organizations and educational institutions in food and beverage service should be considered.

Keywords: Service standard, food and beverage department, sequence of service, service method.

1653 Design and Implementation of Medium Access Control Based Routing on Real Wireless Sensor Networks Testbed

Authors: Smriti Agarwal, Ashish Payal, B. V. R. Reddy

Abstract:

IEEE 802.15.4 is a Low Rate Wireless Personal Area Network (LR-WPAN) standard which, combined with ZigBee, is going to enable new applications in the Wireless Sensor Networks (WSNs) and Internet of Things (IoT) domains. In recent years, it has become a popular standard for WSNs. Wireless communication among sensor motes, enabled by the IEEE 802.15.4 standard, is extensively replacing existing wired technology in a wide range of monitoring and control applications. Researchers have proposed routing frameworks and mechanisms that interact with the IEEE 802.15.4 standard using software platforms. In this paper, we have designed and implemented MAC-based routing (MBR) based on the IEEE 802.15.4 standard using a hardware platform, “SENSEnuts”. The experimental results include data from light and temperature sensors obtained from communication between the PAN coordinator and the source node through a coordinator, the MAC addresses of the modules used in the experimental setup, the topology of the network created for simulation, and the remaining battery power of the source node. Our experimental effort on a WSN testbed has helped us in bridging the gap between the theoretical and practical aspects of implementing IEEE 802.15.4 for WSN applications.

Keywords: IEEE 802.15.4, routing, wireless sensor networks, ZigBee.

1652 Design Techniques and Implementation of Low Power High-Throughput Discrete Wavelet Transform Filters for the JPEG 2000 Standard

Authors: Grigorios D. Dimitroulakos, N. D. Zervas, N. Sklavos, Costas E. Goutis

Abstract:

In this paper, the implementation of low-power, high-throughput convolutional filters for the one-dimensional Discrete Wavelet Transform and its inverse is presented. The analysis filters have already been used for the implementation of a high performance DWT encoder [15] with minimum memory requirements for the JPEG 2000 standard. This paper presents the design techniques and the implementation of the convolutional filters included in the JPEG 2000 standard for the forward and inverse DWT, aiming at low-power operation, high performance and reduced memory accesses. Moreover, the filters have the ability to perform progressive computations so as to minimize the buffering between the decomposition and reconstruction phases. The experimental results illustrate the filters' low-power, high-throughput characteristics as well as their memory-efficient operation.
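
As an illustration of the convolutional filtering the paper implements in hardware, the sketch below runs one analysis level of a 1-D DWT in software with the Le Gall 5/3 filter pair used by the reversible path of JPEG 2000; the border handling and downsampling phases are simplified assumptions, not the paper's architecture.

```python
import numpy as np

# Le Gall 5/3 analysis filter pair (the reversible filters of JPEG 2000).
LOWPASS = np.array([-1.0, 2.0, 6.0, 2.0, -1.0]) / 8.0
HIGHPASS = np.array([-1.0, 2.0, -1.0]) / 2.0

def dwt_5_3(signal):
    """One analysis level of a convolutional 1-D DWT: symmetric border extension,
    filtering, and downsampling by two (the phase choice here is illustrative)."""
    padded = np.pad(signal, 4, mode="symmetric")
    low = np.convolve(padded, LOWPASS, mode="same")
    high = np.convolve(padded, HIGHPASS, mode="same")
    approx = low[4:-4:2]    # even-phase samples of the low-pass output
    detail = high[5:-3:2]   # odd-phase samples of the high-pass output
    return approx, detail

# On a linear ramp the 5/3 high-pass output vanishes away from the borders.
a, d = dwt_5_3(np.arange(16, dtype=float))
print("approximation:", np.round(a, 3))
print("detail:       ", np.round(d, 3))
```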

Keywords: Discrete Wavelet Transform, JPEG2000 standard, VLSI design, low-power throughput-optimized filters.

1651 Decimation Filter Design Toolbox for Multi-Standard Wireless Transceivers using MATLAB

Authors: Shahana T. K., Babita R. Jose, K. Poulose Jacob, Sreela Sasi

Abstract:

The demand for new telecommunication services requiring higher capacities, data rates and different operating modes has motivated the development of a new generation of multi-standard wireless transceivers. A multi-standard design often involves extensive system-level analysis and architectural partitioning, typically requiring extensive calculations. In this research, a decimation filter design tool for the wireless communication standards GSM, WCDMA, WLANa, WLANb, WLANg and WiMAX is developed in MATLAB® using the GUIDE environment for visual analysis. The user can select a required wireless communication standard and obtain the corresponding multistage decimation filter implementation using this toolbox. The toolbox helps the user or design engineer to perform a quick design and analysis of decimation filters for multiple standards without extensive calculation of the underlying methods.

Keywords: Decimation filter, MATLAB® toolbox, Multistandard transceivers, Sigma-delta A/D converter.

1650 Multi-Objective Optimization for Performance-based Seismic Retrofit using Connection Upgrade

Authors: Dong-Chul Lee, Byung-Kwan Oh, Se-Woon Choi, Hyo-Sun Park

Abstract:

The unanticipated brittle fracture of connections of steel moment resisting frames (SMRF) occurred in the 1994 Northridge earthquake. Since then, research on the vulnerability of connections of existing SMRFs and on the rehabilitation of those buildings has been conducted. This paper suggests a performance-based optimal seismic retrofit technique using connection upgrade. For optimal design, a multi-objective genetic algorithm (NSGA-II) is used. One of the two objective functions is to minimize the initial cost, and the other is to minimize the lifetime seismic damage cost. The optimal algorithm proposed in this paper is performed while satisfying a specified performance objective based on FEMA 356. Nonlinear static analysis is performed for structural seismic performance evaluation. A numerical example of the SAC benchmark SMRF is provided using the performance-based optimal seismic retrofit technique proposed in this paper.
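
The two competing objectives named above, initial cost and lifetime seismic damage cost, are traded off by NSGA-II through non-dominated sorting. The sketch below shows only that core ranking step on hypothetical candidate retrofit designs; it is not the authors' full genetic algorithm.

```python
import random

def pareto_front(designs):
    """Return the non-dominated designs for two minimisation objectives,
    (initial retrofit cost, expected lifetime seismic damage cost).
    Non-dominated sorting of this kind is the core ranking step of NSGA-II."""
    front = []
    for a in designs:
        dominated = any(
            b["initial"] <= a["initial"] and b["damage"] <= a["damage"]
            and (b["initial"] < a["initial"] or b["damage"] < a["damage"])
            for b in designs
        )
        if not dominated:
            front.append(a)
    return sorted(front, key=lambda d: d["initial"])

# Hypothetical candidate connection-upgrade schemes (costs in arbitrary units).
random.seed(1)
candidates = [{"id": i,
               "initial": random.uniform(1.0, 10.0),
               "damage": random.uniform(1.0, 10.0)} for i in range(50)]
for d in pareto_front(candidates):
    print(f"design {d['id']:2d}: initial = {d['initial']:.2f}, damage = {d['damage']:.2f}")
```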

Keywords: Connection upgrade, performance-based seismic design, seismic retrofit, multi-objective optimization.

1649 Mental Vulnerability and Coping Strategies as a Factor for Academic Success for Pupils with Special Education Needs

Authors: T. Dubayova

Abstract:

Slovak as well as foreign authors believe that the influence of non-cognitive factors on a student's academic success or failure is unquestionable. The aim of this paper is to establish a link between the mental vulnerability and coping strategies used by 4th grade elementary school students in dealing with stressful situations and their academic performance, which was used as a simple quantitative indicator of academic success. The research sample consists of 320 students representing the standard population and 60 students with special education needs (SEN), who were assessed with the Strengths and Difficulties Questionnaire (SDQ) by their teachers and with the Children’s Coping Strategies Checklist (CCSC-R1), which they filled in themselves. Students with SEN recorded a markedly higher frequency of mental vulnerability (34.5%) than students representing the standard population (7%). The poorest academic performance of students with SEN was associated with avoidance behavior displayed during stressful situations; students of the standard population did not demonstrate this association. Students with SEN are more likely to display mental health problems than students of the standard population. This may be caused by the accumulation of, and frequent exposure to, situations that they perceive as stressful.

Keywords: Coping, mental vulnerability, students with special education needs, academic performance, academic success.

1648 Survey of Key Management Algorithms in WiMAX

Authors: R. Chithra, B. Kalavathi, J. Christy Lavanya

Abstract:

WiMAX is a telecommunications technology specified by the Institute of Electrical and Electronics Engineers, Inc., as the IEEE 802.16 standard. The goal of this technology is to provide wireless data over long distances in a variety of ways. IEEE 802.16 is a recent standard for mobile communication. In this paper, we provide an overview of various key management algorithms that provide security for WiMAX.

Keywords: Broadcast, Rekeying, Scalability, Secrecy, Unicast, WiMAX.

1647 Design of Extremum Seeking Control with PD Accelerator and its Application to Monod and Williams-Otto Models

Authors: Hitoshi Takata, Tomohiro Hachino, Masaki Horai, Kazuo Komatsu

Abstract:

In this paper, we are concerned with the design and simulation studies of a modified extremum seeking control for nonlinear systems. A standard extremum seeking control has a simple structure, but it takes a long time to reach an optimal operating point. We consider a modification of the standard extremum seeking control aimed at reaching the optimal operating point more quickly than the standard one. In the modification, a PD acceleration term is added before the integrator forming the principal control, so that the system can be regulated to the optimal point smoothly. The proposed method is applied to the Monod and Williams-Otto models to investigate its effectiveness. Numerical simulation results show that the modified method can improve the time response to the optimal operating point compared with the standard one.
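
A minimal simulation of perturbation-based extremum seeking on a static map is sketched below, with optional proportional and derivative gains added in parallel with the integrator as one plausible reading of the PD accelerator described above; it is not the authors' exact control law, and the Monod and Williams-Otto dynamics are not modelled.

```python
import numpy as np

def extremum_seeking(j, t_end=80.0, dt=0.005, a=0.1, omega=5.0, omega_h=0.5,
                     k_i=1.0, k_p=0.0, k_d=0.0, u0=0.0):
    """Perturbation-based extremum seeking on a static map j(u): dither, washout
    (high-pass) filter, demodulation, then an integrator as the principal control.
    k_p and k_d add a PD term in parallel with the integrator (an assumption)."""
    n = int(t_end / dt)
    u_hat, integ, hp, y_prev, xi_prev = u0, 0.0, 0.0, j(u0), 0.0
    for i in range(n):
        t = i * dt
        u = u_hat + a * np.sin(omega * t)                 # dithered input
        y = j(u)
        hp = (1.0 - omega_h * dt) * hp + (y - y_prev)     # washout filter (removes DC)
        y_prev = y
        xi = hp * np.sin(omega * t)                       # demodulated gradient estimate
        integ += k_i * xi * dt                            # principal integral action
        u_hat = u0 + integ + k_p * xi + k_d * (xi - xi_prev) / dt
        xi_prev = xi
    return u_hat

j = lambda u: 1.0 - (u - 2.0) ** 2                        # toy objective, maximum at u = 2
print(f"standard ESC:  u = {extremum_seeking(j):.3f}")
print(f"with PD term:  u = {extremum_seeking(j, k_p=0.5, k_d=0.02):.3f}")
```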

Keywords: Extremum seeking control, Monod model, Williams-Otto model, PD acceleration term, optimal operating point.

1646 Finite Element Modelling of Log Wall Corner Joints

Authors: R. Kalantari, G. Hafeez

Abstract:

The paper presents outcomes of numerical research performed on standard and dovetail corner joints under lateral loads. An overview of past research on log shear walls is also presented. To the authors’ best knowledge, there are currently no specific design guidelines available in the code for the design of log shear walls, implying the need to investigate their performance. This research explores the performance of log shear wall corner joint systems of standard and dovetail types using numerical methods based on research available in the literature. A parametric study is performed to examine the effect of the gap size provided between two orthogonal logs, and of the presence of wood and steel dowels provided as joinery between log courses, on the performance of such a structural system. The research outcomes are the force-displacement curves. A variability of 8% is seen in the reaction forces with the change of gap size for the standard joint, while a variation of 10% is observed in the reaction forces for the dovetail joint system.

Keywords: dovetail joint, finite element modelling, log shear walls, standard joint

1645 The Determination of Heavy Metal in Herb Used in Dusit Community to Develop a Sustainable Quality of Life

Authors: Chinnawat Satsananan

Abstract:

This research aimed to determine the amount of heavy metals in herbs used in the Dusit community and to compare the heavy metal content in each part of the herbs with the standard values given in Thai herb books, in order to develop a sustainable quality of life. The results for the 14 herbs studied showed no samples with heavy metal contamination: the quantities of four heavy metals, Cd, Co, Fe and Pb, were lower than the standards of two organizations, the Thai herb standard and the World Health Organization. The tests also showed that all 14 herbs contained Fe in every part, and Fe is necessary for our health.

Keywords: Herb plants, heavy metal, Dusit District

1644 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques

Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah

Abstract:

Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported and recommendations for future studies proposed.

Keywords: BIM, Construction projects, Cost estimation, NRM, Ontology.

1643 Standard Deviation of Mean and Variance of Rows and Columns of Images for CBIR

Authors: H. B. Kekre, Kavita Patil

Abstract:

This paper describes a novel and effective approach to content-based image retrieval (CBIR) that represents each image in the database by a vector of feature values called the “standard deviation of mean vectors of color distribution of rows and columns of images for CBIR”. In many areas of commerce, government, academia, and hospitals, large collections of digital images are being created. This paper describes an approach that uses image content as the feature vector for retrieval of similar images. Several classes of features are used to specify queries: color, texture, shape, and spatial layout. Color features are often easily obtained directly from the pixel intensities. In this paper feature extraction is done for the texture descriptors 'variance' and 'variance of variances'. First, the standard deviation of the row means and of the column means is calculated for the R, G, and B planes. These six values are obtained for one image and act as a feature vector. Secondly, we calculate the variances of the rows and columns of the R, G and B planes of an image. Then the six standard deviations of these variance sequences are calculated to form a feature vector of dimension six. We applied our approach to a database of 300 BMP images. We determined the capability of automatic indexing by analyzing image content, with color and texture as features, and by applying the Euclidean distance as a similarity measure.
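
A direct sketch of the feature extraction and retrieval steps described above is given below, with the two 6-dimensional vectors combined into a single 12-dimensional feature for simplicity; the random images only stand in for the 300-image BMP database.

```python
import numpy as np

def feature_vector(image):
    """12-D feature: for each of the R, G, B planes, the standard deviation of the
    row means and of the column means (6 values), plus the standard deviation of
    the row variances and of the column variances (6 values)."""
    feats = []
    for plane in range(3):
        p = image[:, :, plane].astype(float)
        feats += [p.mean(axis=1).std(), p.mean(axis=0).std()]   # std of row/column means
    for plane in range(3):
        p = image[:, :, plane].astype(float)
        feats += [p.var(axis=1).std(), p.var(axis=0).std()]     # std of row/column variances
    return np.array(feats)

def retrieve(query, database):
    """Rank database images by Euclidean distance between feature vectors."""
    q = feature_vector(query)
    dists = [np.linalg.norm(q - feature_vector(img)) for img in database]
    return np.argsort(dists)

# Synthetic stand-in for a BMP collection: random 64x64 RGB images.
rng = np.random.default_rng(0)
db = [rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8) for _ in range(20)]
print("ranking for query image 0:", retrieve(db[0], db)[:5])
```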

Keywords: Standard deviation, image retrieval, color distribution, variance, variance of variance, Euclidean distance.

1642 An Ontology Model for Systems Engineering Derived from ISO/IEC/IEEE 15288: 2015: Systems and Software Engineering - System Life Cycle Processes

Authors: Lan Yang, Kathryn Cormican, Ming Yu

Abstract:

ISO/IEC/IEEE 15288: 2015, Systems and Software Engineering - System Life Cycle Processes is an international standard that provides generic top-level process descriptions to support systems engineering (SE). However, the processes defined in the standard need improvement to lift integrity and consistency. The goal of this research is to explore a way of doing so by building an ontology model for the SE standard to manage SE knowledge. The ontology model gives a whole picture of the SE knowledge domain by building connections between SE concepts. Moreover, it creates a hierarchical classification of the concepts to fulfil different requirements for displaying and analysing SE knowledge.

Keywords: Knowledge management, model-based systems engineering, ontology modelling, systems engineering ontology.

1641 Fuzzy Multiple Criteria Decision Making for Unmanned Combat Aircraft Selection Using Proximity Measure Method

Authors: C. Ardil

Abstract:

Intuitionistic fuzzy sets (IFS), Pythagorean fuzzy sets (PyFS), picture fuzzy sets (PFS), q-rung orthopair fuzzy sets (q-ROF), spherical fuzzy sets (SFS), T-spherical fuzzy sets, and neutrosophic sets (NS) are reviewed as multidimensional extensions of fuzzy sets that describe the opinions of decision-making experts under uncertainty more explicitly and informatively. To handle operations with standard fuzzy sets (SFS), the necessary operators, the weighted arithmetic mean (WAM), the weighted geometric mean (WGM), and the Minkowski distance function, are defined. The algorithm of the proposed proximity measure method (PMM) is provided as a multiple criteria group decision making (MCDM) method for use in a standard fuzzy set environment. To demonstrate the feasibility of the proposed method, the problem of selecting the best drone for an Air Force procurement request is used. The proximity measure method (PMM) based on multidimensional standard fuzzy sets (SFS) is introduced and demonstrated on a problem involving unmanned combat aircraft selection.
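
For orientation, the sketch below shows crisp versions of the three operators named above, WAM, WGM and the Minkowski distance, on hypothetical aircraft scores; the paper's two-dimensional standard fuzzy counterparts operate on fuzzy membership pairs rather than crisp numbers.

```python
import numpy as np

def weighted_arithmetic_mean(scores, weights):
    """WAM aggregation of criterion scores."""
    w = np.asarray(weights) / np.sum(weights)
    return float(np.sum(w * np.asarray(scores)))

def weighted_geometric_mean(scores, weights):
    """WGM aggregation of criterion scores (scores must be positive)."""
    w = np.asarray(weights) / np.sum(weights)
    return float(np.prod(np.asarray(scores, dtype=float) ** w))

def minkowski(a, b, p=2):
    """Minkowski distance of order p between two score vectors."""
    return float(np.sum(np.abs(np.asarray(a) - np.asarray(b)) ** p) ** (1.0 / p))

# Hypothetical normalised scores of one aircraft on seven criteria, and criterion weights.
scores = [0.7, 0.9, 0.6, 0.8, 0.5, 0.9, 0.7]
weights = [0.20, 0.15, 0.15, 0.15, 0.10, 0.15, 0.10]
ideal = [1.0] * 7                                        # hypothetical ideal alternative

print(f"WAM = {weighted_arithmetic_mean(scores, weights):.3f}")
print(f"WGM = {weighted_geometric_mean(scores, weights):.3f}")
print(f"proximity to ideal (Minkowski, p=2) = {minkowski(scores, ideal):.3f}")
```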

Keywords: standard fuzzy sets (SFS), unmanned combat aircraft selection, multiple criteria decision making (MCDM), proximity measure method (PMM).
