Search results for: early fire detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2175

1185 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework

Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim

Abstract:

Background modeling and subtraction in video analysis is widely used as an effective method for moving object detection in many computer vision applications. Recently, a large number of approaches have been developed to tackle different challenges in this field; however, dynamic backgrounds and illumination variations remain the problems most frequently encountered in practice. This paper presents a two-layer model based on the codebook algorithm combined with a local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is a block-based codebook that combines an LBP histogram with the mean value of each RGB color channel. Because LBP features are invariant to monotonic gray-scale changes, this layer produces block-wise detection results with considerable tolerance to illumination variations. A pixel-based codebook is then employed to refine the output of the first layer and further eliminate false positives. As a result, the proposed approach considerably improves accuracy under dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate very competitive performance compared to previous models.
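
The abstract only outlines the first-layer feature; the sketch below shows, under stated assumptions, how a block-wise descriptor combining an LBP histogram with per-channel RGB means could be computed. The `lbp_8neighbour` and `block_features` helpers, the 3x3 LBP operator, and the 16-pixel block size are illustrative choices, not the authors' exact implementation.

```python
import numpy as np

def lbp_8neighbour(gray):
    """Basic 3x3 local binary pattern: compare the 8 neighbours of each interior
    pixel against its centre value and pack the comparisons into an 8-bit code."""
    g = gray.astype(np.float32)
    c = g[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= ((nb >= c).astype(np.uint8) << bit)
    return code

def block_features(rgb, block=16):
    """Per-block feature for a (hypothetical) first layer: a 256-bin LBP histogram
    plus the mean of each RGB channel, keyed by the block's top-left corner."""
    lbp = lbp_8neighbour(rgb.mean(axis=2))
    feats = {}
    for y in range(0, lbp.shape[0] - block + 1, block):
        for x in range(0, lbp.shape[1] - block + 1, block):
            hist, _ = np.histogram(lbp[y:y + block, x:x + block], bins=256, range=(0, 256))
            means = rgb[y + 1:y + 1 + block, x + 1:x + 1 + block].reshape(-1, 3).mean(axis=0)
            feats[(y, x)] = (hist / hist.sum(), means)
    return feats
```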

Keywords: Background subtraction, codebook model, local binary pattern, dynamic background, illumination changes.

1184 Improving Concrete Properties with Fibers Addition

Authors: E. Mello, C. Ribellato, E. Mohamedelhassan

Abstract:

This study investigated the improvement in concrete properties with the addition of cellulose, steel, carbon and PET fibers. Each fiber was added at four percentages to the fresh concrete, which was moist-cured for 28 days and then tested for compressive, flexural and tensile strengths. Changes in strength and increases in cost were analyzed. Results showed that the addition of cellulose caused a decrease of between 9.8% and 16.4% in compressive strength. This range may be acceptable, as cellulose fibers can significantly increase the concrete's resistance to fire and to freezing and thawing cycles. Addition of steel fibers to concrete increased the compressive strength by up to 20%, and increases of 121.5% and 80.7% were reported in tensile and flexural strengths, respectively. Carbon fibers increased flexural and tensile strengths by up to 11% and 45%, respectively. Concrete strength properties decreased after the addition of PET fibers. The results showed that the improvement in strength after the addition of steel and carbon fibers may justify the extra cost of the fibers.

Keywords: Concrete, compressive strength, fibers, flexural strength, tensile strength.

1183 Developing Leadership and Teamwork Skills of Pre-Service Teacher through Learning Camp

Authors: Sirimanee Banjong

Abstract:

This study aimed to 1) develop pre-service teachers’ leadership skills through camp-based learning, and 2) develop pre-service teachers’ teamwork skills through camp-based learning. An applied research methodology was used. The target group was derived from a purposive selection and comprised 32 fourth-year students in the Early Childhood Education Program enrolled in a course entitled Seminar in Early Childhood Education, offered during the second semester of the academic year 2013. The treatment was camp-based learning activities following a PDCA process with four stages: 1) plan, 2) do, 3) check, and 4) act. Research instruments were a learning camp program, a camp-based learning management plan, a 5-level assessment form for leadership skills and a 5-level assessment form for teamwork skills. Data were analyzed using descriptive statistics. Results were: 1) pre-service teachers’ leadership skills yielded a before-treatment average score of x̄ = 3.40, S.D. = 0.62 and an after-treatment average score of x̄ = 4.29, S.D. = 0.66; 2) pre-service teachers’ teamwork skills yielded a before-treatment average score of x̄ = 3.31, S.D. = 0.60 and an after-treatment average score of x̄ = 4.42, S.D. = 0.66. Both differences were statistically significant at the .05 level. Thus, the pre-service teachers’ leadership and teamwork skills were significantly improved through the camp-based learning approach.

Keywords: Learning camp, leadership skills, teamwork skills.

1182 Thermal Securing of Electrical Contacts inside Oil Power Transformers

Authors: Ioan Rusu

Abstract:

Power transformers of 110 kV/MV in substations carry fault currents resulting from MV line faults. Defective electrical contacts heat up when they carry such fault currents, and when the temperature reaches about 135 °C the electro-insulating oil in the vicinity of the defective contacts releases gases and activates the gas protection. To avoid auto-ignition of the electro-insulating oil, we designed a system for thermally securing defective electrical contacts by pouring fire-resistant polyurethane foam, mastic or fire mortar inside an electro-insulating cardboard cylinder around the contact. Practical experience with 110 kV/MV oil-insulated power transformers has recorded transient disconnections commanded by the gas protection at internal defects. In normal operation and at optimal load, nominal currents do not require thermal securing of the contacts inside the transformer; contacts are made at fabrication according to the design, or during repair by soldering. In the case of external short circuits close to the substation, the contacts inside the transformer, even when well made with Rcontact = 10⁻⁶ Ω, are subjected to short-circuit currents of the order of 10 kA-20 kA, which dissipate significant power, 100 W-400 W, at the contact. Under internal or external factors acting on the contacts, including electrodynamic forces during short circuits, the contact resistance can degrade over time to values in the range of 10⁻⁵ Ω to 10⁻⁴ Ω, and if the acting time of the protection is long, on the order of seconds, the power dissipated at the contact reaches high values of 1.0 kW to 40.0 kW. This power produces strong local heating, of hundreds of degrees Celsius, and can initiate self-ignition and burning of the oil in the vicinity of the contact, with operation of the gas relay. Degradation of the electrical contacts inside power transformers cannot be ruled out over their service life. In order to avoid oil burning with gas release near the contacts at short-circuit currents of 10 kA-20 kA, we propose the following solution: covering the electrical contacts in fireproof materials, which prevents direct burning of the oil at short circuit and transmits the heat from the contact along the conductors, to be dissipated gradually over time into a large cooling volume. The flame-retardant materials are polyurethane foam, mastic and cement (concrete). In the normal operating condition of the transformer, the winding conductors are insulated with paper and oil; the ignition points of these two components are approximately 135 °C for the oil and 200 °C for the paper. In the case of a faulty electrical contact of about 10⁻³ Ω, the temperature at short circuit can reach, for a short time, 300 °C-400 °C, which ignites the paper and also the oil. The burning oil releases gases locally, which disconnect the power transformer. Thermally securing the electrical contacts inside the transformer, in a cardboard tube filled with polyurethane foam, mastic or cement, prevents gas release and hence spurious operation of the gas protection.
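
As a quick sanity check of the dissipation figures quoted above, the snippet below evaluates P = R·I² for the contact resistances and fault currents mentioned in the abstract (a minimal sketch; the values are the abstract's own, not new data).

```python
# Rough check of the contact-dissipation figures quoted above: P = R * I^2.
for r_contact in (1e-6, 1e-5, 1e-4):        # contact resistance in ohms
    for i_fault in (10e3, 20e3):            # short-circuit current in amperes
        p_kw = r_contact * i_fault ** 2 / 1e3
        print(f"R = {r_contact:.0e} ohm, I = {i_fault / 1e3:.0f} kA -> P = {p_kw:.1f} kW")
# A sound contact (1e-6 ohm) dissipates 0.1-0.4 kW, while a contact degraded to
# 1e-4 ohm dissipates 10-40 kW, matching the 1.0-40.0 kW range given in the text.
```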

Keywords: Power transformer, oil insulation, electrical contacts, gases, gas relay.

1181 Influence of Port Geometry on Thrust Transient of Solid Propellant Rockets at Liftoff

Authors: Karuppasamy Pandian. M, Krishna Raj. K, Sabarinath. K, Sandeep. G, Sanal Kumar. V.R.

Abstract:

Numerical studies have been carried out using a two-dimensional code to examine the influence of port geometry on the pressure/thrust transient of solid propellant rockets at liftoff. This code solves the unsteady Reynolds-averaged thin-layer Navier–Stokes equations by an implicit LU-factorization time-integration method. The results from the parametric study indicate that when the port is narrow there is a possibility of an increased pressure/thrust-rise rate due to the relatively high flame spread rate. The parametric studies further reveal that the flame spread rate can be altered by changing the propellant properties, igniter jet characteristics and nozzle closure burst pressure without altering the grain configuration and/or the mission-demanded thrust transient. We observed that when the igniter turbulence intensity is relatively low, the vehicle could lift off early due to early flow choking of the rocket nozzle. We conclude that a high pressurization rate has structural implications at liftoff in addition to the transient burning effect. Therefore, prudent selection of the port geometry and the igniter, for meeting the mission requirements within the given envelope, is a meaningful objective for any designer seeking a smooth liftoff of solid propellant rockets.

Keywords: Igniter Characteristics, Solid Propellant Rocket, SRM Liftoff, Starting Thrust Transient.

1180 An Approach on Integrating Cooperative Education Experience into the Engineering Curriculum

Authors: Robin Lok-Wang

Abstract:

The center or unit for industry engagement, collaboration and internships plays a significant role at a university. In general, such a center serves as the official interface between industry and the school or department, cultivating students’ early exposure to professional experience. Its missions are not limited to providing a communication channel and collaborative platform for industry and the university; it also assists students in building their career paths early, while still at the university. In recent years, cooperative education experience (commonly known as a co-op) has been strongly advocated to help students make the school-to-work transition. The nature of a co-op program is not only consistent with internships and final-year design projects, but is also more industry-oriented, with academic support from faculty at the university. The purpose of this paper is to describe an approach to integrating cooperative education experience into the engineering curriculum. It provides a mutual understanding and exchange of ideas for the approach between the university and industry. A suggested format in terms of timeline, duration, selection of candidates, and students’ and companies’ expectations for the co-op program is described. Feedback from employers and industry also shows that a longer-term co-op program suits students better than a short-term internship. To this end, the paper provides insight into collaboration and/or partnership between the university and industry to prepare professional, work-ready graduates.

Keywords: Cooperative education, internship, industry collaboration, engineering curriculum.

1179 A Methodology of Testing Beam to Column Connection under Lateral Impact Load

Authors: A. Al-Rifaie, Z. W. Guan, S. W. Jones

Abstract:

The beam-to-column connection can be considered the structural component that most affects the response of buildings to progressive collapse. Accordingly, many studies have investigated the response of beam-to-column connections under accidental loads such as fire, blast and impact. This study is part of a PhD project investigating different types of connections under lateral impact load. Conventional test setups, such as the cruciform setup, were designed to apply shear forces and bending moment on the connection, whereas in the lateral impact case the connection is subjected to combined tension and moment. Hence, a review of previous test setups used to investigate connection behaviour is first presented. Then, the design and fabrication of the novel test setup are presented. Finally, trial test results assessing the efficiency of the proposed setup are discussed; they indicate that the setup is efficient in terms of simplicity and strength.

Keywords: Connections, impact load, drop hammer, testing methods.

1178 Automatic Detection of Defects in Ornamental Limestone Using Wavelets

Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas

Abstract:

A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that allows the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park with human operators manipulating stone plates as large as 3 m x 2 m and weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is a wavelet decomposition executed on two instances of the original image, to detect both hypotheses, dark and clear defects. The existence and/or size of these defects is the gauge used to classify the quality grade of the stone products. The tuning of parameters possible within the wavelet framework corresponds to different levels of accuracy in the drawing of the contours and in the selection of defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the dimensions of the defects allowed.
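
A minimal sketch of the underlying idea is given below: decompose the image with a 2-D wavelet transform, reconstruct only the detail bands, and flag pixels whose detail energy is anomalously high. It assumes the PyWavelets package; the Haar wavelet, two decomposition levels and the robust threshold `k` are illustrative stand-ins for the tuned parameters mentioned in the abstract.

```python
import numpy as np
import pywt

def defect_mask(gray, wavelet="haar", level=2, k=3.0):
    """Flag pixels whose wavelet detail energy is anomalously high against the
    stone background. gray: 2-D float array; k: threshold in robust deviations.
    (Illustrative only; the paper tunes wavelet parameters per stone variety.)"""
    coeffs = pywt.wavedec2(gray, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])           # drop the approximation band
    detail = pywt.waverec2(coeffs, wavelet)[: gray.shape[0], : gray.shape[1]]
    energy = np.abs(detail)
    med = np.median(energy)
    mad = np.median(np.abs(energy - med)) + 1e-12  # robust spread estimate
    return energy > med + k * 1.4826 * mad

# Dark and clear defects (the two hypotheses in the abstract) can be handled by
# running the same detector on the image and on its negative.
```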

Keywords: Automatic detection, wavelets, defects, fracture lines.

1177 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: J. Madureira, R. Lagido, I. Sousa

Abstract:

Surfing is an increasingly popular sport whose performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the smartphone's Inertial Measurement Unit (IMU) to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times and durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the waves ridden and their duration can be accurately determined using the smartphone GPS and IMU.
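
A minimal sketch of the first (GPS-only) approach is shown below: speed is computed between consecutive fixes and a ride is marked while the speed stays above a start threshold until it drops below an end threshold. The `haversine_m` helper and the threshold values are illustrative assumptions, not the calibrated values used in the paper.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def detect_rides(fixes, v_start=2.5, v_end=1.0):
    """fixes: list of (t_seconds, lat, lon). Returns (start_time, duration) per ride.
    Thresholds in m/s are illustrative, not the paper's calibrated values."""
    rides, start = [], None
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        v = haversine_m(la0, lo0, la1, lo1) / max(t1 - t0, 1e-6)
        if start is None and v >= v_start:
            start = t0                            # speed rose above the start threshold
        elif start is not None and v < v_end:
            rides.append((start, t1 - start))     # ride ends when speed drops
            start = None
    return rides
```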

Keywords: Inertial Measurement Unit (IMU), Global Positioning System (GPS), smartphone, surfing performance.

1176 Effective Traffic Lights Recognition Method for Real-Time Driving Assistance System in the Daytime

Authors: Hyun-Koo Kim, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective traffic light recognition method for the daytime. First, a Potential Traffic Lights Detector (PTLD) uses the full color information of the YCbCr image to produce binary images of green and red traffic lights. After the PTLD step, a Shape Filter (SF) is used to remove noise such as traffic signs, street trees, vehicles, and buildings. The noise-removal criteria are based on properties of the blobs in the binary image: length, area, area of the bounding box, etc. Finally, after an intermediate association step whose goal is to define relevant candidate regions from the previously detected traffic lights, an Adaptive Multi-class Classifier (AMC) is executed. The classification method uses Haar-like features and the AdaBoost algorithm. For evaluation, the method was implemented on an Intel Core CPU at 2.80 GHz with 4 GB RAM and tested on urban and rural roads. In these tests, our method was compared with standard object-recognition learning processes and reached a detection rate of up to 94%, which is better than the results achieved with cascade classifiers. The computation time of the proposed method is 15 ms.
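
To make the colour-segmentation stage concrete, the sketch below converts RGB to YCbCr with the standard ITU-R BT.601 equations and thresholds the chroma channels to obtain candidate red and green lamp masks. The threshold values and helper names are illustrative assumptions, not the parameters reported in the paper.

```python
import numpy as np

def ycbcr(rgb):
    """RGB (0-255 floats) to YCbCr using the ITU-R BT.601 full-range equations."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def candidate_masks(rgb, cr_red=160.0, cb_green=90.0, y_min=80.0):
    """Binary masks of potential red and green lamps (illustrative thresholds)."""
    y, cb, cr = ycbcr(rgb.astype(np.float32))
    red = (cr > cr_red) & (y > y_min)        # strong red chroma, reasonably bright
    green = (cb < cb_green) & (y > y_min)    # low blue chroma, typical of green lamps
    return red, green
```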

Keywords: Traffic Light Detection, Multi-class Classification, Driving Assistance System, Haar-like Feature, Color Segmentation Method, Shape Filter.

1175 On Combining Support Vector Machines and Fuzzy K-Means in Vision-based Precision Agriculture

Authors: A. Tellaeche, X. P. Burgos-Artizzu, G. Pajares, A. Ribeiro

Abstract:

One important objective in Precision Agriculture is to minimize the volume of herbicides applied to fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape and texture of weeds and crops; 2) the irregular distribution of the weeds within the crop field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines basic image processing techniques in order to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between the crops and the weeds. From these attributes, a hybrid decision-making approach determines whether or not a cell must be sprayed. The hybrid approach uses the Support Vector Machines and Fuzzy k-Means methods, combined through fuzzy aggregation theory; this combination constitutes the main contribution of the paper. The method's performance is compared against other available strategies.
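
The sketch below illustrates, under simplifying assumptions, how such a hybrid decision could be composed: an SVM provides a per-cell spray probability, fuzzy c-means style memberships provide the degree of belonging to a weed-dense cluster, and the two scores are fused with a plain mean as a stand-in for the fuzzy aggregation operator actually used by the authors. Function names, the label convention and the aggregation choice are illustrative.

```python
import numpy as np
from sklearn.svm import SVC

def fuzzy_memberships(x, centres, m=2.0):
    """Fuzzy c-means style memberships of each sample to fixed cluster centres."""
    d = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

def spray_decision(x_train, y_train, centres, x_cells, weed_cluster=1, cut=0.5):
    """Hybrid decision per cell: average the SVM spray probability with the fuzzy
    membership to the weed-dense cluster (the mean is a stand-in aggregation)."""
    svm = SVC(probability=True).fit(x_train, y_train)   # y_train: 1 = spray, 0 = no spray
    p_svm = svm.predict_proba(x_cells)[:, 1]            # probability of the 'spray' class
    p_fkm = fuzzy_memberships(x_cells, centres)[:, weed_cluster]
    return (p_svm + p_fkm) / 2.0 > cut
```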

Keywords: Fuzzy k-Means, Precision agriculture, Support Vector Machines, Weed detection.

1174 Developing a Multiagent Based Decision Support System for Realtime Multi-Risk Disaster Management

Authors: D. Moser, D. Pinto, A. Cipriano

Abstract:

A Disaster Management System (DMS) is very important for countries exposed to multiple disasters, such as Chile. Around the world, as in Chile, different disasters (earthquakes, tsunamis, volcanic eruptions, fires and other natural or man-made disasters) occur and affect the population, and it is also possible that two or more disasters occur at the same time. This means that a multi-risk situation must be mastered. To handle such a situation, a Decision Support System (DSS) based on multiple agents is a suitable architecture. The best-known DMSs are concerned with only a single disaster (sometimes the combination of earthquake and tsunami) and often with one particular disaster. Nevertheless, a DSS helps to achieve a better real-time response. Our proposal is to analyze the existing systems in the literature and extend them to multi-risk disasters in order to construct a well-organized system. The work presented here is an approach to such a multi-risk system, which requires an architecture and well-defined aims. At this stage, our study is a case study analyzing the path we must follow to create the proposed system in the future.

Keywords: Decision Support System, Disaster Management System, Multi-Risk, Multiagent System.

1173 Detecting Geographically Dispersed Overlay Communities Using Community Networks

Authors: Madhushi Bandara, Dharshana Kasthurirathna, Danaja Maldeniya, Mahendra Piraveenan

Abstract:

Community detection is an extremely useful technique for understanding the structure and function of a social network. The Louvain algorithm, which is based on the Newman-Girvan modularity optimization technique, is extensively used as a computationally efficient method to extract communities in social networks. It has been suggested that nodes in close geographical proximity have a higher tendency to form communities. Variants of the Newman-Girvan modularity measure, such as dist-modularity, try to normalize out the effect of geographical proximity in order to extract geographically dispersed communities, at the expense of losing information about geographically proximate communities. In this work, we propose a method to extract geographically dispersed communities while preserving the information about geographically proximate communities, by analyzing the ‘community network’, in which the centroids of communities are considered as network nodes. We suggest that the inter-community link strengths, normalized over the community sizes, may be used to identify and extract the ‘overlay communities’. The overlay communities have relatively high link strengths despite being relatively far apart in their spatial distribution. We apply this method to the Gowalla online social network, which contains the geographical signatures of its users, and identify the overlay communities within it.
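
A minimal sketch of how such a 'community network' could be assembled is shown below, using networkx's built-in Louvain partitioning (`nx.community.louvain_communities`, available in recent networkx releases). Placing each community at the centroid of its members' coordinates and normalizing inter-community edge weights by the product of the two community sizes are illustrative choices, not necessarily the exact normalization used in the paper.

```python
import networkx as nx

def community_network(g, pos):
    """g: undirected graph; pos: dict node -> (lat, lon).
    Returns a graph whose nodes are community centroids and whose edge weights are
    inter-community link counts normalised by the product of community sizes."""
    comms = nx.community.louvain_communities(g, seed=0)   # needs networkx >= 2.8
    label = {n: i for i, c in enumerate(comms) for n in c}
    cn = nx.Graph()
    for i, c in enumerate(comms):
        lat = sum(pos[n][0] for n in c) / len(c)
        lon = sum(pos[n][1] for n in c) / len(c)
        cn.add_node(i, centroid=(lat, lon), size=len(c))
    for u, v in g.edges():
        i, j = label[u], label[v]
        if i != j:
            w = cn.get_edge_data(i, j, {"weight": 0.0})["weight"]
            cn.add_edge(i, j, weight=w + 1.0 / (len(comms[i]) * len(comms[j])))
    return cn

# Edges with unusually high normalised weight between spatially distant centroids
# are candidates for the 'overlay communities' discussed in the abstract.
```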

Keywords: Social networks, community detection, modularity optimization, geographically dispersed communities.

1172 A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

Authors: J. Prakash, K. Rajesh

Abstract:

In this paper we present a new method for coin identification. The proposed method adopts a hybrid scheme using the eigenvalues of the covariance matrix, the Circular Hough Transform (CHT) and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are exploited for circular object detection. A sparse matrix technique is used to perform the CHT; since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they provide an advantage in matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate position of the circumference pixels is identified using a raster scan algorithm that exploits geometrical symmetry. After the circular objects are found, the proposed method uses the texture on the surface of the coins, described by textons, which are unique to coins and refer to the fundamental micro-structures in generic natural images. The method has been tested on several real-world images, including coin and non-coin images. Its performance is also evaluated in terms of noise-withstanding capability.
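
For orientation, the snippet below shows the circle-detection stage using OpenCV's built-in circular Hough transform; it is a stand-in for the authors' sparse-matrix CHT with neighborhood suppression and Bresenham-based refinement, and the parameter values are illustrative.

```python
import cv2
import numpy as np

def find_coin_candidates(image_path):
    """Locate circular objects with OpenCV's built-in circular Hough transform.
    A stand-in for the paper's sparse-matrix CHT and raster-scan refinement,
    shown only to illustrate the circle-detection stage."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    gray = cv2.medianBlur(gray, 5)                      # suppress speckle before voting
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=50, minRadius=15, maxRadius=120)
    if circles is None:
        return []
    return [(int(x), int(y), int(r)) for x, y, r in np.round(circles[0])]
```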

Keywords: Circular Hough Transform, Coin detection, Covariance matrix, Eigenvalues, Raster scan Algorithm, Texton.

1171 Factors for Entry Timing Choices Using Principal Axis Factorial Analysis and Logistic Regression Model

Authors: Mat Isa, C. M., Mohd Saman, H., Mohd Nasir, S. R., Jaapar, A.

Abstract:

International market expansion involves a strategic process of market entry decisions through which a firm expands its operations from the domestic to the international domain. Hence, entry timing choices require balancing the risks of early entry against the opportunities lost as a result of late entry into a new market. Questionnaire surveys administered to 115 Malaysian construction firms operating in 51 countries worldwide resulted in a 39.1 percent response rate. Factor analysis was used to determine the most significant factors affecting the firms' entry timing choices when penetrating the international market. A logistic regression analysis used to examine the firms’ entry timing choices indicates that the model correctly classified 89.5 percent of cases as late movers. The findings reveal that the most significant factor influencing the construction firms’ choices as late movers was the firm factor, related to the firm’s international experience, resources, competencies and financing capacity. The study also offers valuable information to construction firms intending to internationalize their businesses.
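
A minimal sketch of this two-step analysis is given below: factor scores are extracted from the survey items and a logistic regression is fitted to predict late versus early movers. It uses scikit-learn, whose `FactorAnalysis` (maximum-likelihood) stands in for principal-axis factoring, and all variable names are placeholders rather than the survey's actual items.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression

def entry_timing_model(x_items, y_late, n_factors=4):
    """x_items: survey item scores per firm; y_late: 1 = late mover, 0 = early mover.
    Extract factor scores and fit a logistic regression on them; returns the model
    and the proportion of cases correctly classified."""
    scores = FactorAnalysis(n_components=n_factors, random_state=0).fit_transform(x_items)
    model = LogisticRegression(max_iter=1000).fit(scores, y_late)
    correct = np.mean(model.predict(scores) == y_late)
    return model, correct
```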

Keywords: Factors, early movers, entry timing choices, late movers, Logistic Regression Model, Principal Axis Factorial Analysis, Malaysian construction firms.

1170 Encouraging the Development of Scientific Literacy in Early Childhood Institutions: Croatian Experience

Authors: L. Vujičić, Ž. Ivković, Ž. Boneta

Abstract:

There is a widespread belief in everyday discourse that the science subjects (physics, chemistry and biology) are, along with mathematics, the most difficult school subjects in the education of an individual. This assumption is usually justified by the following facts: the low GPA in these subjects, the comparatively high number of pupils who fail them, and the lower number of pupils interested in continuing their studies in science-oriented fields compared to non-science-oriented fields. From that perspective, the project “Could it be different? How do children explore it?” becomes extremely interesting because it focuses on young children and on the introduction of new methods, with the aim of arousing interest in scientific literacy development in 10 kindergartens, by applying an action research methodology with an ethnographic approach. We define scientific literacy as a process of encouraging and nurturing the research and explorative spirit in children, as well as their natural potential and abilities, which themselves represent an object of scientific research: to learn about exploration by conducting exploration. Upon project completion, an evaluation questionnaire was created for the parents of the children who had participated in the project, as well as for those whose children had not been involved. The purpose of the first questionnaire was to examine the level of satisfaction with the project implementation and its outcomes among the parents whose children had been involved in the project (N=142), while the aim of the second questionnaire was to find out how interested the parents of the children not involved (N=154) were in this topic.

Keywords: Documenting, early childhood education, evaluation questionnaire for parents, scientific literacy development.

1169 Analysis of the Reasons behind the Deteriorated Standing of Engineering Companies during the Financial Crisis

Authors: Levan Sabauri

Abstract:

In this paper, we discuss the deteriorated standing of engineering companies, some of the reasons behind it, and the problems facing engineering enterprises during the financial crisis. We show the part that financial analysis plays in detecting the main factors affecting the standing of a company, and classify the internal problems and the reasons influencing their efficiency. The publication contains an analysis of municipal engineering companies in post-Soviet transitional economies. In the wake of the 2008 world financial crisis the issue became even more poignant, although even before then the problem had been no less acute for some post-Soviet states caught up in a lengthy transitional period. The paper highlights shortcomings in the management of transportation companies and suggests new, more appropriate methods. In analyzing the financial stability of a company, three elements need to be considered: current assets, investment policy, and the structural management of the funding sources that underpin stability. Inappropriate management of these three may create financial problems, and their timely and accurate detection is a key issue in improving the standing of an enterprise. In this connection, the publication contains a diagram reflecting the reasons behind the deteriorated financial standing of a company, as well as a corresponding flow chart. The main reasons behind low profitability are also discussed.

Keywords: Efficiency, financial management, financial analysis, funding structure, financial sustainability, investment policy, profitability, solvency, working capital.

1168 Evaluating Residual Mechanical and Physical Properties of Concrete at Elevated Temperatures

Authors: S. Hachemi, A. Ounis, S. Chabi

Abstract:

This paper presents the results of an experimental study on the effects of elevated temperature on the compressive and flexural strength of Normal Strength Concrete (NSC), High Strength Concrete (HSC) and High Performance Concrete (HPC). In addition, the specimen mass and volume were measured before and after heating in order to determine the loss of mass and volume during the test. In terms of non-destructive measurement, the ultrasonic pulse velocity test was proposed as a promising initial inspection method for fire-damaged concrete structures. 100 cube specimens for the three grades of concrete were prepared and heated at a rate of 3°C/min up to different temperatures (150, 250, 400, 600, and 900°C). The results show a loss of compressive and flexural strength for all the concretes heated to temperatures exceeding 400°C. The results also revealed that the mass and density of the specimens reduced significantly with an increase in temperature.

Keywords: High temperature, Compressive strength, Mass loss, Ultrasonic pulse velocity.

1167 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering

Authors: Sharifah Mousli, Sona Taheri, Jiayuan He

Abstract:

Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual’s ability to function in social, academic, and employment settings. Although, to the best of our knowledge, there is no effective medication to treat ASD, early intervention can significantly improve an affected individual’s overall development. Hence, an accurate diagnosis of ASD at an early phase is essential, and the use of machine learning approaches can improve and speed up the diagnosis. In this paper, we focus on the application of unsupervised clustering methods to ASD, as the large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis of seven clustering approaches (K-means, agglomerative hierarchical, model-based, fuzzy c-means, affinity propagation, self-organizing maps, and learning vector quantisation) as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performance of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.
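
A minimal sketch of the unsupervised workflow described above is given below using one of the paper's baseline methods (k-means); the COMSEP-Clust method itself is not reproduced, the feature matrix is a placeholder for the screening responses, and the two-cluster setting and silhouette evaluation are illustrative choices.

```python
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

def cluster_screening_data(x, k=2, seed=0):
    """x: (n_samples, n_features) array of ASD screening responses (placeholder data).
    Standardise, run k-means (one of the paper's baselines), and report cluster
    separation via the silhouette score."""
    xs = StandardScaler().fit_transform(x)
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(xs)
    return labels, silhouette_score(xs, labels)
```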

Keywords: Autism spectrum disorder, clustering, optimization, unsupervised machine learning.

1166 Detection of Defects in CFRP by Ultrasonic IR Thermographic Method

Authors: W. Swiderski

Abstract:

This paper introduces a diagnostic technique that makes it possible to examine the internal structure of fibre-reinforced composite materials used in a variety of applications. The main cause of damage in structures made of these materials is the changing distribution of load over the lifetime of the construction. The resulting defects are largely complicated, involving disruption of the continuity of the reinforcing fibres, binder cracks and loss of fibre adhesion to the binder; defects in composite materials are usually more complicated than in metals. At present, infrared thermography is the most effective method for the non-destructive testing of composites. One of the IR thermography methods used in non-destructive evaluation is vibrothermography. Vibrothermography is not a new non-destructive method, but a new element of this test is the use of ultrasonic waves for the thermal stimulation of the material. In this paper, both modelling and experimental results which illustrate the advantages and limitations of ultrasonic IR thermography in inspecting composite materials are presented. The ThermoSon computer program, which computes 3D dynamic temperature distributions in anisotropic layered solids with subsurface defects subject to ultrasonic stimulation, was used to optimise heating parameters for the detection of subsurface defects in composite materials. The program allows the analysis of transient heat conduction and ultrasonic wave propagation phenomena in solids. The experiments at MIAT were carried out with a FLIR SC 7600 IR camera. Ultrasonic stimulation was performed at frequencies from 15 kHz to 30 kHz with a maximum power of up to 2 kW.

Keywords: Composite material, ultrasonic, infrared thermography, non-destructive testing.

1165 Behaviour of Lightweight Expanded Clay Aggregate Concrete Exposed to High Temperatures

Authors: Lenka Bodnárová, Rudolf Hela, Michala Hubertová, Iveta Nováková

Abstract:

This paper concerns the behaviour of lightweight expanded clay aggregate concrete exposed to high temperatures. Lightweight aggregates from expanded clay are produced by firing the raw material at temperatures up to 1050°C. These aggregates have suitable properties in terms of volume stability when exposed to temperatures up to 1050°C, which could indicate their suitability for construction applications with a higher risk of fire. The test samples were exposed to heat according to the standard temperature-time curve ISO 834. Negative changes in the resulting mechanical properties, such as compressive strength, tensile strength, and flexural strength, were evaluated, and a visual evaluation of the specimens was also performed. On specimens exposed to excessive heat, explosive spalling could be observed, caused by the evaporation of a considerable amount of unbound water from the inner structure of the concrete.

Keywords: Expanded clay aggregate, explosive spalling, high temperature, lightweight concrete, temperature-time curve ISO 834.

1164 Target Detection using Adaptive Progressive Thresholding Based Shifted Phase-Encoded Fringe-Adjusted Joint Transform Correlator

Authors: Inder K. Purohit, M. Nazrul Islam, K. Vijayan Asari, Mohammad A. Karim

Abstract:

A new target detection technique is presented in this paper for the identification of small boats in coastal surveillance. The proposed technique employs an adaptive progressive thresholding (APT) scheme to first process the given input scene and separate any objects present in the scene from the background. This preprocessing step results in an image containing only the foreground objects, such as boats, trees and other cluttered regions, and hence significantly reduces the search region for the correlation step. The processed image is then fed to the shifted phase-encoded fringe-adjusted joint transform correlator (SPFJTC), which produces a single, delta-like correlation peak for a potential target present in the input scene. A post-processing step uses a peak-to-clutter ratio (PCR) to determine whether the boat in the input scene is authorized or unauthorized. Simulation results are presented to show that the proposed technique can successfully determine the presence of an authorized boat and identify any intruding boat present in the given input scene.
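
The abstract does not detail the APT scheme or the PCR formula, so the sketch below uses a generic progressive-thresholding loop and one common peak-to-clutter definition purely as placeholders for the authors' formulations; the step size, foreground fraction and RMS-based clutter measure are all illustrative assumptions.

```python
import numpy as np

def progressive_threshold(gray, step=5.0, max_fg_fraction=0.2):
    """Raise the threshold in steps until the foreground fraction falls below a target.
    A generic stand-in for the abstract's APT preprocessing."""
    t = float(gray.mean())
    mask = gray > t
    while mask.mean() > max_fg_fraction and t < gray.max():
        t += step
        mask = gray > t
    return mask

def peak_to_clutter_ratio(corr_plane):
    """Correlation peak value divided by the RMS of the remaining (clutter) plane."""
    peak = corr_plane.max()
    clutter = corr_plane[corr_plane < peak]
    return peak / (np.sqrt(np.mean(clutter ** 2)) + 1e-12)
```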

Keywords: Adaptive progressive thresholding, fringe-adjusted filters, image segmentation, joint transform correlation, synthetic discriminant function.

1163 Reconsidering the Palaeo-Environmental Reconstruction of the Wet Zone of Sri Lanka: A Zooarchaeological Perspective

Authors: Kalangi Rodrigo, Kelum Manamendra-Arachchi

Abstract:

Bones, teeth, and shells have been acknowledged over the last two centuries as evidence of chronology, palaeo-environment, and human activity. Faunal traces are valid evidence of past conditions because they have properties that have not changed over long periods. Sri Lanka is an island with a diverse record of prehistoric occupation across its ecological zones. Defining the palaeoecology of past societies is an archaeological approach developed in the 1960s, mainly concerned with the reconstruction, from available geological and biological evidence, of past biota, populations, communities, landscapes, environments, and ecosystems. This early and persistent florescence of human fossils, technology, and culture, together with a collection of well-preserved tropical-forest rock shelters with associated 'on-site' palaeoenvironmental records, makes Sri Lanka a central and unusual case study for determining the extent and strength of early human engagement with tropical forests. Excavations carried out in prehistoric caves in the low-country wet zone have shown that over the last 50,000 years the temperature in the lowland rainforests has not varied by more than about 5 °C. Based on Semnopithecus priam (grey langur) remains unearthed from wet zone prehistoric caves, it has been argued that there were periods of momentous climate change during the Last Glacial Maximum (LGM) and at the Terminal Pleistocene/Early Holocene boundary, with a recognizable preference for semi-open 'intermediate' rainforest or forest edges. The continuous occupation of the genera Acavus and Oligospira, along with the uninterrupted horizontal presence of Canarium sp. ('kekuna' nut), supports the conclusion that temperatures in the lowland rain forests have not changed by more than about 5 °C over the last 50,000 years. Site catchment or territorial analysis is no longer defensible on time-distance grounds, and optimal foraging theory fails because prehistoric people were aware of decreasing cost-benefit ratios when locating sites and generally pursued a settlement strategy that minimized the ratio of energy expended to energy produced.

Keywords: Palaeo-environment, palaeo-ecology, palaeo-climate, prehistory, zooarchaeology.

1162 Probabilistic Wavelet Neural Network Based Vibration Analysis of Induction Motor Drive

Authors: K. Jayakumar, S. Thangavel

Abstract:

This paper proposes effective fault detection for industrial drives using a Biorthogonal Posterior Vibration Signal-Data Probabilistic Wavelet Neural Network (BPPVS-WNN) system. The system focuses on reducing the current flow and identifying faults with shorter execution times, using harmonic values obtained through the fifth derivative. Initially, the biorthogonal wavelet transform of the vibration signal data in the BPPVS-WNN system localizes the signal in both the time and frequency domains. The biorthogonal wavelet approximates the broken-bearing signature through its dual scaling and wavelet functions, and identifies the transient disturbance due to a fault on the induction motor through the approximation and detail coefficients. The posterior probabilistic neural network then detects the final fault level using the detail coefficients up to the fifth derivative, and produces results at a faster rate for a constant-frequency signal on the industrial drive. Experiments in Simulink distinguish healthy from unhealthy motors by measuring parametric factors such as the time-based fault detection rate, the current flow rate, and the execution time.
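
A minimal sketch of the signal-side idea is shown below: a one-dimensional vibration signal is decomposed with a biorthogonal wavelet and the energy of each detail band is collected as a feature vector for a downstream classifier. It assumes the PyWavelets package; the `bior3.5` wavelet and the five-level decomposition are illustrative choices, and the posterior probabilistic neural network itself is not reproduced.

```python
import numpy as np
import pywt

def detail_energies(vibration, wavelet="bior3.5", level=5):
    """Decompose a 1-D vibration signal with a biorthogonal wavelet and return the
    energy of each detail band up to the chosen level. These energies would feed the
    downstream probabilistic classifier (not reproduced here)."""
    coeffs = pywt.wavedec(np.asarray(vibration, dtype=float), wavelet, level=level)
    # coeffs = [approximation_L, detail_L, ..., detail_1]
    return [float(np.sum(d ** 2)) for d in coeffs[1:]]
```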

Keywords: Biorthogonal Wavelet Transform, Posterior Probabilistic Neural Network, Induction Motor.

1161 Pre-Operative Tool for Facial-Post-Surgical Estimation and Detection

Authors: Ayat E. Ali, Christeen R. Aziz, Merna A. Helmy, Mohammed M. Malek, Sherif H. El-Gohary

Abstract:

Goal: The purpose of the project was to predict the outcome of plastic surgery from patients' pre-operative images and to show this prediction on a screen, allowing a comparison between the current appearance and the expected appearance after surgery. Methods: To this aim, we implemented software that uses data collected from the internet on facial skin diseases, skin burns, and pre- and post-operative images of plastic surgeries; the post-surgical prediction is made using the K-nearest neighbor (KNN) algorithm. We also designed and fabricated a smart mirror divided into two parts, a screen and a reflective mirror, so that the patient's pre- and post-operative appearance can be shown at the same time. Results: We worked on skin conditions such as vitiligo, skin burns and wrinkles. We classified the three degrees of burns using a KNN classifier with 60% accuracy, and we also succeeded in segmenting the areas affected by vitiligo. Our future work will include more skin diseases, their classification, and a prediction of the appearance after surgery; we will also go deeper into facial deformities and plastic surgeries such as nose reshaping and face slimming. Conclusion: Our project will give a prediction that relates strongly to the real post-surgical appearance and will reduce diagnostic disagreement among doctors. Significance: The mirror may have broad societal appeal, as it will narrow the gap between patient satisfaction and medical standards.
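
A minimal sketch of the burn-degree classification step is given below, using scikit-learn's k-nearest-neighbour classifier; the feature representation, train/test split and k value are illustrative assumptions rather than the project's actual pipeline.

```python
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def classify_burn_degree(features, labels, k=3):
    """features: per-image descriptors (e.g. colour/texture statistics, placeholders);
    labels: burn degree 1-3. Trains the KNN classifier mentioned in the abstract
    and reports hold-out accuracy."""
    x_tr, x_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=k).fit(x_tr, y_tr)
    return accuracy_score(y_te, knn.predict(x_te))
```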

Keywords: K-nearest neighbor, face detection, vitiligo, bone deformity.

1160 Fetal and Infant Mortality in Botucatu City, São Paulo State, Brazil: Evaluation of Maternal - Infant Health Care

Authors: Noda L. M., Salvador I. C, C. M. L. G. Parada, Fonseca C. R. B.

Abstract:

In Brazil, the neonatal mortality rate is considered incompatible with the country's level of development and has been a public health concern. Reduction of infant mortality rates has also been part of the Millennium Development Goals, a commitment made by member countries of the United Nations (UN), including Brazil. The fetal mortality rate is considered a highly sensitive indicator of health care quality. Suitable actions, such as good quality of, and access to, health services, may contribute positively to reducing these fetal and neonatal rates. With appropriate antenatal follow-up and health care during gestation and delivery, some causes of death could be reduced or even prevented by means of early diagnosis and intervention, as well as changes in risk factors. Objectives: To study the quality of maternal and infant health care based on fetal and neonatal mortality, as well as the possible actions to prevent those deaths, in Botucatu (Brazil). Methods: Deaths were classified by preventability according to the International Classification of Diseases and the modified Wigglesworth classification. In order to evaluate adequacy, indicators of the quality of antenatal and delivery care were established by the authors. Results: Considering fetal deaths, 56.7% occurred before delivery, which reveals possible shortcomings in antenatal care, and 38.2% resulted from intra-labor changes, which could be prevented or reduced by adequate obstetric management. These findings differed from those in the group of early neonatal deaths, which was also studied. The adequacy assessment showed that antenatal and childbirth care was appropriate for 24% and 33.3% of pregnant women, respectively, which corroborates the preventability results and indicates that shortcomings in obstetric and antenatal care could be the causes of the deaths in the study. Early and late neonatal deaths had similar characteristics: 76% could be prevented or reduced, mainly by adequate newborn care (52.9%) and adequate health care for pregnant women (11.7%). When the adequacy of care was evaluated, childbirth and newborn care was adequate in 25.8% of cases and antenatal care was adequate in 16.1%. In conclusion, a direct relationship was found between the adequacy and quality of care rendered to pregnant women and newborns, and fetal and infant mortality. Moreover, our findings highlight that these deaths could be prevented by adequate obstetric and neonatal management.

Keywords: Fetal Mortality, Infant Mortality, Maternal-Child Health Services, Program Evaluation.

1159 X-Ray Intensity Measurement Using Frequency Output Sensor for Computed Tomography

Authors: R. M. Siddiqui, D. Z. Moghaddam, T. R. Turlapati, S. H. Khan, I. Ul Ahad

Abstract:

The quality of the 2D and 3D cross-sectional images produced by computed tomography depends primarily on the precision of primary and secondary X-ray intensity detection. Traditional methods of primary intensity detection are prone to errors. Our group recently developed an X-ray intensity measurement system with smart X-ray sensors which is able to detect the primary X-ray intensity accurately. In this study, a new smart X-ray sensor is developed using the TSL230 light-to-frequency converter from Texas Instruments, which has numerous advantages in terms of noiseless data acquisition and transmission. The TSL230 is based on a silicon photodiode that converts incoming X-ray radiation into a proportional current signal. A current-to-frequency converter on the same monolithic CMOS integrated circuit provides a frequency count proportional to the incoming current signal, in the form of a pulse train. The frequency count is delivered to a PICDEM FS USB board with a PIC18F4550 microcontroller mounted on it; with highly compact electronic hardware, this demo board efficiently reads the smart sensor output data. The frequency-output approach overcomes the nonlinear behavior of sensors with analog output, so unattenuated X-ray intensities can be measured precisely and better normalization can be acquired in order to attain high resolution.

Keywords: Computed tomography, detector technology, X-Ray intensity measurement

1158 A Rule-based Approach for Anomaly Detection in Subscriber Usage Pattern

Authors: Rupesh K. Gopal, Saroj K. Meher

Abstract:

In this report we present a rule-based approach to detect anomalous telephone calls. The method described here uses subscriber usage CDR (call detail record) data sampled over two observation periods: a study period and a test period. The study period contains call records of customers' non-anomalous behaviour. Customers are first grouped according to their similar usage behaviour (e.g., average number of local calls per week). For the customers in each group, we develop a probabilistic model to describe their usage. Next, we use maximum likelihood estimation (MLE) to estimate the parameters of the calling behaviour. Then we determine thresholds by calculating the acceptable change within a group. MLE is applied to the data in the test period to estimate the parameters of the calling behaviour, and these parameters are compared against the thresholds; any deviation beyond the threshold is used to raise an alarm. This method has the advantage of identifying local anomalies, as compared to techniques which identify global anomalies. The method was tested on 90 days of study data and 10 days of test data from telecom customers. For medium to large deviations in the data in the test window, the method is able to identify 90% of anomalous usage with a false alarm rate of less than 1%.
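
A minimal sketch of this workflow is shown below under the simplifying assumption that weekly call counts are Poisson-distributed (the abstract does not name the distribution): the rate is fitted per customer from the study period by MLE, a group-level band of acceptable change is derived, and test-period customers whose rate leaves the band raise an alarm. The three-sigma band and all function names are illustrative.

```python
import numpy as np

def fit_rate(weekly_counts):
    """MLE of a Poisson rate is simply the sample mean of the weekly call counts."""
    return float(np.mean(weekly_counts))

def group_threshold(study_rates, k=3.0):
    """Acceptable change within a group: mean rate +/- k standard deviations
    (the abstract does not give the exact rule, so this is illustrative)."""
    mu, sd = float(np.mean(study_rates)), float(np.std(study_rates))
    return mu - k * sd, mu + k * sd

def flag_anomalies(study_counts_by_cust, test_counts_by_cust, k=3.0):
    """Raise an alarm for customers whose test-period rate leaves the group band."""
    study_rates = [fit_rate(c) for c in study_counts_by_cust.values()]
    lo, hi = group_threshold(study_rates, k)
    return [cust for cust, counts in test_counts_by_cust.items()
            if not lo <= fit_rate(counts) <= hi]
```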

Keywords: Subscription fraud, fraud detection, anomaly detection, maximum likelihood estimation, rule-based systems.

1157 Least Square-SVM Detector for Wireless BPSK in Multi-Environmental Noise

Authors: J. P. Dubois, Omar M. Abdul-Latif

Abstract:

The Support Vector Machine (SVM) is a statistical learning tool developed around the concept of structural risk minimization (SRM). In this paper, SVM is applied to signal detection in communication systems in the presence of channel noise in various environments, in the form of Rayleigh fading, additive white Gaussian background noise (AWGN), and interference noise generalized as additive colored Gaussian noise (ACGN). The structure and performance of the SVM in terms of the bit error rate (BER) metric are derived and simulated for these advanced stochastic noise models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to a conventional optimal model-based detector for binary signaling with binary phase shift keying (BPSK) modulation. We show that the SVM performance is superior to that of conventional matched filter-, innovation filter-, and Wiener filter-driven detectors, even in the presence of random Doppler carrier deviation, especially in the low SNR (signal-to-noise ratio) range. For large SNR, the performance of the SVM is similar to that of the classical detectors; however, the convergence between SVM and maximum likelihood detection occurs at a higher SNR as the noise environment becomes more hostile.
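
As a minimal, self-contained illustration of the idea for the simplest of these channels, the sketch below trains an SVM on noisy BPSK samples over AWGN and measures its BER; scikit-learn's `SVC` stands in for the least-squares SVM formulation, the fading and coloured-noise models are omitted, and the SNR convention (unit symbol energy) is an assumption.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def simulate_bpsk(n_bits, snr_db):
    """BPSK symbols (+/-1, unit energy) through an AWGN channel at the given SNR."""
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0
    noise_std = np.sqrt(0.5 / (10 ** (snr_db / 10.0)))   # sigma^2 = 1 / (2 * SNR)
    return bits, symbols + noise_std * rng.normal(size=n_bits)

# Train the detector on one block of noisy samples, measure BER on a fresh block.
bits_tr, rx_tr = simulate_bpsk(2000, snr_db=3)
bits_te, rx_te = simulate_bpsk(20000, snr_db=3)
svm = SVC(kernel="rbf").fit(rx_tr.reshape(-1, 1), bits_tr)
ber = np.mean(svm.predict(rx_te.reshape(-1, 1)) != bits_te)
print(f"SVM detector BER at 3 dB SNR: {ber:.4f}")
```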

Keywords: Colour noise, Doppler shift, innovation filter, least square-support vector machine, matched filter, Rayleigh fading, Wiener filter.

1156 Evaluation of the Microscopic-Observation Drug-Susceptibility Assay Drugs Concentration for Detection of Multidrug-Resistant Tuberculosis

Authors: Anita, Sari Septiani Tangke, Rusdina Bte Ladju, Nasrum Massi

Abstract:

New diagnostic tools are urgently needed to interrupt the transmission of tuberculosis and multidrug-resistant tuberculosis. The microscopic-observation drug-susceptibility (MODS) assay is a rapid, accurate and simple liquid culture method to detect multidrug-resistant tuberculosis (MDR-TB). The MODS assay was evaluated to determine whether a lower, identical concentration of isoniazid and rifampin can be used for the detection of MDR-TB. Direct drug-susceptibility testing was performed with the use of the MODS assay, and drug-sensitive control strains were tested daily. The concentrations used for both isoniazid and rifampin were the same: 0.16, 0.08 and 0.04 μg per milliliter. We tested 56 M. tuberculosis clinical isolates and the control strain M. tuberculosis H37Rv. All concentrations showed the same result. Of 53 M. tuberculosis clinical isolates, 14 were MDR-TB, 38 were susceptible to isoniazid and rifampin, and 1 was resistant to isoniazid only. Drug-susceptibility testing with the proportion method using the Mycobacteria Growth Indicator Tube (MGIT) system served as the reference. The results of the MODS assay using the lower concentrations were significant (P<0.001) compared with the reference method.

A lower, identical concentration of isoniazid and rifampin can therefore be used to detect MDR-TB, making operation cheaper and application easier in resource-limited environments. However, additional studies evaluating the MODS assay with this lower, identical concentration of isoniazid and rifampin must be conducted with a larger number of clinical isolates.

Keywords: Isoniazid, MODS assay, MDR-TB, Rifampin.
