Search results for: intelligent computational techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3898

1108 Weight Functions for Signal Reconstruction Based On Level Crossings

Authors: Nagesha, G. Hemantha Kumar

Abstract:

Although the level crossing concept has been the subject of intensive investigation over the last few years, certain problems of great interest remain unsolved. One of these concerns the distribution of threshold levels. This paper presents new threshold level allocation schemes for level-crossing-based nonuniform sampling. Intuitively, it is more reasonable if the information-rich regions of the signal are sampled finer and those with sparse information are sampled coarser. To achieve this objective, we propose non-linear quantization functions which dynamically assign the number of quantization levels depending on the importance of the given amplitude range. Two new approaches to determine the importance of a given amplitude segment are presented. The proposed methods are based on exponential and logarithmic functions. Various aspects of the proposed techniques are discussed and experimentally validated. Their efficacy is investigated by comparison with uniform sampling.
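
As an illustration only, the sketch below spaces level-crossing thresholds non-uniformly over the signal's amplitude range; the exponential and logarithmic warpings are stand-ins for the importance functions described above, not the authors' exact formulas, and all parameters are hypothetical.

```python
import numpy as np

def allocate_levels(signal, n_levels=16, mode="exp"):
    """Spread level-crossing thresholds non-uniformly over the amplitude range.

    The exp/log warping is a placeholder for an importance function:
    it simply packs levels more densely toward one end of the range.
    """
    lo, hi = signal.min(), signal.max()
    u = np.linspace(0.0, 1.0, n_levels)            # uniform grid on [0, 1]
    if mode == "exp":
        w = (np.exp(u) - 1.0) / (np.e - 1.0)       # levels denser near lo
    else:                                          # "log"
        w = np.log1p(u * (np.e - 1.0))             # levels denser near hi
    return lo + w * (hi - lo)

# usage on a toy signal
t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 13 * t)
levels = allocate_levels(x, n_levels=12, mode="log")
crossings = [i for i in range(1, len(x))
             if any((x[i - 1] - L) * (x[i] - L) < 0 for L in levels)]
print(len(levels), "levels,", len(crossings), "crossings recorded")
```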

Keywords: speech signals, sampling, signal reconstruction, asynchronous delta modulation, non-linear quantization.

1107 Parameters Influencing the Output Precision of a Lens-Lens Beam Generator Solar Concentrator

Authors: M. Tawfik, X. Tonnellier, C. Sansom

Abstract:

The Lens-Lens Beam Generator (LLBG) is a Fresnel-based optical concentrating technique which provides flexibility in selecting the solar receiver location compared to conventional techniques, by generating a powerful, concentrated, collimated solar beam. To achieve this, two successive lenses are used, followed by a flat mirror. Since the beam emerging from the LLBG has a high power flux which impinges on the target receiver, it is important to determine the precision of the system output. In the present work, a mathematical investigation of the different parameters affecting the precision of the output beam is carried out. These parameters include deflection in the sun-facing lens and its holding arm, delay in updating the solar tracking system, and the flatness of the flat mirror surface. Moreover, relationships that describe the power lost due to the effect of each parameter are derived in this study.

Keywords: Fresnel lens, LLBG, solar concentrator, solar tracking.

1106 Payment Problems, Cash Flow and Profitability of Construction Project: A System Dynamics Model

Authors: Wenhua Hou, Xing Liu, Deqiang Chen

Abstract:

The ubiquitous payment problems within the construction industry of China are notoriously hard to resolve and lead to a series of impacts along the industry chain. The most direct of these is the negative effect on the normal operation of contractors. A wealth of research has already discussed the reasons for payment problems and introduced a number of possible improvement strategies, but the causalities they identify are still far from the harsh reality. In this paper, the authors propose a model of the cash flow system of construction projects, introducing System Dynamics techniques to explore the causal facets of the payment problem. The effects of payment arrears on both the cash flow and the profitability of a project are simulated in four scenarios using data from real projects. The simulation results give contractors visible clues for quantitatively determining the consequences of payment delay for a construction project.
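
A minimal stock-and-flow sketch of the core mechanism (hypothetical figures, not the paper's System Dynamics model or its project data) shows how a fixed delay between billing and receipt depresses the contractor's cash balance:

```python
def simulate_cash_flow(months=24, monthly_billing=100.0, payment_delay=3,
                       monthly_cost=90.0, initial_cash=50.0):
    """Toy stock-and-flow model of project cash under delayed payment.

    Each month the contractor bills `monthly_billing`, but the money only
    arrives `payment_delay` months later, while costs are paid immediately.
    """
    cash = initial_cash
    receivable = [0.0] * payment_delay        # pipeline of delayed payments
    history = []
    for _ in range(months):
        receivable.append(monthly_billing)    # work billed this month
        cash += receivable.pop(0)             # payment received after delay
        cash -= monthly_cost                  # outflow to labour and suppliers
        history.append(cash)
    return history

# usage: same project, no delay versus a six-month payment delay
print(simulate_cash_flow(payment_delay=0)[-1],
      simulate_cash_flow(payment_delay=6)[-1])
```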

Keywords: payment problems, cash flow, profitability, system dynamics.

1105 Enhanced Approaches to Rectify the Noise, Illumination and Shadow Artifacts

Authors: M. Sankari, C. Meena

Abstract:

Enhancing the quality of two-dimensional signals is one of the most important factors in the fields of video surveillance and computer vision. In real-life video surveillance, false detections usually occur due to the presence of random noise, illumination variations and shadow artifacts, and detection methods based on background subtraction face several problems in accurately detecting objects in realistic environments. In this paper, we propose a noise removal algorithm using a neighborhood comparison method with thresholding. Illumination variations in the detected foreground objects are corrected using an amalgamation of techniques: homomorphic decomposition, curvelet transformation and a gamma adjustment operator. Shadow is removed using a chromaticity estimator with a local relation estimator. Results are compared with existing methods and demonstrate high robustness in video surveillance.
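
A rough sketch of a neighbourhood-comparison denoising rule with thresholding is given below; the abstract does not state the authors' exact rule, window size or threshold, so those choices are assumptions, and the illumination and shadow stages are not reproduced.

```python
import numpy as np

def neighborhood_denoise(img, threshold=25):
    """Replace pixels that deviate strongly from their 3x3 neighbourhood mean.

    Works on a single-channel (grayscale) image; a pixel differing from the
    mean of its eight neighbours by more than `threshold` is treated as
    impulsive noise and replaced by that mean.
    """
    out = img.astype(np.float32).copy()
    padded = np.pad(out, 1, mode="edge")
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = padded[i:i + 3, j:j + 3]
            mean = (window.sum() - padded[i + 1, j + 1]) / 8.0
            if abs(out[i, j] - mean) > threshold:
                out[i, j] = mean
    return out.astype(img.dtype)
```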

Keywords: Chromaticity Estimator, Curvelet Transformation, Denoising, Gamma correction, Homomorphic, Neighborhood Assessment.

1104 Work Structuring and the Feasibility of Application to Construction Projects in Vietnam

Authors: Viet-Hung Nguyen, Luh-Maan Chang

Abstract:

Design should be viewed concurrently in three ways: as transformation, flow and value generation. An innovative approach to solving design-related problems is integrated product-process design. As a foundation for a formal framework consisting of organizing principles and techniques, Work Structuring has been developed to guide integration efforts that enhance the development of operation and process design in alignment with product design. Vietnamese construction projects are facing many delays and cost overruns caused mostly by design-related problems. Better design management that integrates product and process design could resolve these problems. A questionnaire survey and in-depth interviews were used to investigate the feasibility of applying Work Structuring to construction projects in Vietnam. The purpose of this paper is to present the research results and to illustrate the possible problems and potential solutions when Work Structuring is implemented in construction projects in Vietnam.

Keywords: integrated product-process design, Work Structuring, construction projects, Vietnam

1103 Prediction of Temperature Distribution during Drilling Process Using Artificial Neural Network

Authors: Ali Reza Tahavvor, Saeed Hosseini, Nazli Jowkar, Afshin Karimzadeh Fard

Abstract:

Experimental and numerical study of the temperature distribution during the milling process is important for milling quality and tool life. In the present study, the milling cross-section temperature is determined using Artificial Neural Networks (ANN) from the temperature of certain points of the workpiece, the point specifications and the rotational speed of the milling blade. First, a three-dimensional model of the workpiece is provided and then, using Computational Heat Transfer (CHT) simulations, the temperature at different nodes of the workpiece is obtained under steady-state conditions. The results obtained from CHT are used for training and testing the ANN. Using reverse engineering and setting the desired x, y, z coordinates and the rotational speed of the milling blade as input data to the network, the milling surface temperature determined by the neural network is presented as output. The temperatures at the desired points are obtained experimentally for different rotational speeds of the milling blade, and the milling surface temperature is obtained by extrapolation. A comparison among the ANN soft programming, the CHT results and the experimental data shows that the ANN code can be used efficiently to determine the temperature in a milling process.
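
A minimal sketch of the ANN stage, assuming scikit-learn and synthetic stand-in data in place of the CHT results (coordinates and rotational speed as inputs, temperature as output); network size and data ranges are arbitrary assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# synthetic stand-in for CHT samples: (x, y, z, rpm) -> temperature [K]
rng = np.random.default_rng(0)
X = rng.uniform([0, 0, 0, 500], [50, 50, 10, 3000], size=(2000, 4))
y = 300 + 0.8 * X[:, 3] / 100 + 2.0 * np.exp(-((X[:, 0] - 25) ** 2) / 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)                          # train on the "CHT" samples
print("R^2 on held-out samples:", round(ann.score(X_te, y_te), 3))
```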

Keywords: Milling process, rotational speed, Artificial Neural Networks, temperature.

1102 Analysis of Hard Turning Process of AISI D3-Thermal Aspects

Authors: B. Varaprasad, C. Srinivasa Rao

Abstract:

In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Besides its many advantages, the hard turning operation must be implemented so as to achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) based simulation of hard turning, using the commercial software DEFORM 3D, is compared to experimental results for stresses, temperatures and tool forces in the machining of AISI D3 steel with mixed ceramic inserts (CC6050). In the present analysis, orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed and depth of cut. An exhaustive friction modeling at the tool-work interfaces is carried out. Work material flow around the cutting edge is carefully modeled with an adaptive re-meshing simulation capability. In the process simulations, the feed rate and cutting speed are constant (0.075 mm/rev and 155 m/min, respectively), and the analysis is focused on stresses, forces and temperatures during machining. Close agreement is observed between the CAE simulation and the experimental values.

Keywords: Hard-turning, computer-aided engineering, computational machining, finite element method.

1101 Achieving Fair Share Objectives via Goal-Oriented Parallel Computer Job Scheduling Policies

Authors: Sangsuree Vasupongayya

Abstract:

Fair share is one of the scheduling objectives supported on many production systems. However, fair share has been shown to cause performance problems for some users, especially users with difficult jobs. This work focuses on extending goal-oriented parallel computer job scheduling policies to cover the fair share objective. Goal-oriented parallel computer job scheduling policies have been shown to achieve good scheduling performance when conflicting objectives are required. They achieve such performance by using anytime combinatorial search techniques to find a good compromise schedule within a time limit. The experimental results show that the proposed goal-oriented parallel computer job scheduling policy (namely Tradeofffs(Tw:avgX)) achieves good scheduling performance and also provides good fair share performance.

Keywords: goal-oriented parallel job scheduling policies, fair share.

1100 The Origin, Diffusion and a Comparison of Ordinary Differential Equations Numerical Solutions Used by SIR Model in Order to Predict SARS-CoV-2 in Nordic Countries

Authors: Gleda Kutrolli, Maksi Kutrolli, Etjon Meco

Abstract:

The SARS-CoV-2 virus is currently one of the most infectious pathogens for humans. It started in China at the end of 2019 and has now spread all over the world. The origin and diffusion of the SARS-CoV-2 epidemic are analysed based on a discussion of viral phylogeny theory. With the aim of understanding the spread of infection in the affected countries, it is crucial to model the spread of the virus and simulate its activity. In this paper, the prediction of the coronavirus outbreak is done using the SIR model without vital dynamics, applying different numerical techniques for solving ordinary differential equations (ODEs). We find that the ABM and MRT methods perform better than the other techniques and that the activity of the virus will decrease in April but never cease (for some time the activity will remain low); the next cycle will start in mid-July 2020 for Norway and Denmark, October 2020 for Sweden, and September 2020 for Finland.
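
For reference, a minimal SIR model without vital dynamics integrated with SciPy's general-purpose ODE solver; the parameters are illustrative, not the fitted values for the Nordic countries, and the comparison of numerical schemes (ABM, MRT, etc.) is not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    """SIR model without vital dynamics; S, I, R are population fractions."""
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

# hypothetical transmission and recovery rates
beta, gamma = 0.30, 0.10
y0 = [0.999, 0.001, 0.0]
t_span = (0, 180)

sol = solve_ivp(sir, t_span, y0, args=(beta, gamma),
                dense_output=True, rtol=1e-8)
t = np.linspace(*t_span, 181)
S, I, R = sol.sol(t)
print(f"peak infected fraction {I.max():.3f} on day {t[I.argmax()]:.0f}")
```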

Keywords: Forecasting, ordinary differential equations, SARS-CoV-2 epidemic, SIR model.

1099 Enhanced Performance of Fading Dispersive Channel Using Dynamic Frequency Hopping (DFH)

Authors: Walid M. Saad

Abstract:

Techniques are examined to overcome the performance degradation caused by channel dispersion, using slow frequency hopping (SFH) with dynamic frequency hopping (DFH) pattern adaptation. In DFH systems, the frequency slots are selected by continuous quality monitoring of all frequencies available in the system, and the hopping pattern of each individual link is modified by replacing slots whose signal-to-interference ratio (SIR) measurement is below a required threshold. Simulation results show the improvements in BER obtained by DFH in comparison with matched frequency hopping (MFH), random frequency hopping (RFH) and multi-carrier code division multiple access (MC-CDMA) in multipath slowly fading dispersive channels, using a generalized bandpass two-path transfer-function model, and show the improvement obtained according to the threshold selection.
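
A toy sketch of the slot-replacement idea, assuming per-slot SIR estimates are already available; the threshold, slot counts and ranking rule are hypothetical, and channel measurement and signalling are omitted.

```python
import numpy as np

def adapt_hopping_pattern(pattern, sir_db, all_slots, threshold_db=10.0):
    """Replace hopping slots whose measured SIR falls below the threshold.

    Spare slots (those not in the current pattern) are ranked by SIR and
    the best ones are swapped in for failing slots.
    """
    spares = sorted(set(all_slots) - set(pattern),
                    key=lambda s: sir_db[s], reverse=True)
    new_pattern = []
    for slot in pattern:
        if sir_db[slot] < threshold_db and spares:
            new_pattern.append(spares.pop(0))   # swap in the best spare slot
        else:
            new_pattern.append(slot)
    return new_pattern

# usage with hypothetical SIR measurements (dB) for 20 frequency slots
rng = np.random.default_rng(1)
sir = {s: v for s, v in enumerate(rng.uniform(0, 25, 20))}
print(adapt_hopping_pattern([3, 7, 11, 15], sir, range(20)))
```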

Keywords: code division multiple access (CDMA), dynamic channel allocation (DCA), dynamic channel assignment, frequency hopping, matched frequency hopping (MFH).

1098 Feature Based Dense Stereo Matching using Dynamic Programming and Color

Authors: Hajar Sadeghi, Payman Moallem, S. Amirhassn Monadjemi

Abstract:

This paper presents a new feature-based dense stereo matching algorithm that obtains the dense disparity map via dynamic programming. After extraction of suitable features, we use matching constraints such as the epipolar line, disparity limit, ordering, and a limit on the directional derivative of disparity. A coarse-to-fine multiresolution strategy is also used to decrease the search space and therefore increase the accuracy and processing speed. The proposed method links the detected feature points into chains and compares some of the feature points from different chains to increase the matching speed. We also employ color stereo matching to increase the accuracy of the algorithm. After feature matching, we use dynamic programming to obtain the dense disparity map. It differs from the classical DP methods in stereo vision, since it employs the sparse disparity map obtained from the feature-based matching stage, and the DP is performed on a scan line only between any two matched feature points on that scan line. Thus our algorithm is truly an optimization method and offers a good trade-off between accuracy and computational efficiency. According to our experiments, the proposed algorithm increases the accuracy by 20 to 70% and reduces the running time of the algorithm by almost 70%.
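
The dynamic programming stage can be illustrated on a single grey-level scan line, as below; this is a generic DP formulation with a smoothness penalty, not the paper's feature-constrained variant that restricts each search to the span between previously matched feature points.

```python
import numpy as np

def scanline_dp(left, right, max_disp=16, smooth=8.0):
    """Dynamic-programming disparity estimate for one pair of grey-level rows."""
    n, INF = len(left), 1e9
    cost = np.full((n, max_disp + 1), INF)
    back = np.zeros((n, max_disp + 1), dtype=int)
    cost[0, 0] = abs(float(left[0]) - float(right[0]))
    for x in range(1, n):
        for d in range(min(x, max_disp) + 1):
            match = abs(float(left[x]) - float(right[x - d]))
            # transition cost penalises disparity jumps between columns
            prev = cost[x - 1] + smooth * np.abs(d - np.arange(max_disp + 1))
            back[x, d] = int(prev.argmin())
            cost[x, d] = match + prev[back[x, d]]
    disp = np.zeros(n, dtype=int)
    disp[-1] = int(cost[-1].argmin())
    for x in range(n - 2, -1, -1):          # backtrack the optimal path
        disp[x] = back[x + 1, disp[x + 1]]
    return disp

# usage: right row is the left row shifted by 3 pixels, so disparity ~ 3
rng = np.random.default_rng(0)
row = rng.integers(0, 255, 64).astype(float)
print(scanline_dp(row, np.roll(row, -3)))
```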

Keywords: Chain Correspondence, Color Stereo Matching, Dynamic Programming, Epipolar Line, Stereo Vision.

1097 Mean Shift-based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data, acquired from airborne scanners, from densely built urban areas. On one hand, high resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while at the same time preserving the optical edges and, on the other, low resolution LiDAR data in the form of normalized Digital Surface Map (nDSM) is upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology for 3D reconstruction of buildings of a pilot region of Athens, Greece results in a significant visual improvement of the 3D building block model.
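
Edge-preserving smoothing with mean shift is available off the shelf in OpenCV; a one-call sketch is shown below (the file name and bandwidth parameters are hypothetical, and the joint RGB-z upsampling step of the paper is not reproduced).

```python
import cv2

# Mean shift edge-preserving smoothing of an aerial RGB tile
# (spatial radius 10 px and colour radius 20 are arbitrary choices).
img = cv2.imread("aerial_tile.png")          # hypothetical input file
assert img is not None, "input tile not found"
smoothed = cv2.pyrMeanShiftFiltering(img, 10, 20)
cv2.imwrite("aerial_tile_smoothed.png", smoothed)
```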

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift.

1096 Low-Cost Pre-Treatment of Pharmaceutical Wastewater

Authors: A. Abu-Safa, S. Abu-Salah, M. Mosa, S. Gharaibeh

Abstract:

Pharmaceutical industries and the effluents of sewage treatment plants are the main sources of residual pharmaceuticals in water resources. These emergent pollutants may adversely impact the biophysical environment. Pharmaceutical industries often generate wastewater whose characteristics and quantity change depending on the manufacturing processes used. Carbamazepine (CBZ), 5H-dibenzo[b,f]azepine-5-carboxamide (C15H12N2O), is a significant non-biodegradable pharmaceutical contaminant in Jordanian pharmaceutical wastewater which is not removed by the activated sludge processes in treatment plants. Activated carbon could potentially remove this pollutant from effluents, but the high cost involved suggests that more attention should be given to the potential use of low-cost materials in order to reduce cost and environmental contamination. Powders of Jordanian non-metallic raw materials, namely Azraq Bentonite (AB), Kaolinite (K), and Zeolite (Zeo), were activated (by acid and thermal treatment) and evaluated for CBZ removal. The results of batch and column experiments showed around 46% and 67% removal of CBZ, respectively.

Keywords: Azraq bentonite, carbamazepine, pharmaceutical wastewater, zeolite.

1095 Design Optimization Methodology of CMOS Active Mixers for Multi-Standard Receivers

Authors: S. Douss, F. Touati, M. Loulou

Abstract:

A design flow for multi-standard down-conversion CMOS mixers for three modern standards, Global System for Mobile communications, Digital Enhanced Cordless Telecommunications and the Universal Mobile Telecommunications System, is presented. Three active mixer structures are studied. The first is based on the Gilbert cell, which gives a tolerable noise figure and linearity with a low conversion gain. The second and third structures use the current bleeding and charge injection techniques in order to increase the conversion gain. An improvement of about 2 dB in the conversion gain is achieved without a considerable degradation of the other characteristics. The models used for noise figure, conversion gain and IIP3 are studied. This study describes the nature of the trade-offs inherent in such structures and gives insights that help in identifying which structure is better for given conditions.

Keywords: Active mixer, radio-frequency transceiver, multi-standard front end, Gilbert cell, current bleeding, charge injection.

1094 Analysis of Electrocardiograph (ECG) Signal for the Detection of Abnormalities Using MATLAB

Authors: Durgesh Kumar Ojha, Monica Subashini

Abstract:

The proposed method is to study and analyze the Electrocardiograph (ECG) waveform to detect abnormalities with reference to the P, Q, R and S peaks. The first phase includes the acquisition of real-time ECG data. In the next phase, signals are generated and then pre-processed. Thirdly, the procured ECG signal is subjected to feature extraction. The extracted features detect abnormal peaks present in the waveform, so normal and abnormal ECG signals can be differentiated based on the features extracted. The work is implemented in the familiar multipurpose tool MATLAB. This software efficiently uses algorithms and techniques for the detection of any abnormalities present in the ECG signal. Proper utilization of MATLAB functions (both built-in and user-defined) allows ECG signals to be processed and analysed in real-time applications. The simulation would help in improving the accuracy, and the hardware could be built conveniently.
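
Although the paper works in MATLAB, the peak-detection idea carries over directly; below is a minimal Python sketch using SciPy's find_peaks, where the file name, sampling rate and thresholds are assumptions rather than the paper's settings.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 360                                      # assumed sampling rate [Hz]
ecg = np.loadtxt("ecg_record.txt")            # hypothetical single-lead trace

# R peaks: tall maxima separated by at least 250 ms
r_peaks, _ = find_peaks(ecg, height=0.6 * ecg.max(), distance=int(0.25 * fs))
rr = np.diff(r_peaks) / fs                    # RR intervals in seconds
print("mean heart rate:", round(60.0 / rr.mean(), 1), "bpm")
```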

Keywords: ECG Waveform, Peak Detection, Arrhythmia, Matlab.

1093 The Analysis of the Impact of Urbanization on Urban Meteorology from Urban Growth Management Perspective

Authors: Hansung Wan, Hyungkwan Cho, Kiho Sung, Hongkyu Kim

Abstract:

In this study, the amount of urban artificial heat, which affects the urban temperature rise, was investigated in order to clarify the relationships between urbanization and urban meteorology. A model for measuring the amount of urban artificial heat was established and theoretically tested; the calculations revealed that urban artificial heat increased the urban temperature by plus or minus 0.23 °C in 2007 compared with 1996. Statistical methods (correlation and regression analysis) were then used to clarify the relationships between urbanization and urban weather. From an urban growth management perspective, this research suggests that new design techniques and urban growth management are necessary at the city design phase to reduce the urban temperature rise and the urban torrential rain, caused by urbanization, which can produce urban disasters.

Keywords: The amount of urban artificial heat, Urban growth management, Urbanization, Urban meteorology

1092 Evaluation Method for Information Security Levels of CIIP (Critical Information Infrastructure Protection)

Authors: Soon-Tai Park, Jong-Whoi Shin, Bog-Ki Min, Ik-Sub Lee, Gang-Shin Lee, Jae-Il Lee

Abstract:

As the information age matures, major social infrastructures such as communication, finance, military and energy have become ever more dependent on information communication systems. Since these infrastructures are connected to the Internet, electronic intrusions such as hacking and viruses have become a new security threat. In particular, disturbance or neutralization of a major social infrastructure can result in extensive material damage and social disorder. To address this issue, many nations around the world are researching and developing various techniques and information security policies as a government-wide effort to protect their infrastructures from newly emerging threats. This paper proposes an evaluation method for the information security levels of CIIP (Critical Information Infrastructure Protection), which can enhance the security level of critical information infrastructure by checking the current security status and establishing security measures accordingly to protect infrastructures effectively.

Keywords: Information Security Evaluation Methodology, Critical Information Infrastructure Protection.

1091 A Real-Time Image Change Detection System

Authors: Madina Hamiane, Amina Khunji

Abstract:

Detecting changes in multiple images of the same scene has recently seen increased interest due to the many contemporary applications including smart security systems, smart homes, remote sensing, surveillance, medical diagnosis, weather forecasting, speed and distance measurement, post-disaster forensics and much more. These applications differ in the scale, nature, and speed of change. This paper presents an application of image processing techniques to implement a real-time change detection system. Change is identified by comparing the RGB representation of two consecutive frames captured in real-time. The detection threshold can be controlled to account for various luminance levels. The comparison result is passed through a filter before decision making to reduce false positives, especially at lower luminance conditions. The system is implemented with a MATLAB Graphical User interface with several controls to manage its operation and performance.
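
A compact sketch of the same frame-differencing idea is given below in Python/OpenCV (the paper itself is implemented as a MATLAB GUI); the threshold, minimum blob size and the median filter used to suppress false positives are assumptions.

```python
import cv2

def detect_change(prev, curr, threshold=30, min_pixels=500):
    """Flag a change between two consecutive RGB frames.

    Per-channel difference -> grayscale -> binary threshold -> median
    filter to reduce speckle, then count the changed pixels.
    """
    diff = cv2.absdiff(prev, curr)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    mask = cv2.medianBlur(mask, 5)
    return cv2.countNonZero(mask) > min_pixels, mask

# usage on a live camera feed
cap = cv2.VideoCapture(0)
ok, prev = cap.read()
while ok:
    ok, curr = cap.read()
    if not ok:
        break
    changed, _ = detect_change(prev, curr)
    if changed:
        print("change detected")
    prev = curr
cap.release()
```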

Keywords: Image change detection, Image processing, image filtering, thresholding, B/W quantization.

1090 Dynamic Fuzzy-Neural Network Controller for Induction Motor Drive

Authors: M. Zerikat, M. Bendjebbar, N. Benouzza

Abstract:

In this paper, a novel approach for robust trajectory tracking of an induction motor drive is presented. By combining variable structure systems theory with fuzzy logic concepts and neural network techniques, a new algorithm is developed. Fuzzy logic is used to adapt the learning algorithm in order to improve the robustness of the learning and operation of the neural network. The developed control algorithm is robust to parameter variations and external influences. It also assures precise trajectory tracking with the prescribed dynamics. The algorithm was verified by simulation, and the results obtained demonstrate the effectiveness of the designed controller for induction motor drives, which are highly nonlinear, complex dynamic systems whose characteristics vary over the operating conditions.

Keywords: Induction motor, fuzzy-logic control, neural network control, indirect field oriented control.

1089 Grid Coordination with Marketmaker Agents

Authors: Xin Bai, Kresimir Sivoncik, Damla Turgut, Ladislau Bölöni

Abstract:

Market-based models are frequently used for resource allocation on the computational grid. However, as the size of the grid grows, it becomes difficult for the customer to negotiate directly with all the providers. Middle agents are introduced to mediate between providers and customers and facilitate the resource allocation process. The most frequently deployed middle agents are matchmakers and brokers. The matchmaking agent finds possible candidate providers who can satisfy the requirements of the consumers, after which the customer directly negotiates with the candidates. The broker agents mediate the negotiation with the providers in real time. In this paper we present a new type of middle agent, the marketmaker. Its operation is based on two parallel processes: through the investment process the marketmaker acquires resources and resource reservations in large quantities, while through the resale process it sells them to the customers. The marketmaker relies on the fact that, through its global view of the grid, it can perform a more efficient resource allocation than is possible in one-to-one negotiations between customers and providers. We present the operation and algorithms governing the marketmaker agent, contrasting it with the matchmaker and broker agents. Through a series of simulations in the task-oriented domain we compare the operation of the three agent types. We find that the use of the marketmaker agent leads to better performance in the allocation of large tasks and a significant reduction of the messaging overhead.

Keywords: grid computing, autonomous agents, market-based grid

1088 Concepts Extraction from Discharge Notes using Association Rule Mining

Authors: Basak Oguz Yolcular

Abstract:

A large amount of valuable information is available in plain-text clinical reports, and new techniques and technologies are being applied to extract information from these reports. In this study, we developed a domain-based software system to transform 600 otorhinolaryngology discharge notes into a structured form for extracting clinical data. In order to decrease the system processing time, the discharge notes were transformed into a data table after preprocessing. Several word lists were constituted to identify common sections in the discharge notes, including patient history, age, problems, and diagnosis. The n-gram method was used for discovering term co-occurrences within each section. Using this method, a dataset of concept candidates was generated for the validation step, and then the Predictive Apriori algorithm for Association Rule Mining (ARM) was applied to validate the candidate concepts.
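
The candidate-generation step can be sketched as plain n-gram counting over note sections, as below (toy sentences, not the study's data); the Predictive Apriori validation applied afterwards is not reproduced here.

```python
from collections import Counter

def ngrams(tokens, n=2):
    """Word n-grams from one tokenised section of a discharge note."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# toy section texts standing in for the preprocessed discharge-note table
sections = [
    "chronic otitis media left ear".split(),
    "chronic otitis media right ear".split(),
    "acute sinusitis with nasal polyps".split(),
]

# concept candidates = bigrams that occur in at least two sections
counts = Counter(g for sec in sections for g in set(ngrams(sec, 2)))
candidates = [g for g, c in counts.items() if c >= 2]
print(candidates)   # e.g. ['chronic otitis', 'otitis media']
```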

Keywords: association rule mining, otorhinolaryngology, predictive apriori, text mining

1087 Studying the Causes and Affecting Factors of Motorcycle Accidents: A Case Study on the Road Accidents in Zanjan Province (IRAN) - 2007

Authors: A. Beheshti, S. Salkhordeh, H. Amini

Abstract:

Based on statistics released by the Islamic Republic of Iran Police (IRIP), of the 9555 motorcycle accidents that happened in 2007, 857 riders died and 11219 were injured. If we also consider the death toll and injuries from other vehicles' accidents resulting from traffic violations by motorcycle riders, then paying attention to motorcycle accidents becomes very necessary. Therefore, in this study we investigated the traits and issues related to production, application and training, along with the causes of motorcycle accidents, from the four perspectives of road, human, environment and vehicle, based on statistical and geographical analysis of the accident sheets prepared by the Iran Road Patrol Department (IRPD). Unfamiliarity of riders with regulations and motorcycling techniques, disuse of safety equipment, inadequacy of roads and junction designs for the safe movement of motorcycles and, finally, the lack of sufficient control by the responsible organizations are among the major causes of these accidents.

Keywords: Motorcycle, Motorcycle riders, Road accidents, Statistical analysis of accidents.

1086 Structural Characteristics of Three-Dimensional Random Packing of Aggregates with Wide Size Distribution

Authors: Kasthurirangan Gopalakrishnan, Naga Shashidhar

Abstract:

The mechanical properties of granular solids are dependent on the flow of stresses from one particle to another through inter-particle contact. Although some experimental methods have been used to study the inter-particle contacts in the past, preliminary work with these techniques indicated that they do not have the necessary resolution to distinguish between those contacts that transmit the load and those that do not, especially for systems with a wide distribution of particle sizes. In this research, computer simulations are used to study the nature and distribution of contacts in a compact with wide particle size distribution, representative of aggregate size distribution used in asphalt pavement construction. The packing fraction, the mean number of contacts and the distribution of contacts were studied for different scenarios. A methodology to distinguish and compute the fraction of load-bearing particles and the fraction of space-filling particles (particles that do not transmit any force) is needed for further investigation.
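
A brute-force sketch of contact counting in a sphere packing is shown below; the centres here are random placeholders (in the study they would come from the packing simulation), and distinguishing load-bearing from space-filling particles would additionally require a force criterion.

```python
import numpy as np
from itertools import combinations

def mean_coordination_number(centres, radii, tol=1e-3):
    """Count inter-particle contacts in a sphere packing.

    Two spheres are taken to be in contact when the gap between their
    surfaces is below `tol`; an O(N^2) check is enough for a small packing.
    """
    contacts = np.zeros(len(centres), dtype=int)
    for i, j in combinations(range(len(centres)), 2):
        gap = np.linalg.norm(centres[i] - centres[j]) - (radii[i] + radii[j])
        if gap < tol:
            contacts[i] += 1
            contacts[j] += 1
    return contacts.mean(), contacts

# placeholder "packing" with a wide size distribution
rng = np.random.default_rng(2)
centres = rng.uniform(0, 10, size=(200, 3))
radii = rng.uniform(0.1, 1.0, size=200)
print("mean coordination number:", mean_coordination_number(centres, radii)[0])
```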

Keywords: Computer simulation, three-dimensional particle packing, coordination number, asphalt concrete, aggregates.

1085 Application of CFD for Air Flow Analysis underneath Natural Ventilation with Forced Convection in Roof Attic

Authors: C. Nutphuang, S. Chirarattananon, V.D. Hien

Abstract:

In research on natural ventilation and passive cooling with forced convection, it is essential to know how heat flows in a solid object and the pattern of temperature distribution on its surfaces, and eventually how air flows through and convects heat from the surfaces of the steel under the roof. This paper presents results from a computational fluid dynamics (CFD) program, comparing natural ventilation and forced convection within a roof attic that receives direct solar radiation. The CFD model of air flow inside the roof attic was set up for two cases: the first case, analysed under natural ventilation, has a closed roof attic, and the second case, analysed under forced convection, has an open roof attic. For each case, the model predicts variations such as the temperature, pressure and mass flow rate distributions within the roof attic. The comparison shows that this CFD program is an effective model for predicting the air temperature and heat transfer coefficient distribution within a roof attic. The results show that forced convection can help reduce heat transfer through the roof attic, and the area around the steel core has a lower inner-zone temperature than under natural ventilation. The temperature difference at the steel core of the roof attic between the two cases was 10-15 K.

Keywords: CFD program, natural ventilation, forced convection, heat transfer, air flow.

1084 A Neuro-Fuzzy Approach Based Voting Scheme for Fault Tolerant Systems Using Artificial Bee Colony Training

Authors: D. Uma Devi, P. Seetha Ramaiah

Abstract:

Voting algorithms are extensively used to make decisions in fault tolerant systems where each redundant module gives inconsistent outputs. Popular voting algorithms include majority voting, weighted voting, and inexact majority voters. Each of these techniques suffers from scenarios where agreement does not exist for the given voter inputs. This has been successfully overcome in the literature using fuzzy theory. Our previous work concentrated on a neuro-fuzzy algorithm where training using the neural system substantially improved the prediction result of the voting system. However, weight training of the Neural Network is sub-optimal. This study proposes to optimize the weights of the Neural Network using the Artificial Bee Colony algorithm. Experimental results show the proposed system improves the decision making of the voting algorithms.

Keywords: Voting algorithms, Fault tolerance, Fault masking, Neuro-Fuzzy System (NFS), Artificial Bee Colony (ABC)

1083 Modelling Medieval Vaults: Digital Simulation of the North Transept Vault of St Mary, Nantwich, England

Authors: N. Webb, A. Buchanan

Abstract:

Digital and virtual heritage is often associated with the recreation of lost artefacts and architecture; however, we can also investigate works that were not completed, using digital tools and techniques. Here we explore physical evidence of a fourteenth-century Gothic vault located in the north transept of St Mary’s church in Nantwich, Cheshire, using existing springer stones that are built into the walls as a starting point. Digital surveying tools are used to document the architecture, followed by an analysis process to hypothesise and simulate possible design solutions, had the vault been completed. A number of options, both two-dimensionally and three-dimensionally, are discussed based on comparison with examples of other contemporary vaults, thus adding another specimen to the corpus of vault designs. Dissemination methods such as digital models and 3D prints are also explored as possible resources for demonstrating what the finished vault might have looked like for heritage interpretation and other purposes.

Keywords: Digital simulation, heritage interpretation, medieval vaults, virtual heritage, 3D scanning.

1082 Automatic Inspection of Percussion Caps by Means of Combined 2D and 3D Machine Vision Techniques

Authors: A. Tellaeche, R. Arana, I. Maurtua

Abstract:

Exhaustive quality control is becoming more and more important when commercializing competitive products in the world's globalized market. Taking this as a given, quality control becomes critical in market sectors that must meet the strictest quality requirements. One such example is the mass production of percussion caps, a critical element assembled in firearm ammunition. These elements, built in great quantities at very high speed, must achieve a minimum tolerance deviation in their fabrication, due to their vital importance in firing the round of ammunition into which they are built. This paper outlines a machine vision development for the 100% inspection of percussion caps, obtaining data from simultaneous 2D and 3D images. The acquisition speed and precision of these images of a metallic reflective piece such as a percussion cap, the accuracy of the measurements taken from these images, and the multiple fabrication errors detected constitute the main findings of this work.

Keywords: critical tolerance, high speed decision making, simultaneous 2D/3D machine vision.

1081 Cloud Monitoring and Performance Optimization Ensuring High Availability and Security

Authors: Inayat Ur Rehman, Georgia Sakellari

Abstract:

Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.

Keywords: Cloud computing, cloud monitoring, performance optimization, high availability.

1080 Modelling Customer's Attitude Towards E-Government Services

Authors: Norazah Mohd Suki, T Ramayah

Abstract:

e-Government structures permit the government to operate in a more transparent and accountable manner, which increases the power of the individual in relation to that of the government. This paper identifies the factors that determine customers' attitude towards e-Government services using a theoretical model based on the Technology Acceptance Model. Data relating to the constructs were collected from 200 respondents. The research model was tested using Structural Equation Modeling (SEM) techniques via the Analysis of Moment Structures (AMOS 16) computer software. SEM is a comprehensive approach to testing hypotheses about relations among observed and latent variables. The proposed model fits the data well. The results demonstrate that the acceptance of e-Government services can be explained in terms of compatibility and attitude towards e-Government services. If the setup of e-Government services is compatible with the way users work, users are more likely to adopt them, owing to their familiarity with the Internet for various official, personal, and recreational uses. In addition, managerial implications for government policy makers, government agencies, and system developers are also discussed.

Keywords: E-government, structural equation modelling, attitude, service.

1079 Using Automated Database Reverse Engineering for Database Integration

Authors: M. R. Abbasifard, M. Rahgozar, A. Bayati, P. Pournemati

Abstract:

One important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and a lack of suitable correlations between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy databases. They require particular attention since they need more effort to be normalized, reformatted and moved to the modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering requires data normalization. This paper proposes the use of database reverse engineering in order to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
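
As a small illustration of the extraction step, the sketch below pulls table and column definitions out of a legacy database, assuming for simplicity that it is an SQLite file; the transformation-rule format shown at the end is hypothetical.

```python
import sqlite3

def extract_schema(db_path):
    """Recover table and column definitions from a legacy SQLite database.

    This is only the structure-extraction step; the recovered schema would
    then be normalized and mapped onto the global schema.
    """
    conn = sqlite3.connect(db_path)
    schema = {}
    for (table,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"):
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        schema[table] = [(c[1], c[2]) for c in cols]   # (name, declared type)
    conn.close()
    return schema

# a hypothetical transformation rule: rename and retype a legacy column
rules = {("CUST", "CUST_NM"): ("customer", "name", "TEXT")}
print(extract_schema("legacy.db"), rules)
```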

Keywords: Reverse Engineering, Database Integration, System Integration, Data Structure Normalization
