Search results for: peak measures.
741 Effect of Cooled EGR in Combustion Characteristics of a Direct Injection CI Engine Fuelled with Biodiesel Blend
Authors: Sankar Chandrasekar, Rana Niranchan V.S., Joseph Sidharth Leon
Abstract:
As the demand for and prices of various petroleum products have risen in recent years, there is a growing need for alternative fuels. Biodiesel, which consists of alkyl monoesters of fatty acids from vegetable oils and animal fats, is considered an alternative to petroleum diesel. Biodiesel has performance comparable to that of diesel and a lower brake specific fuel consumption, with significant reductions in emissions of CO, hydrocarbons (HC) and smoke, albeit with a slight increase in NOx emissions. This paper analyzes the effect of cooled exhaust gas recirculation on the combustion characteristics of a direct injection compression ignition engine using biodiesel blended fuel, as opposed to the conventional system. Combustion parameters such as cylinder pressure, heat release rate, delay period and peak pressure were analyzed at various loads. The maximum cylinder pressure decreases as the fraction of biodiesel in the blend increases, while the maximum rate of pressure rise was found to be higher for diesel at higher engine loads.
Keywords: Cylinder pressure, delay period, EGR, heat release.
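The heat release rate analyzed in this abstract is commonly derived from the measured cylinder pressure with a single-zone first-law formula; the sketch below illustrates that calculation on synthetic traces (the pressure curve, volume curve, crank-angle grid and ratio of specific heats are illustrative assumptions, not data from the paper).

```python
import numpy as np

def heat_release_rate(theta, p, V, gamma=1.35):
    """Single-zone heat release rate dQ/dtheta [J/deg] from cylinder pressure
    p [Pa] and cylinder volume V [m^3] versus crank angle theta [deg]:
    dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta."""
    dV = np.gradient(V, theta)
    dp = np.gradient(p, theta)
    return gamma / (gamma - 1.0) * p * dV + 1.0 / (gamma - 1.0) * V * dp

# Illustrative (synthetic) traces, not measured engine data
theta = np.linspace(-60.0, 60.0, 241)                  # crank angle, deg
V = 5e-4 + 4.5e-4 * (1 - np.cos(np.radians(theta)))    # toy volume curve, m^3
p = 2e6 + 4e6 * np.exp(-((theta - 8.0) / 12.0) ** 2)   # toy pressure curve, Pa

dQ = heat_release_rate(theta, p, V)
print("peak heat release rate: %.1f J/deg at %.1f deg" % (dQ.max(), theta[np.argmax(dQ)]))
```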
740 Quantitative Evaluation of Frameworks for Web Applications
Authors: Thirumalai Selvi, N. V. Balasubramanian, P. Sheik Abdul Khader
Abstract:
An empirical study of web applications that use software frameworks is presented here. The analysis is based on two approaches. In the first, developers using such frameworks are required, based on their experience, to assign weights to parameters such as database connection. In the second approach, a performance testing tool, OpenSTA, is used to compute start time and other such measures. From such an analysis, it is concluded that open source software is superior to proprietary software. The motivation behind this research is to examine ways in which a quantitative assessment can be made of software in general and frameworks in particular. Concepts such as metrics and architectural styles are discussed along with previously published research.
Keywords: Metrics, Frameworks, Performance Testing, Web Applications, Open Source.
739 Transient Analysis of Central Region Void Fraction in a 3x3 Rod Bundle under Bubbly and Cap/Slug Flows
Authors: Ya-Chi Yu, Pei-Syuan Ruan, Shao-Wen Chen, Yu-Hsien Chang, Jin-Der Lee, Jong-Rong Wang, Chunkuan Shih
Abstract:
This study analyzed the transient signals of central region void fraction of air-water two-phase flow in a 3x3 rod bundle. Experimental tests were carried out using a vertical rod bundle test section along with an air-water supply/flow control system, and the transient signals of the central region void fraction were collected through electrical conductivity sensors as well as visualized via high speed photography. By converting the electric signals, the transient void fraction can be obtained through the voltage ratios. With a fixed superficial water velocity (Jf = 0.094 m/s), two different superficial air velocities (Jg = 0.094 m/s and 0.236 m/s) were tested and presented, corresponding to the flow conditions of bubbly flows and cap/slug flows, respectively. The time-averaged central region void fraction was obtained as 0.109-0.122 with a 0.028 standard deviation for the selected bubbly flow and 0.188-0.221 with a 0.101 standard deviation for the selected cap/slug flow, respectively. Through Fast Fourier Transform (FFT) analysis, no clear frequency peak was found in the bubbly flow, while two dominant frequencies were identified around 1.6 Hz and 2.5 Hz in the present cap/slug flow.
Keywords: Central region, rod bundles, transient void fraction, two-phase flow.
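A minimal sketch of the kind of FFT-based frequency analysis described above, applied to a synthetic void-fraction signal (the sampling rate and the signal itself are illustrative assumptions, not the experimental data):

```python
import numpy as np

fs = 100.0                      # assumed sampling rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)

# Synthetic void-fraction signal with two oscillation components plus noise
alpha = 0.2 + 0.05 * np.sin(2 * np.pi * 1.6 * t) \
            + 0.03 * np.sin(2 * np.pi * 2.5 * t) \
            + 0.01 * np.random.randn(t.size)

print("time-averaged void fraction: %.3f (std %.3f)" % (alpha.mean(), alpha.std()))

# FFT of the fluctuating component to locate dominant frequencies
spectrum = np.abs(np.fft.rfft(alpha - alpha.mean()))
freqs = np.fft.rfftfreq(alpha.size, d=1.0 / fs)
top = freqs[np.argsort(spectrum)[-2:]]
print("dominant frequencies: %.2f Hz, %.2f Hz" % tuple(sorted(top)))
```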
738 An Efficient Method of Shot Cut Detection
Authors: Lenka Krulikovská, Jaroslav Polec
Abstract:
In this paper, we present a method of abrupt cut detection with a novel logic of frame comparison. The actual frame is compared with its motion-estimated prediction instead of with the successive frame. Four different similarity metrics were employed to estimate the resemblance of the compared frames. The obtained results were evaluated by standard measures of test accuracy and compared with an existing approach. Based on the results, we claim the proposed method is more effective, and the Pearson correlation coefficient obtained the best results among the chosen similarity metrics.
Keywords: Abrupt cut, mutual information, shot cut detection, Pearson correlation coefficient.
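A minimal sketch of one of the similarity metrics named above: the Pearson correlation coefficient between a frame and its prediction, with a drop below a threshold flagged as an abrupt cut (the frames, the threshold value and the way the prediction is produced are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def frame_correlation(frame, predicted):
    """Pearson correlation coefficient between a frame and its
    motion-estimated prediction (both as 2-D grayscale arrays)."""
    return np.corrcoef(frame.ravel(), predicted.ravel())[0, 1]

def is_abrupt_cut(frame, predicted, threshold=0.5):
    # A low correlation means the prediction failed badly -> likely shot cut
    return frame_correlation(frame, predicted) < threshold

# Illustrative frames: a prediction close to the original vs. an unrelated frame
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160)).astype(float)
good_prediction = frame + rng.normal(0, 5, size=frame.shape)
unrelated_frame = rng.integers(0, 256, size=(120, 160)).astype(float)

print(is_abrupt_cut(frame, good_prediction))   # False: frames match
print(is_abrupt_cut(frame, unrelated_frame))   # True: likely cut
```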
737 Evaluation of Research in the Field of Energy Efficiency and MCA Methods Using Publications Databases
Authors: Juan Sepúlveda
Abstract:
Energy is a fundamental component of sustainability; access to and use of this resource are related to economic growth, social improvements, and environmental impacts. In this sense, energy efficiency has been studied as a factor that enhances the positive impacts of energy in communities; however, the implementation of efficiency requires strong policy and strategies that usually rely on individual measures focused on independent dimensions. In this paper, the problem of energy efficiency is studied as a multi-objective problem, using scientometric analysis to discover trends and patterns that allow identification of the main variables and study approaches related to the further development of models that integrate energy efficiency and MCA into policy making for small communities.
Keywords: Energy efficiency, MCA, Scientometrics, trends.
736 Query Reformulation Guided by External Resource for Information Retrieval
Authors: Mohammed El Amine Abderrahim
Abstract:
Reformulating the user query is a technique that aims to improve the performance of an Information Retrieval System (IRS) in terms of precision and recall. This paper evaluates the technique of query reformulation guided by an external resource for Arabic texts. To do this, various precision and recall measurements were carried out, and two corpora with different external resources, namely Arabic WordNet (AWN) and the Arabic Dictionary (thesaurus) of Meaning (ADM), were used. Examination of the obtained results allows us to measure the real contribution of this reformulation technique to improving IRS performance.
Keywords: Arabic NLP, Arabic Information Retrieval, Arabic WordNet, Query Expansion.
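A minimal sketch of the precision and recall measures used in this kind of evaluation, comparing a baseline query against a thesaurus-expanded one (the toy documents, query terms and synonym list are illustrative assumptions, not the Arabic corpora or resources used in the paper):

```python
def precision_recall(retrieved, relevant):
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Toy corpus: document id -> set of index terms
corpus = {
    1: {"car", "engine"},
    2: {"automobile", "tyre"},
    3: {"bicycle", "wheel"},
    4: {"car", "road"},
}
relevant_docs = {1, 2, 4}                        # judged relevant to the need
query = {"car"}
synonyms = {"car": {"automobile", "vehicle"}}    # external resource (thesaurus)

def retrieve(terms):
    return [doc for doc, words in corpus.items() if terms & words]

baseline = retrieve(query)
expanded_terms = query | set().union(*(synonyms.get(t, set()) for t in query))
expanded = retrieve(expanded_terms)

print("baseline P/R:", precision_recall(baseline, relevant_docs))   # (1.0, ~0.67)
print("expanded P/R:", precision_recall(expanded, relevant_docs))   # (1.0, 1.0)
```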
735 The Model Establishment and Analysis of TRACE/FRAPTRAN for Chinshan Nuclear Power Plant Spent Fuel Pool
Authors: J. R. Wang, H. T. Lin, Y. S. Tseng, W. Y. Li, H. C. Chen, S. W. Chen, C. Shih
Abstract:
TRACE is developed by the U.S. NRC for nuclear power plant (NPP) safety analysis. In this research, we focus on the establishment and application of TRACE/FRAPTRAN/SNAP models for the Chinshan NPP (BWR/4) spent fuel pool. The geometry of the spent fuel pool is 12.17 m × 7.87 m × 11.61 m. In this study, there are three TRACE/SNAP models: a one-channel, a two-channel, and a multi-channel TRACE/SNAP model. Additionally, a cooling system failure of the spent fuel pool was simulated and analyzed using the above models. According to the analysis results, the peak cladding temperature response was more accurate in the multi-channel TRACE/SNAP model. The results showed that the fuel became uncovered 2.7 days after the cooling system failed. In order to estimate the detailed fuel rod performance, the FRAPTRAN code was used in this research. According to the FRAPTRAN results, the highest cladding temperature was located at node 21 of the fuel rod (the highest node being node 23), and the cladding burst roughly after 3.7 days.
Keywords: TRACE, FRAPTRAN, SNAP, spent fuel pool.
734 Influence of Ambiguity Cluster on Quality Improvement in Image Compression
Authors: Safaa Al-Ali, Ahmad Shahin, Fadi Chakik
Abstract:
Image coding based on clustering provides immediate access to targeted features of interest in a high quality decoded image. This approach is useful for intelligent devices, as well as for multimedia content-based description standards. The result of image clustering cannot be precise in some positions, especially on pixels with edge information, which produces ambiguity among the clusters. Even with a good enhancement operator based on PDEs, the quality of the decoded image will highly depend on the clustering process. In this paper, we introduce an ambiguity cluster in image coding to represent pixels with vagueness properties. The presence of such a cluster allows preserving some details inherent to edges as well as to uncertain pixels. It will also be very useful during the decoding phase, in which an anisotropic diffusion operator, such as Perona-Malik, enhances the quality of the restored image. This work also offers a comparative study to demonstrate the effectiveness of a fuzzy clustering technique in detecting the ambiguity cluster without losing much of the essential image information. Several experiments have been carried out to demonstrate the usefulness of the ambiguity concept in image compression. The coding results and the performance of the proposed algorithms are discussed in terms of the peak signal-to-noise ratio and the quantity of ambiguous pixels.
Keywords: Ambiguity Cluster, Anisotropic Diffusion, Fuzzy Clustering, Image Compression.
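A minimal sketch of the two evaluation quantities mentioned above: PSNR between an original and a decoded image, and a count of "ambiguous" pixels taken here as those whose two highest fuzzy membership values are nearly equal (the images, the membership matrix and the closeness margin are illustrative assumptions, not the paper's exact definition):

```python
import numpy as np

def psnr(original, decoded, peak=255.0):
    mse = np.mean((original.astype(float) - decoded.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def count_ambiguous(memberships, margin=0.1):
    """memberships: array of shape (n_pixels, n_clusters) with fuzzy membership
    degrees per pixel. A pixel counts as ambiguous when its two largest
    memberships differ by less than `margin`."""
    top2 = np.sort(memberships, axis=1)[:, -2:]
    return int(np.sum(top2[:, 1] - top2[:, 0] < margin))

# Illustrative data
rng = np.random.default_rng(1)
original = rng.integers(0, 256, size=(64, 64))
decoded = np.clip(original + rng.normal(0, 6, size=original.shape), 0, 255)
u = rng.dirichlet(np.ones(3), size=64 * 64)     # toy fuzzy memberships, 3 clusters

print("PSNR: %.2f dB" % psnr(original, decoded))
print("ambiguous pixels:", count_ambiguous(u))
```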
733 Efficient Design of Distribution Logistics by Using a Model-Based Decision Support System
Abstract:
The design of distribution logistics has a decisive impact on a company's logistics costs and performance; hence, such solutions make an essential contribution to corporate success. This article describes a decision support system for analyzing the potential of distribution logistics in terms of logistics costs and performance. In contrast to previous procedures of business process re-engineering (BPR), this method maps distribution logistics holistically under variable distribution structures. Combined with qualitative measures, the decision support system will contribute to a more efficient design of distribution logistics.
Keywords: Decision support system, distribution logistics, potential analyses, supply chain management.
732 Topology Preservation in SOM
Authors: E. Arsuaga Uriarte, F. Díaz Martín
Abstract:
The SOM has several beneficial features which make it a useful method for data mining. One of the most important features is the ability to preserve the topology in the projection. There are several measures that can be used to quantify the goodness of the map in order to obtain the optimal projection, including the average quantization error and many topological errors. Many researchers have studied how topology preservation should be measured. One option consists of using the topographic error, which considers the ratio of data vectors for which the first and second best-matching units (BMUs) are not adjacent. In this work, we present a study of the behaviour of the topographic error in different kinds of maps. We have found that this error undervalues rectangular maps, and we have studied the reasons why this happens. Finally, we suggest a new topological error to improve this deficiency of the topographic error.
Keywords: Map lattice, Self-Organizing Map, topographic error, topology preservation.
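A minimal sketch of the topographic error described above for a rectangular SOM lattice: the fraction of data vectors whose first and second best-matching units are not neighbours on the grid (the codebook, the data and the 8-neighbourhood adjacency rule are illustrative assumptions):

```python
import numpy as np

def topographic_error(data, codebook, grid_shape):
    """data: (n, d) samples; codebook: (rows*cols, d) SOM weight vectors laid
    out row by row on a rectangular grid of shape grid_shape = (rows, cols).
    Returns the fraction of samples whose 1st and 2nd BMUs are not adjacent."""
    rows, cols = grid_shape
    errors = 0
    for x in data:
        d = np.linalg.norm(codebook - x, axis=1)
        bmu1, bmu2 = np.argsort(d)[:2]
        r1, c1 = divmod(bmu1, cols)
        r2, c2 = divmod(bmu2, cols)
        # adjacent = within one step on the grid (8-neighbourhood)
        if max(abs(r1 - r2), abs(c1 - c2)) > 1:
            errors += 1
    return errors / len(data)

# Illustrative 4x6 map with a random codebook and random data
rng = np.random.default_rng(2)
codebook = rng.random((4 * 6, 3))
data = rng.random((200, 3))
print("topographic error: %.3f" % topographic_error(data, codebook, (4, 6)))
```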
731 A Study of Visual Attention in Diagnosing Cerebellar Tumours
Authors: Kuryati Kipli, Kasumawati Lias, Dayang Azra Awang Mat, Al-Khalid Othman, Ade Syaheda Wani Marzuki, Nurdiani Zamhari
Abstract:
Visual attention allows a user to select the most relevant information for ongoing behaviour. This paper presents a study of: i) the performance of human measurements, ii) the accuracy of human measurement of the peaks that correspond to chemical quantities in Magnetic Resonance Spectroscopy (MRS) graphs, and iii) the effects of human measurements on algorithm-based diagnosis. Participants' eye movements were recorded using an eye-tracker tool (EyeLink II). The experiment involved three participants examining 20 MRS graphs to estimate the peaks of chemical quantities which indicate the abnormalities associated with cerebellar tumours (CT). The status of each MRS graph is verified by using a decision algorithm. The analysis involves determining the participants' eye-movement patterns in measuring the peaks of spectrograms and their scan paths, and determining the relationship between the distributions of fixation durations and the accuracy of measurement. In particular, the eye-tracking data revealed which aspects of the spectrogram received more visual attention and in what order they were viewed. This preliminary investigation provides a proof of concept for the use of eye-tracking technology as the basis for expanded CT diagnosis.
Keywords: eye tracking, fixation durations, pattern, scan paths, spectrograms, visual.
730 Characterization of Ajebo Kaolinite Clay for Production of Natural Pozzolan
Authors: Gbenga M. Ayininuola, Olasunkanmi A. Adekitan
Abstract:
Calcined kaolinite clay (CKC) is a pozzolanic material that is currently drawing research attention. This work investigates the conditions for the best performance of a CKC from a kaolinite clay source in Ajebo, Abeokuta (southwest Nigeria), known for its commercial availability. Samples from this source were subjected to X-ray diffractometry (XRD) and differential scanning calorimetry (DSC). XRD shows that kaolinite is the main mineral in the clay source; this mineral is responsible for the pozzolanic behavior of CKC. DSC indicates that the transformation from the clay to CKC occurred between 550 and 750 °C. Using this temperature range, clay samples were milled and different CKC samples were produced in an electric muffle furnace at temperatures of 550, 600, 650, 700, 750 and 800 °C, each for 1 hour. This was also repeated for 2 hours. The degree of de-hydroxylation (dtg) and strength activity index (SAI) were determined for each of the CKC samples. The dtg and SAI tests were repeated two more times for each sample and averages were taken. Results showed that peak dtg occurred for the 750 °C for 1 hour calcining combination (94.27%), whereas marginal differences were recorded at some lower temperatures (90.97% for 650 °C for 2 hours; 91.05% for 700 °C for 1 hour and 92.77% for 700 °C for 2 hours). Optimum SAI was reported at 700 °C for 1 hour (99.05%). Rating SAI as a better parameter than dtg, the 700 °C for 1 hour combination was adopted as the best calcining condition. The paper recommends the adoption of this clay source for pozzolan production using the calcining conditions established in this work.
Keywords: Calcined kaolinite clay, calcination, optimum-calcining conditions, pozzolanity.
729 Spatial and Temporal Variability of Fog Over the Indo-Gangetic Plains, India
Authors: Sanjay Kumar Srivastava, Anu Rani Sharma, Kamna Sachdeva
Abstract:
The aim of the paper is to analyze the characteristics of winter fog in terms of its trend and spatial-temporal variability over the Indo-Gangetic Plains (IGP). The study reveals that during the last four and a half decades (1971-2015), an alarming increasing trend in fog frequency has been observed during the winter months of December and January over the study area. The frequency of fog has increased by 118.4% during these peak winter months. It has also been observed that, on average, the central part of the IGP has 66.29% fog days, followed by the western IGP with 41.94% fog days. Further, Empirical Orthogonal Function (EOF) decomposition and Mann-Kendall variation analysis are used to analyze the spatial and temporal patterns of winter fog. The findings have significant implications for further research on fog over the IGP and for formulating robust strategies to adapt to fog variability and mitigate its effects. The decision by the Delhi Government to implement the odd-even scheme to restrict the use of private vehicles, intended to reduce pollution and improve air quality, may further increase the already alarming trend of fog over Delhi and the surrounding regions of the IGP.
Keywords: Fog, climatology, spatial variability, temporal variability, empirical orthogonal function, visibility, Mann-Kendall test, variation point.
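A minimal sketch of the Mann-Kendall trend test mentioned above, applied to a synthetic annual fog-frequency series (the series is an illustrative assumption; the normal-approximation form without tie correction is shown):

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(series):
    """Mann-Kendall trend test (no tie correction).
    Returns the S statistic, the standardized Z score and a two-sided p-value."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = int(sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n)))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / sqrt(var_s)
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))   # two-sided p-value
    return s, z, p

# Synthetic fog-day counts with an upward trend plus noise (not the study data)
rng = np.random.default_rng(3)
years = np.arange(1971, 2016)
fog_days = 20 + 0.5 * (years - 1971) + rng.normal(0, 3, size=years.size)

s, z, p = mann_kendall(fog_days)
print("S = %d, Z = %.2f, p = %.4f" % (s, z, p))
```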
728 The Correlation between Peer Aggression and Peer Victimization: Are Aggressors Victims Too?
Authors: Glenn M. Calaguas
Abstract:
To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of reporting aggressive behaviors or of being victimized during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. Specifically, the Pearson Product-Moment Correlation Coefficient (PMCC) was used to determine the correlations between the scores of the sixth-graders on the two scales, both for individual items and total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
Keywords: correlation, peer aggression, peer victimization, sixth-graders.
727 Internet Optimization by Negotiating Traffic Times
Authors: Carlos Gonzalez
Abstract:
This paper describes a system to optimize the use of the internet by clients requiring the downloading of videos at peak hours. The system consists of a web server belonging to a provider of video content, a provider of internet communications, and a software application running on a client's computer. The client using the application software will communicate to the video provider a list of the client's future video demands. The video provider calculates which videos are going to be more in demand for download in the immediate future, and proceeds to request from the internet provider the most optimal hours to do the downloading. The download times will be sent to the application software, which will use the information on pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos will be saved in a special protected section of the user's hard disk, which will only be accessed by the application software on the client's computer. When the client is ready to watch a video, the application will search the list of videos currently stored in that area of the hard disk; if the video exists there, it will be used directly without the need for internet access. We found that the best way to optimize the download traffic of videos is by negotiation between the internet communication provider and the video content provider.
Keywords: Internet optimization, video download, future demands, secure storage.
726 A Simple Adaptive Atomic Decomposition Voice Activity Detector Implemented by Matching Pursuit
Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic
Abstract:
A simple adaptive voice activity detector (VAD) is implemented using Gabor and gammatone atomic decomposition of speech for high Gaussian noise environments. Matching pursuit is used for the atomic decomposition and is shown to achieve optimal speech detection capability at high data compression rates for low signal-to-noise ratios. The most active dictionary elements found by matching pursuit are used for the signal reconstruction, so that the algorithm adapts to the individual speaker's dominant time-frequency characteristics. Speech has a high peak-to-average ratio, enabling matching pursuit's greedy heuristic of selecting the highest inner products to isolate high energy speech components in high noise environments. Gabor and gammatone atoms are both investigated with identical logarithmically spaced center frequencies and similar bandwidths. The algorithm performs equally well for both Gabor and gammatone atoms, with no significant statistical differences. The algorithm achieves 70% accuracy at a 0 dB SNR, 90% accuracy at a 5 dB SNR and 98% accuracy at a 20 dB SNR, using 30 dB SNR as a reference for voice activity.
Keywords: Atomic Decomposition, Gabor, Gammatone, Matching Pursuit, Voice Activity Detection.
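A minimal sketch of the matching pursuit step described above: greedily picking the dictionary atom with the largest inner product, subtracting its contribution, and repeating (the random dictionary, the synthetic signal and the iteration count are illustrative assumptions, not the Gabor/gammatone dictionaries of the paper):

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=10):
    """Greedy matching pursuit. dictionary: (n_atoms, n_samples) with unit-norm
    rows. Returns coefficients, chosen atom indices and the final residual."""
    residual = signal.astype(float).copy()
    coeffs, chosen = [], []
    for _ in range(n_iter):
        inner = dictionary @ residual
        k = int(np.argmax(np.abs(inner)))        # atom with the highest inner product
        coeffs.append(inner[k])
        chosen.append(k)
        residual = residual - inner[k] * dictionary[k]
    return np.array(coeffs), chosen, residual

# Illustrative unit-norm random dictionary and a sparse synthetic "speech frame"
rng = np.random.default_rng(4)
D = rng.normal(size=(256, 128))
D /= np.linalg.norm(D, axis=1, keepdims=True)
x = 3.0 * D[10] - 2.0 * D[77] + 0.05 * rng.normal(size=128)

c, idx, r = matching_pursuit(x, D, n_iter=5)
# A crude VAD-style decision: enough energy captured by a few atoms -> "speech"
captured = 1.0 - np.linalg.norm(r) ** 2 / np.linalg.norm(x) ** 2
print("atoms chosen:", idx, "energy captured: %.2f" % captured)
```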
725 Narrowband Speech Hiding using Vector Quantization
Authors: Driss Guerchi, Fatiha Djebbar
Abstract:
In this work, we introduce an efficient method to limit the impact of the hiding process on the quality of the cover speech. Vector quantization of the speech spectral information drastically reduces the number of secret speech parameters to be embedded in the cover signal. Compared to scalar hiding, the vector quantization hiding technique provides a stego signal that is indistinguishable from the cover speech. The objective and subjective performance measures reveal that the current hiding technique attracts no suspicion about the presence of the secret message in the stego speech, while being able to recover an intelligible copy of the secret message at the receiver side.
Keywords: Speech steganography, LSF vector quantization, fast Fourier transform.
724 NDENet: End-to-End Nighttime Dehazing and Enhancement
Authors: H. Baskar, A. S. Chakravarthy, P. Garg, D. Goel, A. S. Raj, K. Kumar, Lakshya, R. Parvatham, V. Sushant, B. Kumar Rout
Abstract:
In this paper, we present a computer vision task called nighttime dehaze-enhancement. This task aims to jointly perform dehazing and lightness enhancement. Our task fundamentally differs from nighttime dehazing: our goal is to jointly dehaze and enhance scenes, while nighttime dehazing aims to dehaze scenes under a nighttime setting. In order to facilitate further research on this task, we release a benchmark dataset called the Reside-β Night dataset, consisting of 4122 nighttime hazed images from 2061 scenes and 2061 ground truth images. Moreover, we also propose a network called NDENet (Nighttime Dehaze-Enhancement Network), which jointly performs dehazing and low-light enhancement in an end-to-end manner. We evaluate our method on the proposed benchmark and achieve a Structural Similarity Index (SSIM) of 0.8962 and a Peak Signal-to-Noise Ratio (PSNR) of 26.25. We also compare our network with other baseline networks on our benchmark to demonstrate the effectiveness of our approach. We believe that nighttime dehaze-enhancement is an essential task, particularly for autonomous navigation applications, and hope that our work will open up new frontiers in research. The code for our network is made publicly available.
Keywords: Dehazing, image enhancement, nighttime, computer vision.
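A minimal sketch of how the two reported metrics are typically computed with scikit-image, here on synthetic grayscale images (the images are illustrative; this is not the paper's evaluation code):

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Illustrative "ground truth" and "restored" grayscale images in [0, 1]
rng = np.random.default_rng(5)
gt = rng.random((256, 256))
restored = np.clip(gt + rng.normal(0, 0.05, size=gt.shape), 0.0, 1.0)

psnr = peak_signal_noise_ratio(gt, restored, data_range=1.0)
ssim = structural_similarity(gt, restored, data_range=1.0)
print("PSNR: %.2f dB, SSIM: %.4f" % (psnr, ssim))
```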
723 A Comparative Case Study of the Impact of Square and Yurt-Shape Buildings on Energy Efficiency
Authors: Valeriya Tyo, Serikbolat Yessengabulov
Abstract:
Regions with extreme climate conditions, such as Astana city, require energy saving measures to increase the energy performance of buildings, which are responsible for more than 40% of total energy consumption. Identification of an optimal building geometry is one of the key factors to be considered. The architectural form of a building has an impact on space heating and cooling energy use; however, the interrelationship between the geometry and the resultant energy use is not always readily apparent. This paper presents a comparative case study of two prototypical buildings with compact building shapes to assess the impact of geometry on energy performance.
Keywords: Building geometry, energy efficiency, heat gain, heat loss.
722 Study on the Seismic Response of Slope under Pulse-Like Ground Motion
Authors: Peter Antwi Buah, Yingbin Zhang, Jianxian He, Chenlin Xiang, Delali Atsu Y. Bakah
Abstract:
Near-fault ground motions with velocity pulses are considered to cause significant damage to structures or slopes compared to ordinary ground motions without velocity pulses. A double-pulse pulse-like ground motion is well known to be stronger than a single-pulse one. This research numerically justifies this perspective by studying the dynamic response of a homogeneous rock slope subjected to four pulse-like and two non-pulse-like ground motions using the Fast Lagrangian Analysis of Continua in 3 Dimensions (FLAC3D) software. Two of the pulse-like ground motions have just a single pulse. The results show that near-fault ground motions with velocity pulses can cause a higher dynamic response than regular ground motions. The amplification of the peak ground acceleration (PGA) in the horizontal direction increases with the slope elevation. The seismic response of the slope under double-pulse ground motion is stronger than that under single-pulse ground motion. The PGV amplification factor under the non-pulse-like records is also smaller than that under the pulse-like records. The velocity pulse strengthens the earthquake damage to the slope, resulting in a stronger dynamic response.
Keywords: Velocity pulses, dynamic response, PGV magnification effect, elevation effect, double pulse.
721 Estimation of Forest Fire Emission in Thailand by Using Remote Sensing Information
Authors: A. Junpen, S. Garivait, S. Bonnet, A. Pongpullponsak
Abstract:
Forest fires in Thailand are an annual occurrence and a cause of air pollution. This study estimates the emissions from forest fires during 2005-2009 using the MODerate-resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra and Aqua satellites, experimental data, and statistical data. The forest fire emissions are estimated using the equation established by Seiler and Crutzen in 1982. The spatial and temporal variation of forest fire emissions is analyzed and displayed in the form of a grid density map. The satellite data analysis suggested that between 2005 and 2009, 86,877 fire hotspots occurred, with the significant majority (more than 80% of fire hotspots) in deciduous forest. The peak period of forest fire is January to May. The estimation of emissions from forest fires during 2005 to 2009 indicated that the amounts of CO, CO2, CH4, and N2O were about 3,133,845 tons, 47,610.337 tons, 204,905 tons, and 6,027 tons, respectively, or about 6,171,264 tons of CO2eq. The fires also emitted 256,132 tons of PM10. The year 2007 was found to be the year when the emissions were the largest. Annually, March is the period with the maximum amount of forest fire emissions. The areas with a high density of forest fire emissions were the forests situated in the northern, western, and upper northeastern parts of the country.
Keywords: Emissions, Forest fire, Remote sensing information.
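A minimal sketch of the Seiler and Crutzen (1982) approach referenced above, where the burned biomass is M = A·B·α·β and species emissions are M times an emission factor (all numbers below are illustrative placeholders, not the study's values):

```python
# Seiler & Crutzen (1982): M = A * B * alpha * beta
#   A     : burned area (ha)
#   B     : biomass density (t dry matter / ha)
#   alpha : fraction of above-ground biomass
#   beta  : burning (combustion) efficiency
# Emission of species i: E_i = M * EF_i (EF in kg per t of dry matter burned)

def burned_biomass(area_ha, biomass_t_per_ha, alpha, beta):
    return area_ha * biomass_t_per_ha * alpha * beta

def emissions(m_tonnes, emission_factors_kg_per_t):
    # returns emissions per species in tonnes
    return {sp: m_tonnes * ef / 1000.0 for sp, ef in emission_factors_kg_per_t.items()}

# Illustrative inputs only
A, B, alpha, beta = 10_000.0, 50.0, 0.8, 0.3
ef = {"CO2": 1580.0, "CO": 104.0, "CH4": 6.8, "N2O": 0.2}   # example factors, kg/t

M = burned_biomass(A, B, alpha, beta)
print("burned biomass: %.0f t" % M)
for sp, e in emissions(M, ef).items():
    print("%s: %.1f t" % (sp, e))
```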
720 Evaluation of Expected Annual Loss Probabilities of RC Moment Resisting Frames
Authors: Saemee Jun, Dong-Hyeon Shin, Tae-Sang Ahn, Hyung-Joon Kim
Abstract:
Building loss estimation methodologies, which have advanced considerably in recent decades, are usually used to estimate the social and economic impacts resulting from seismic structural damage. In accordance with these methods, this paper presents the evaluation of the annual loss probability of a reinforced concrete moment resisting frame designed according to the Korean Building Code. The annual loss probability is defined by (1) a fragility curve obtained from a capacity spectrum method similar to the method adopted in HAZUS, and (2) a seismic hazard curve derived from annual frequencies of exceedance per peak ground acceleration. Seismic fragilities are computed to calculate the annual loss probability of a given structure using functions depending on structural capacity, seismic demand, structural response and the probability of exceeding damage state thresholds. This study carried out a nonlinear static analysis to obtain the capacity of an RC moment resisting frame selected as a prototype building. The analysis results show that the annual probability of extensive structural damage in the prototype building is expected to be 0.01%.
Keywords: Expected annual loss, Loss estimation, RC structure, Fragility analysis.
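A minimal sketch of combining a lognormal fragility curve with a seismic hazard curve to obtain an annual probability of reaching a damage state, integrating P(damage | PGA) over the annual rate of exceedance (the fragility median, dispersion and hazard curve below are illustrative assumptions, not the paper's values):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

# Fragility: P(damage state | PGA) as a lognormal CDF
def fragility(pga, median=0.6, beta=0.5):
    return norm.cdf(np.log(pga / median) / beta)

# Hazard: annual frequency of exceedance of PGA (illustrative power-law curve)
def hazard_afe(pga, k0=1e-3, k=2.5):
    return k0 * (pga / 0.1) ** (-k)

# Annual rate of damage ~ integral of fragility * |d(lambda)/d(PGA)| dPGA
pga = np.linspace(0.05, 2.0, 2000)
afe = hazard_afe(pga)
d_lambda = -np.gradient(afe, pga)            # rate density (positive)
annual_rate = trapezoid(fragility(pga) * d_lambda, pga)
annual_prob = 1.0 - np.exp(-annual_rate)     # Poisson assumption

print("annual probability of damage: %.4f%%" % (100 * annual_prob))
```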
719 An Experimental and Numerical Investigation on Gas Hydrate Plug Flow in the Inclined Pipes and Bends
Authors: M. M. Shabani, O. J. Nydal, R. Larsen
Abstract:
Gas hydrates can agglomerate and block multiphase oil and gas pipelines when water is present at hydrate-forming conditions. Using "Cold Flow Technology", the aim is to condition gas hydrates so that they can be transported as a slurry mixture without a risk of agglomeration. During pipeline shutdown, however, hydrate particles may settle in bends and build hydrate plugs. An experimental setup has been designed and constructed to study the flow of such plugs at start-up operations. Experiments have been performed using a model fluid and model hydrate particles. The propagation of initial plugs in a bend was recorded with impedance probes along the pipe. The experimental results show a dispersion of the plug front. A peak in pressure drop was also recorded when the plugs were passing the bend. The evolution of the plugs has been simulated by numerical integration of the incompressible mass balance equations, with an imposed mixture velocity. The slip between particles and carrier fluid has been calculated using a drag relation together with a particle-fluid force balance.
Keywords: Cold Flow Technology, Gas Hydrate Plug Flow Experiments, One Dimensional Incompressible Two Fluid Model, Slurry Flow in Inclined Pipes and Bends, Transient Slurry Flow.
718 Quad Tree Decomposition Based Analysis of Compressed Image Data Communication for Lossy and Lossless Using WSN
Authors: N. Muthukumaran, R. Ravi
Abstract:
A quad tree decomposition (QTD) based performance analysis of lossy and lossless compressed image data communication through wireless sensor networks is presented. Images have considerably higher storage requirements than text. While transmitting multimedia content, there is a chance of packets being dropped due to noise and interference. At the receiver end, the packets that carry valuable information might be damaged or lost due to noise, interference and congestion. In order to prevent valuable information from being dropped, various retransmission schemes have been proposed. The proposed scheme uses QTD, an image segmentation method that divides the image into homogeneous areas. The proposed scheme involves analysis of parameters such as compression ratio, peak signal-to-noise ratio, mean square error and bits per pixel in the compressed image, as well as analysis of the difficulties during data packet communication in wireless sensor networks. Considering the above, this paper uses QTD to improve the compression ratio as well as the visual quality; the algorithm is implemented in the MATLAB 7.1 and NS2 simulator software tools.
Keywords: Image compression, Compression Ratio, Quad tree decomposition, Wireless sensor networks, NS2 simulator.
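A minimal sketch of the quad tree decomposition idea described above: recursively splitting an image block into four quadrants until each block is homogeneous (the homogeneity rule, i.e. max-min intensity within a threshold, and the test image are illustrative assumptions):

```python
import numpy as np

def quadtree(block, x=0, y=0, threshold=20, min_size=4, leaves=None):
    """Recursively split `block` into quadrants until the intensity range in a
    block is <= threshold or the block reaches min_size.
    Returns a list of leaf blocks as (x, y, height, width, mean_intensity)."""
    if leaves is None:
        leaves = []
    h, w = block.shape
    homogeneous = (int(block.max()) - int(block.min())) <= threshold
    if homogeneous or h <= min_size or w <= min_size:
        leaves.append((x, y, h, w, float(block.mean())))
        return leaves
    h2, w2 = h // 2, w // 2
    quadtree(block[:h2, :w2], x, y, threshold, min_size, leaves)
    quadtree(block[:h2, w2:], x, y + w2, threshold, min_size, leaves)
    quadtree(block[h2:, :w2], x + h2, y, threshold, min_size, leaves)
    quadtree(block[h2:, w2:], x + h2, y + w2, threshold, min_size, leaves)
    return leaves

# Illustrative image: flat background with a bright square (edges force splits)
img = np.zeros((64, 64), dtype=np.uint8)
img[16:40, 16:40] = 200

leaves = quadtree(img)
print("leaf blocks:", len(leaves), "vs", (64 // 4) ** 2, "fixed 4x4 blocks")
```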
717 On Improving Breast Cancer Prediction Using GRNN-CP
Authors: Kefaya Qaddoum
Abstract:
The aim of this study is to predict breast cancer and to construct a supportive model that will produce a more reliable prediction, a factor that is fundamental for public health. In this study, we utilize general regression neural networks (GRNN) and replace point predictions with prediction intervals in order to achieve a reasonable level of confidence. The mechanism employed here utilises a machine learning framework called conformal prediction (CP), which assigns consistent confidence measures to predictions and is combined with GRNN. We apply the resulting algorithm to the problem of breast cancer diagnosis. The results show that the predictions constructed by this method are reasonable and could be useful in practice.
Keywords: Neural network, conformal prediction, cancer classification, regression.
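A minimal sketch of the combination described above: a GRNN-style kernel regressor with inductive conformal prediction intervals built from calibration-set nonconformity scores (the synthetic data, kernel bandwidth and significance level are illustrative assumptions, not the study's setup):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """GRNN / Nadaraya-Watson regression: Gaussian-kernel weighted average."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

def conformal_interval(X_tr, y_tr, X_cal, y_cal, X_test, significance=0.1):
    """Inductive conformal prediction: nonconformity = |y - y_hat| on a
    calibration set; interval half-width is the (1 - significance) quantile."""
    alphas = np.abs(y_cal - grnn_predict(X_tr, y_tr, X_cal))
    k = int(np.ceil((1.0 - significance) * (len(alphas) + 1))) - 1
    q = np.sort(alphas)[min(k, len(alphas) - 1)]
    y_hat = grnn_predict(X_tr, y_tr, X_test)
    return y_hat - q, y_hat + q

# Illustrative 1-D regression data (not the breast cancer dataset)
rng = np.random.default_rng(6)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.2, size=300)
X_tr, y_tr = X[:200], y[:200]
X_cal, y_cal = X[200:], y[200:]
X_test = np.array([[0.0], [1.5]])

lo, hi = conformal_interval(X_tr, y_tr, X_cal, y_cal, X_test)
print(np.c_[lo, hi])
```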
716 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV
Authors: Manjit Singh
Abstract:
Multiscale entropy (MSE) is an extensively used index that provides a general understanding of the multiple complexities of the physiologic mechanisms of heart rate variability (HRV) operating over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; a high ECG sampling rate increases memory requirements and processing time, whereas a low sampling rate degrades signal quality and results in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to ECG sampling frequency, and the effect of sampling frequency is a function of the time scale.
Keywords: ECG, heart rate variability, HRV, multiscale entropy, sampling frequency.
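A minimal sketch of multiscale entropy as it is usually defined: coarse-grain the series at each scale, then compute the sample entropy of the coarse-grained series (the synthetic RR-interval series and the parameters m and r are illustrative assumptions, not the study's data):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r), with tolerance r given as a fraction of the
    standard deviation of the series passed in."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

def multiscale_entropy(x, max_scale=5, m=2, r=0.2):
    mse = []
    for scale in range(1, max_scale + 1):
        n = len(x) // scale
        coarse = np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)
        mse.append(sample_entropy(coarse, m, r))
    return mse

# Illustrative RR-interval-like series (seconds)
rng = np.random.default_rng(7)
rr = 0.8 + 0.05 * rng.standard_normal(1000)
print(["%.3f" % v for v in multiscale_entropy(rr, max_scale=4)])
```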
715 Assessing Drought Vulnerability of Bulgarian Agriculture through Model Simulations
Authors: Z. Popova, L. S. Pereira, М. Ivanova, P. Alexandrova, K. Doneva, V. Alexandrov, M. Kercheva
Abstract:
This study assesses the vulnerability of Bulgarian agriculture to drought using the WINISAREG model and the seasonal standardized precipitation index SPI(2) for the period 1951-2004. This model was previously validated for maize on soils of different water holding capacity (TAW) in various locations. Simulations are performed for Plovdiv, Stara Zagora and Sofia. Results for Plovdiv show that in soils of large TAW (180 mm m-1), net irrigation requirements (NIRs) range from 0-40 mm in wet years to 350-380 mm in dry years. In soils of small TAW (116 mm m-1), NIRs reach 440 mm in the very dry year. NIRs in Sofia are about 80 mm smaller. Rainfed maize is associated with great yield variability (29%
Keywords: Drought vulnerability, ISAREG simulation model, South Bulgaria, SPI-index.
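A minimal sketch of how a standardized precipitation index such as SPI(2) is commonly computed: aggregate precipitation over 2-month windows, fit a gamma distribution, and transform the cumulative probabilities to a standard normal variable (the synthetic precipitation record is an illustrative assumption; zero-precipitation handling and the month-by-month climatology are simplified):

```python
import numpy as np
from scipy import stats

def spi(precip, window=2):
    """Standardized precipitation index for a monthly precipitation series,
    aggregated over `window` months (simplified: one gamma fit for the whole
    record, crude handling of zeros)."""
    p = np.convolve(precip, np.ones(window), mode="valid")   # rolling sums
    nonzero = p[p > 0]
    shape, loc, scale = stats.gamma.fit(nonzero, floc=0)
    q_zero = np.mean(p == 0)                                  # probability of zero
    cdf = q_zero + (1 - q_zero) * stats.gamma.cdf(p, shape, loc=loc, scale=scale)
    cdf = np.clip(cdf, 1e-6, 1 - 1e-6)
    return stats.norm.ppf(cdf)                                # SPI values

# Illustrative monthly precipitation (mm), 1951-2004
rng = np.random.default_rng(8)
monthly = rng.gamma(shape=2.0, scale=25.0, size=54 * 12)

spi2 = spi(monthly, window=2)
print("driest 2-month SPI: %.2f, wettest: %.2f" % (spi2.min(), spi2.max()))
```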
714 Economic Neoliberalism: Property Right and Redistribution Policy
Authors: Aleksandar Savanović
Abstract:
In this paper, we analyze the relationship between the neo-liberal concept of property rights and redistribution policy. This issue is back in the focus of interest due to the 2008 crisis. The crisis has reaffirmed the influence of the state on free-market processes. The interference of the state with property relations reopened a classical question: is it legitimate to redistribute the resources of one man in favor of another through taxes? The dominant view is that the neoliberal philosophy of natural rights is incompatible with redistributive measures. In principle, this view can be accepted. However, when we look into the details of the theory of natural rights proposed by some coryphaei of neoliberal philosophy, such as Hayek, Nozick, Buchanan and Rothbard, we can see that the view is not so unequivocal.
Keywords: Economic neoliberalism, natural law, property, redistribution
713 Mining News Sites to Create Special Domain News Collections
Authors: David B. Bracewell, Fuji Ren, Shingo Kuroiwa
Abstract:
We present a method to create special domain collections from news sites. The method requires only a single sample article as a seed. No prior corpus statistics are needed, and the method is applicable to multiple languages. We examine various similarity measures and the creation of document collections for English and Japanese. The main contributions are as follows. First, the algorithm can build special domain collections from as little as one sample document. Second, unlike other algorithms, it does not require a second "general" corpus to compute statistics. Third, in our testing the algorithm outperformed others in creating collections made up of highly relevant articles.
Keywords: Information Retrieval, News, Special Domain Collections.
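A minimal sketch of one common similarity measure for deciding whether a candidate article belongs to the seed's domain: cosine similarity over term-frequency vectors with a cut-off threshold (the tokenizer, stopword list, toy articles and threshold are illustrative assumptions, not the paper's measures):

```python
import math
import re
from collections import Counter

STOPWORDS = {"the", "a", "as", "and", "of", "to", "after"}

def term_freqs(text):
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(t for t in tokens if t not in STOPWORDS)

def cosine(a, b):
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

seed = term_freqs("Stock markets fell as the central bank raised interest rates.")
candidates = {
    "finance": "Investors reacted as the bank signalled further interest rate hikes.",
    "sports":  "The home team won the final match after a dramatic penalty shootout.",
}

THRESHOLD = 0.15   # articles above this similarity join the collection
for name, text in candidates.items():
    sim = cosine(seed, term_freqs(text))
    print("%-7s sim=%.2f -> %s" % (name, sim, "accept" if sim >= THRESHOLD else "reject"))
```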
712 Application of Stochastic Models to Annual Extreme Streamflow Data
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best stochastic model, using time series analysis, for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. An Auto-Regressive Integrated Moving Average (ARIMA) model was used to simulate these series and to forecast them into the future. For the analysis, annual extreme streamflow data of the Jelogir Majin station (above the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d = 1) for the development of the ARIMA model. Interestingly, the ARIMA(4,1,1) model was found to be the most suitable for simulating the annual extreme streamflow of the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and can assist decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) software were used to determine the best model for this series.
Keywords: Stochastic models, ARIMA, extreme streamflow, Karkheh River.
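A minimal sketch of fitting an ARIMA(4,1,1) model and producing a ten-step forecast with statsmodels, on a synthetic annual-maximum series standing in for the streamflow record (the data are illustrative; this is not the study's dataset or code):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic annual extreme streamflow (m^3/s) for 1958-2005, with a mild trend
rng = np.random.default_rng(9)
n_years = 2005 - 1958 + 1
flow = 800 + 3.0 * np.arange(n_years) + rng.normal(0, 120, n_years)

model = ARIMA(flow, order=(4, 1, 1))   # (p, d, q) = (4, 1, 1) as in the abstract
fit = model.fit()
print(fit.params)

forecast = fit.forecast(steps=10)      # ten-year-ahead forecast
print(np.round(forecast, 1))
```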