Search results for: seismic time history analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13729

12649 A New Analytic Solution for the Heat Conduction with Time-Dependent Heat Transfer Coefficient

Authors: Te Wen Tu, Sen Yung Lee

Abstract:

An alternative approach is proposed to develop the analytic solution for one-dimensional heat conduction with one mixed-type boundary condition and a general time-dependent heat transfer coefficient. In this study, the physical meaning of the solution procedure is revealed. It is shown that the shifting function takes the physical meaning of the reciprocal of the Biot function at the initial time. Numerical results show the accuracy of this study. Compared with those given in the existing literature, the difference is less than 0.3%.
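
For orientation, a minimal dimensionless statement of this class of problem, with a time-dependent Biot function Bi(τ) entering the mixed-type boundary condition (the notation is assumed for illustration and is not taken from the paper):

```latex
% Illustrative formulation only; symbols are assumed, not the authors' notation.
\frac{\partial \theta}{\partial \tau} = \frac{\partial^{2}\theta}{\partial x^{2}},
\quad 0 < x < 1,\ \tau > 0,
\qquad
\left.\frac{\partial \theta}{\partial x}\right|_{x=1} + \mathrm{Bi}(\tau)\,\theta(1,\tau) = 0,
\qquad
\mathrm{Bi}(\tau) = \frac{h(\tau)\,L}{k}.
```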

Keywords: Analytic solution, heat transfer coefficient, shifting function method, time-dependent boundary condition.

12648 Relative Radiometric Correction of Cloudy Multitemporal Satellite Imagery

Authors: Seema Biday, Udhav Bhosle

Abstract:

Repeated observation of a given area over time yields potential for many forms of change detection analysis. These repeated observations are confounded in terms of radiometric consistency due to changes in sensor calibration over time, differences in illumination, observation angles and variation in atmospheric effects. This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS III sensor. The objective of this study is to detect and remove cloud cover and normalize an image radiometrically. Cloud detection is achieved by using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from another image of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time, i.e. the subset of non-changing pixels is identified using a frequency-based correlation technique. The quality of radiometric normalization is statistically assessed by the R2 value and the mean square error (MSE) between each pair of analogous bands.
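
To illustrate the general idea of relative radiometric normalization (a sketch under assumed inputs; the ABT cloud detection and the frequency-based selection of invariant pixels are not reproduced here): once a mask of non-changing pixels is available, each subject band can be mapped onto the reference band with a least-squares gain and offset.

```python
import numpy as np

def relative_normalization(subject_band, reference_band, invariant_mask):
    """Fit a per-band linear gain/offset on pixels assumed radiometrically stable,
    then map the subject image onto the reference image's radiometric scale."""
    x = subject_band[invariant_mask].astype(float)
    y = reference_band[invariant_mask].astype(float)
    gain, offset = np.polyfit(x, y, deg=1)      # least-squares fit y = gain * x + offset
    normalized = gain * subject_band + offset
    mse = float(np.mean((normalized[invariant_mask] - y) ** 2))
    return normalized, gain, offset, mse
```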

Keywords: Correlation, Frequency domain, Multitemporal, Relative Radiometric Correction

12647 On the Verification of Power Nap Associated with Stage 2 Sleep and Its Application

Authors: Jetsada Arnin, Yodchanan Wongsawat

Abstract:

One of the most important causes of accidents is driver fatigue. To reduce the accident rate, the driver needs a quick nap when feeling sleepy. Hence, searching for the minimum nap duration is a very challenging problem. The purpose of this paper is twofold, i.e. to investigate the possible shortest nap duration and its relationship with stage 2 sleep, and to develop an automatic stage 2 sleep detection and alarm device. The experiment for this investigation was designed with 21 subjects. It yields the result that waking up the subjects after they have been in stage 2 sleep for 3-5 minutes can efficiently reduce sleepiness. Furthermore, the automatic stage 2 sleep detection and alarm device yields a real-time detection accuracy of approximately 85%, which is comparable with a commercial sleep lab system.

Keywords: Stage 2 sleep, nap, sleep detection, real-time, EEG

12646 Development of EPID-based Real-time Dose Verification for Dynamic IMRT

Authors: Todsaporn Fuangrod, Daryl J. O'Connor, Boyd MC McCurdy, Peter B. Greer

Abstract:

An electronic portal imaging device (EPID) has become a method of patient-specific IMRT dose verification for radiotherapy. Research studies have focused on pre- and post-treatment verification; however, there are currently no interventional procedures using EPID dosimetry that measure the dose in real time as a mechanism to ensure that overdoses do not occur and that underdoses are detected as soon as practically possible. As a result, an EPID-based real-time dose verification system for dynamic IMRT was developed and implemented in MATLAB/Simulink. The EPID image acquisition was set to continuous acquisition mode at 1.4 images per second. The system defined a time constraint, or execution gap, equal to the image acquisition interval, so that every calculation must be completed before the next image capture. In addition, the γ-evaluation method was used for dose comparison, with two types of comparison process monitored: individual-image and cumulative-dose comparison. The outputs of the system are the γ-map, the percentage of points with γ < 1, and the mean γ versus time, all in real time. Two strategies were used to test the system: an error detection test and a clinical data test. The system can monitor the actual dose delivery against the treatment plan data or a previous treatment delivery, which means a radiation therapist is able to switch off the machine when an error is detected.
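
As an illustration of the γ-evaluation used for the dose comparison, the sketch below computes a brute-force global 2D gamma map; the 3%/3 mm tolerances, grid spacing and dose arrays are assumed for the example and are not the system's actual parameters.

```python
import numpy as np

def gamma_map(ref_dose, eval_dose, spacing_mm=1.0, dose_tol=0.03, dist_tol_mm=3.0):
    """Global gamma index: for each reference point, the minimum over all evaluated
    points of sqrt((dose difference / dose criterion)^2 + (distance / DTA criterion)^2)."""
    ny, nx = ref_dose.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    dose_crit = dose_tol * ref_dose.max()            # e.g. 3% of the maximum planned dose
    gamma = np.empty_like(ref_dose, dtype=float)
    for i in range(ny):
        for j in range(nx):
            dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
            ddose2 = (eval_dose - ref_dose[i, j]) ** 2
            gamma[i, j] = np.sqrt(dist2 / dist_tol_mm ** 2 + ddose2 / dose_crit ** 2).min()
    return gamma

# Real-time style summary per acquired EPID frame: pass rate (gamma < 1) and mean gamma.
# g = gamma_map(planned, measured); pass_rate = (g < 1).mean(); mean_gamma = g.mean()
```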

Keywords: real-time dose verification, EPID dosimetry, simulation, dynamic IMRT

12645 Effect of Atmospheric Turbulence on Acquisition Time of Ground to Deep Space Optical Communication System

Authors: Hemani Kaushal, V. K. Jain, Subrat Kar

Abstract:

The performance of ground to deep space optical communication systems is degraded by distortion of the beam as it propagates through the turbulent atmosphere. Turbulence causes fluctuations in the intensity of the received signal, which ultimately affects the acquisition time required to acquire and locate the spaceborne target using a narrow laser beam. In this paper, the performance of a free-space optical (FSO) communication system in atmospheric turbulence has been analyzed in terms of acquisition time for coherent and non-coherent modulation schemes. Numerical results presented in graphical and tabular forms show that the acquisition time increases with the increase in turbulence level for both types of schemes. BPSK has the lowest acquisition time among all schemes. Among the non-coherent schemes, M-PPM performs better than the others. With the increase in M, the acquisition time becomes lower, but at the cost of increased system complexity.

Keywords: Atmospheric Turbulence, Acquisition Time, Binary Phase Shift Keying (BPSK), Free-Space Optical (FSO) Communication System, M-ary Pulse Position Modulation (M-PPM), Coherent/Non-coherent Modulation Schemes.

12644 Applicability of Linearized Model of Synchronous Generator for Power System Stability Analysis

Authors: J. Ritonja, B. Grcar

Abstract:

For synchronous generator simulation and analysis and for power system stabilizer design and synthesis, a mathematical model of the synchronous generator is needed. The model has to accurately describe the dynamics of oscillations, while at the same time being transparent enough for analysis and sufficiently simplified for control system design. To study the oscillations of the synchronous generator against the rest of the power system, a model of the synchronous machine connected to an infinite bus through a transmission line having resistance and inductance is needed. In this paper, the linearized reduced-order dynamic model of the synchronous generator connected to the infinite bus is presented and analysed in detail. This model accurately describes the dynamics of the synchronous generator only in a small vicinity of an equilibrium state. With digression from the selected equilibrium point, the accuracy of this model decreases considerably. In this paper, the equations and the parameter determination for the linearized reduced-order mathematical model of the synchronous generator are explained and summarized, and represent a useful starting point for work in the areas of synchronous generator dynamic behaviour analysis and synchronous generator control system design and synthesis. The main contribution of this paper is the detailed analysis of the accuracy of the linearized reduced-order dynamic model in the entire operating range of the synchronous generator. The borders of the areas where the linearized reduced-order mathematical model represents an accurate description of the synchronous generator dynamics are determined with a systematic numerical analysis. A thorough eigenvalue analysis of the linearized models in the entire operating range is performed. In the paper, the parameters of the linearized reduced-order dynamic model of a laboratory salient-pole synchronous generator were determined and used for the analysis. The theoretical conclusions were confirmed by the agreement of experimental and simulation results.
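
As a point of reference for the kind of linearized reduced-order description involved, the classical small-signal swing equations of a single machine on an infinite bus are shown below (standard textbook form with assumed symbols, not the specific model or parameters of this paper):

```latex
% Linearized rotor dynamics about an equilibrium point (delta: rotor angle, omega: speed deviation,
% H: inertia constant, K_s: synchronizing torque coefficient, D: damping coefficient).
\Delta\dot{\delta} = \omega_{0}\,\Delta\omega,
\qquad
2H\,\Delta\dot{\omega} = \Delta T_{m} - K_{s}\,\Delta\delta - D\,\Delta\omega .
```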

Keywords: Eigenvalue analysis, mathematical model, power system stability, synchronous generator.

12643 Estimation of Asphalt Pavement Surfaces Using Image Analysis Technique

Authors: Mohammad A. Khasawneh

Abstract:

Asphalt concrete pavements gradually lose their skid resistance, causing safety problems especially under wet conditions and high driving speeds. In order to replicate the actual field polishing and wearing process of asphalt pavement surfaces in a laboratory setting, several laboratory-scale accelerated polishing devices have been developed by different agencies. To mimic the actual process, friction and texture measuring devices are needed to quantify surface deterioration at different polishing intervals that reflect different stages of the pavement life. The test could still be considered lengthy and, to some extent, labor-intensive. Therefore, there is a need for another method that can assist in investigating bituminous pavement surface characteristics in a practical and time-efficient test procedure.

The purpose of this paper is to utilize a well-developed image analysis technique to characterize asphalt pavement surfaces without the need to use conventional friction and texture measuring devices in an attempt to shorten and simplify the polishing procedure in the lab.

Promising findings showed the possibility of using image analysis in lieu of the labor-intensive and inherently variable friction and texture measurements. It was found that the exposed aggregate surface area of asphalt specimens made from limestone and gravel aggregates provided solid evidence of the validity of this method in describing asphalt pavement surfaces. Image analysis results correlated well with the British Pendulum Number (BPN), Polish Value (PV) and Mean Texture Depth (MTD) values.
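
A minimal sketch of how an exposed-aggregate area fraction could be estimated from a surface photo by simple thresholding (the study's actual image analysis technique, calibration and imaging setup are not described here, so this is only an assumed illustration):

```python
from skimage import io, color, filters

def exposed_aggregate_fraction(image_path):
    """Estimate the fraction of the surface occupied by exposed aggregate using
    Otsu's threshold on a grayscale pavement image (illustrative only)."""
    img = io.imread(image_path)
    gray = color.rgb2gray(img[..., :3]) if img.ndim == 3 else img.astype(float)
    thresh = filters.threshold_otsu(gray)
    aggregate_mask = gray > thresh        # assumes aggregate appears brighter than the binder
    return float(aggregate_mask.mean())   # fraction of pixels classified as exposed aggregate
```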

Keywords: Friction, Image Analysis, Polishing, Statistical Analysis, Texture.

12642 Contingency Screening Using Risk Factor Considering Transmission Line Outage

Authors: M. Marsadek, A. Mohamed

Abstract:

Power system security analysis is the most time-demanding process due to the large number of possible contingencies that need to be analyzed. In a power system, any contingency resulting in a security violation such as line overload or low voltage may occur for a number of reasons at any time. To efficiently rank a contingency, both the probability and the extent of the security violation must be considered so as not to underestimate the risk associated with the contingency. This paper proposes a contingency ranking method that takes into account the probabilistic nature of the power system and the severity of the contingency by using a newly developed method based on a risk factor. The proposed technique is implemented on the IEEE 24-bus system.
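
A toy sketch of the underlying idea of risk-based ranking, where each contingency is scored as probability times severity (the names, probabilities and severity values below are placeholders, not results from the IEEE 24-bus study):

```python
# Risk-based contingency ranking: risk = outage probability x severity of the violation.
def rank_contingencies(contingencies):
    """contingencies: iterable of dicts with 'name', 'probability' and 'severity'
    (severity could aggregate per-unit line overloads and low-voltage violations)."""
    scored = [(c["name"], c["probability"] * c["severity"]) for c in contingencies]
    return sorted(scored, key=lambda item: item[1], reverse=True)

print(rank_contingencies([
    {"name": "line 3-9 outage", "probability": 0.02, "severity": 4.1},
    {"name": "line 1-5 outage", "probability": 0.08, "severity": 0.7},
]))
```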

Keywords: Line overload, low voltage, probability, risk factor, severity.

12641 Neural Network Evaluation of FRP Strengthened RC Buildings Subjected to Near-Fault Ground Motions having Fling Step

Authors: Alireza Mortezaei, Kimia Mortezaei

Abstract:

Recordings from recent earthquakes have provided evidence that ground motions in the near field of a rupturing fault differ from ordinary ground motions, as they can contain a large energy, or "directivity", pulse. This pulse can cause considerable damage during an earthquake, especially to structures with natural periods close to those of the pulse. Failures of modern engineered structures observed within the near-fault region in recent earthquakes have revealed the vulnerability of existing RC buildings against pulse-type ground motions. This may be due to the fact that these modern structures had been designed primarily using the design spectra of available standards, which have been developed using stochastic processes with relatively long duration that characterize more distant ground motions. Many recently designed and constructed buildings may therefore require strengthening in order to perform well when subjected to near-fault ground motions. Fiber-reinforced polymers are considered to be a viable alternative, due to their relatively easy and quick installation, low life cycle costs and zero maintenance requirements. The objective of this paper is to investigate the adequacy of Artificial Neural Networks (ANN) to determine the three-dimensional dynamic response of FRP-strengthened RC buildings under near-fault ground motions. For this purpose, one ANN model is proposed to estimate the base shear force, base bending moments and roof displacement of buildings in two directions. A training set of 168 buildings and a validation set of 21 buildings are produced from finite element analysis results of the dynamic response of RC buildings under near-fault earthquakes. It is demonstrated that the neural network based approach is highly successful in determining the response.

Keywords: Seismic evaluation, FRP, neural network, near-fault ground motion

12640 Case Study of Bus Tourist's Sightseeing Time in a New Sightseeing Spot

Authors: Takayuki Nanashima, Yoshiyuki Higuchi, Masao Ohta, Takashi Kuroda

Abstract:

As a result of traffic congestion caused by sightseeing and shuttle buses using a park-and-ride parking lot near a sightseeing spot, the waiting time for tourists increases. In this paper, for the situation where the bus parking lot near the sightseeing spot is overcrowded and full, a model in which tourists get off the bus on a congested road and transfer to the sightseeing spot on foot is proposed and verified. A model of getting off the bus on a congested road when the sightseeing parking lot is overcrowded was considered through case analysis. As a result, the effectiveness of the model could be quantitatively verified for times when parking capacity is exceeded and the bus parking lot next to the sightseeing spot is overcrowded.

Keywords: Transportation demand management, Park-and-ride, Traffic congestion, Tourist satisfaction.

12639 Automated Segmentation of ECG Signals using Piecewise Derivative Dynamic Time Warping

Authors: Ali Zifan, Mohammad Hassan Moradi, Sohrab Saberi, Farzad Towhidkhah

Abstract:

Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on Adaptive Piecewise Constant Approximation (APCA) and Piecewise Derivative Dynamic Time Warping (PDDTW). The results are tested on the QT database. We compared our results to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
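
For readers unfamiliar with derivative dynamic time warping, the sketch below aligns two 1D sequences on their estimated derivatives with the classic DTW recursion (an assumed illustration of the general PDDTW ingredient; the APCA step and the authors' exact formulation are omitted):

```python
import numpy as np

def derivative(x):
    """Keogh-Pazzani style derivative estimate used by derivative DTW."""
    x = np.asarray(x, dtype=float)
    d = np.zeros_like(x)
    d[1:-1] = ((x[1:-1] - x[:-2]) + (x[2:] - x[:-2]) / 2.0) / 2.0
    d[0], d[-1] = d[1], d[-2]
    return d

def ddtw_distance(a, b):
    """Dynamic time warping on derivative estimates (O(len(a)*len(b)) dynamic programming)."""
    da, db = derivative(a), derivative(b)
    n, m = len(da), len(db)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = abs(da[i - 1] - db[j - 1])
            cost[i, j] = c + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]
```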

Keywords: Adaptive Piecewise Constant Approximation, Dynamic programming, ECG segmentation, Piecewise Derivative Dynamic Time Warping.

12638 A Combination of Similarity Ranking and Time for Social Research Paper Searching

Authors: P. Jomsri

Abstract:

Nowadays, social media are important tools for web resource discovery. The performance and capabilities of web searches are vital, especially for search results from social research paper bookmarking. This paper proposes a new ranking method, CSTRank, that combines similarity ranking with the paper posted time. The paper posted time serves as a static ranking for improving search results. In this particular study, the paper posted time is combined with similarity ranking to produce a better ranking than other methods such as similarity ranking alone (SimRank). The retrieval performance of the combination rankings is evaluated using mean NDCG values. The evaluation in the experiments implies that CSTRank with a weight-score ratio of 90:10 can improve the efficiency of research paper searching on social bookmarking websites.
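
A toy sketch of such a combined score at the 90:10 ratio mentioned above (the recency normalization and time scale are assumed for illustration; the paper's exact scoring formula is not reproduced):

```python
from datetime import datetime, timezone

def cst_score(similarity, posted_at, now=None, w_sim=0.9, scale_days=365.0):
    """Combine content similarity (0..1) with a recency score derived from the posted time."""
    now = now or datetime.now(timezone.utc)
    age_days = (now - posted_at).total_seconds() / 86400.0
    recency = 1.0 / (1.0 + age_days / scale_days)   # newer papers score closer to 1
    return w_sim * similarity + (1.0 - w_sim) * recency

print(cst_score(0.72, datetime(2023, 5, 1, tzinfo=timezone.utc)))
```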

Keywords: combination ranking, information retrieval, time, similarity ranking, static ranking, weight score

12637 A Robust Method for Hand Tracking Using Mean-shift Algorithm and Kalman Filter in Stereo Color Image Sequences

Authors: Mahmoud Elmezain, Ayoub Al-Hamadi, Robert Niese, Bernd Michaelis

Abstract:

Real-time hand tracking is a challenging task in many computer vision applications such as gesture recognition. This paper proposes a robust method for hand tracking in a complex environment using mean-shift analysis and a Kalman filter in conjunction with a 3D depth map. The depth information solves the overlapping problem between the hands and the face; it is obtained by passive stereo measurement based on cross-correlation and the known calibration data of the cameras. Mean-shift analysis uses the gradient of the Bhattacharyya coefficient as a similarity function to derive the hand candidate that is most similar to a given hand target model. A Kalman filter is then used to estimate the position of the hand target. The results of hand tracking, tested on various video sequences, are robust to changes in shape as well as partial occlusion.

Keywords: Computer Vision and Image Analysis, Object Tracking, Gesture Recognition.

12636 Determining the Best Fitting Distributions for Minimum Flows of Streams in Gediz Basin

Authors: Naci Büyükkaracığan

Abstract:

Today, the need for water sources is swiftly increasing due to population growth. At the same time, it is known that some regions will face water shortage and drought because of global warming and climate change. In this context, the evaluation and analysis of hydrological data, such as observed trends and drought and flood prediction from short-term flows, is of great importance. The selection of the most accurate probability distribution is important for describing the low-flow statistics used in drought analysis studies. As in many basins in Turkey, the Gediz River basin will be considerably affected by drought, which will decrease the amount of usable water. The aim of this study is to derive appropriate probability distributions for frequency analysis of annual minimum flows at 6 gauging stations of the Gediz Basin. After applying 10 different probability distributions, six different parameter estimation methods and 3 goodness-of-fit tests, the Pearson type 3 and generalized extreme value distributions were found to give the best results.
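
As a small illustration of fitting candidate low-flow distributions and screening them with a goodness-of-fit test (the flow values are placeholders and only two of the ten candidate distributions are shown; this is not the study's full 10-distribution, 6-estimator, 3-test protocol):

```python
import numpy as np
from scipy import stats

# Placeholder annual minimum flows (m^3/s) for one gauging station.
flows = np.array([1.2, 0.8, 1.5, 0.9, 1.1, 0.7, 1.3, 1.0])

for name, dist in [("Pearson type 3", stats.pearson3), ("GEV", stats.genextreme)]:
    params = dist.fit(flows)                              # maximum-likelihood estimates
    ks_stat, p_value = stats.kstest(flows, dist.cdf, args=params)
    print(f"{name}: KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
```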

Keywords: Gediz Basin, goodness-of-fit tests, Minimum flows, probability distribution.

12635 Web Based Real Time Laboratory Applications of Analog and Digital Communication Courses with LabVIEW Access

Authors: Ayse Yayla, Aynur Akar

Abstract:

Developments in science and technology lead to the use of new methods and techniques in education, as is the case in all fields. In particular, the internet contributes a variety of new methods for designing virtual and real-time laboratory applications in education. In this study, a real-time virtual laboratory is designed and implemented for analog and digital communications laboratory experiments by using the LabVIEW program for the Marmara University Electronics-Communication Department. In this application, students can access the virtual laboratory web site and perform their experiments without any limitation of time and location, so that they can observe the signals by changing the parameters of the experiment and evaluate the results.

Keywords: Virtual laboratory, LabVIEW, Modulation Techniques

12634 An ICA Algorithm for Separation of Convolutive Mixture of Speech Signals

Authors: Rajkishore Prasad, Hiroshi Saruwatari, Kiyohiro Shikano

Abstract:

This paper describes an Independent Component Analysis (ICA) based fixed-point algorithm for the blind separation of a convolutive mixture of speech, picked up by a linear microphone array. The proposed algorithm extracts independent sources by non-Gaussianizing the Time-Frequency Series of Speech (TFSS) in a deflationary way. The degree of non-Gaussianization is measured by negentropy. The relative performances of the algorithm under random initialization and Null Beamformer (NBF) based initialization are studied. It has been found that an NBF-based initial value gives speedy convergence as well as better separation performance.

Keywords: Blind signal separation, independent component analysis, negentropy, convolutive mixture.

12633 Role of Membership Functions in Fuzzy Logic for Prediction of Shoot Length of Mustard Plant Based on Residual Analysis

Authors: Satyendra Nath Mandal, J. Pal Choudhury, Dilip De, S. R. Bhadra Chaudhuri

Abstract:

The selection of a particular type of mustard plant for plantation depends on its productivity (pod yield) at the stage of maturity. The growth of the mustard plant depends on some parameters of the plant, namely shoot length, number of leaves, number of roots and root length. As the plant grows, some leaves may fall and new leaves may appear, so the number of leaves does not give a reliable relationship with the seed weight at the mature stage of the plant. It is not possible to measure the number of roots and the root length of the mustard plant at the growing stage without harming the plant, as the roots go deeper and deeper into the soil. Only the shoot length, which increases over time, can be measured at different time instances. Weather parameters such as maximum and minimum humidity, rainfall, and maximum and minimum temperature may affect the growth of the plant. The parameters of pollution, water, soil, distance and crop management may be dominant factors in the growth of the plant and its productivity. Considering all parameters, the growth of the plant is very uncertain, so a fuzzy environment can be considered for the prediction of shoot length at maturity of the plant. Fuzzification of the data is based on certain membership functions. Here an effort has been made to fuzzify the original data based on the Gaussian, triangular, S-, trapezoidal and L-functions. After that, all fuzzified data are defuzzified to get the normal form. Finally, the error analysis (calculation of forecasting error and average error) indicates which membership function is appropriate for fuzzification of the data and is used to predict the shoot length at maturity. The result is also verified using residual analysis (absolute residual, maximum of absolute residual, mean absolute residual, mean of mean absolute residual, median of absolute residual and standard deviation).
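
For reference, simple definitions of the Gaussian, triangular and trapezoidal membership functions mentioned above (the parameter values are illustrative and not tied to the shoot-length data of the study):

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    return np.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def triangular_mf(x, a, b, c):
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def trapezoidal_mf(x, a, b, c, d):
    rising, flat, falling = (x - a) / (b - a), np.ones_like(x, dtype=float), (d - x) / (d - c)
    return np.maximum(np.minimum(np.minimum(rising, flat), falling), 0.0)

x = np.linspace(0.0, 10.0, 5)
print(gaussian_mf(x, c=5.0, sigma=1.5))
print(triangular_mf(x, a=2.0, b=5.0, c=8.0))
print(trapezoidal_mf(x, a=1.0, b=3.0, c=7.0, d=9.0))
```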

Keywords: Fuzzification, defuzzification, Gaussian function, triangular function, trapezoidal function, S-function, membership function, residual analysis.

12632 Comparison of Artificial Neural Network Architectures in the Task of Tourism Time Series Forecast

Authors: João Paulo Teixeira, Paula Odete Fernandes

Abstract:

The authors have been developing several models based on artificial neural networks, linear regression models, Box-Jenkins methodology and ARIMA models to predict the tourism time series. The time series consists of the "Monthly Number of Guest Nights in the Hotels" of one region. Several comparisons between the different types of models have been carried out, as well as of the features used at the input of the models. The Artificial Neural Network (ANN) models have consistently performed among the best models. Usually the feed-forward architecture was used due to its wide application and good results. In this paper, the authors compare different ANN architectures using the same input. Therefore, the traditional feed-forward architecture, the cascade-forward architecture, a recurrent Elman architecture and a radial basis architecture are discussed and compared based on the task of predicting the mentioned time series.

Keywords: Artificial Neural Network Architectures, time series forecast, tourism.

12631 Resveratrol Incorporated Liposomes Prepared from Pegylated Phospholipids and Cholesterol

Authors: Mont Kumpugdee-Vollrath, Khaled Abdallah

Abstract:

Liposomes and pegylated liposomes have been widely used as drug delivery systems in the pharmaceutical field for a long time. Previously, however, polyethylene glycol (PEG) was attached to the phospholipid after the liposomes had already been prepared. In this paper, we study the possibility of using phospholipids that are already connected with PEG to prepare liposomes. The model drug resveratrol was used because it can be applied against different diseases. Cholesterol was applied to stabilize the membrane of the liposomes. The thin-film technique at laboratory scale was used as the preparation method. The liposomes were then characterized by nanoparticle tracking analysis (NTA), photon correlation spectroscopy (PCS) and light microscopy. Stable liposomes could be produced, and the particle sizes after filtration were in the nanometer range. The 2- and 3-chain-PEG-phospholipid (PL) resulted in smaller particle sizes than the 4-chain-PEG-PL. Liposomes from PL 90G and cholesterol were stable during storage at 8 °C for 56 days, because the particle sizes measured by PCS were almost unchanged. There was almost no leakage of resveratrol from the PL 90G liposomes with cholesterol after a diffusion test in a dialysis tube for 28 days. All liposomes showed sustained release over the measuring time of 270 min. The maximum release of 16-20% was detected with liposomes from the 2- and 3-chain-PEG-PL. The other liposomes gave a maximum resveratrol release of only 10%. The release kinetics can be described by the Korsmeyer-Peppas equation.
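
For reference, the Korsmeyer-Peppas power-law model cited above, in its standard form (k and n are fitted constants; values for this study are not given here):

```latex
\frac{M_t}{M_\infty} = k\,t^{\,n}
```

where M_t/M_∞ is the fraction of drug released at time t, k is a kinetic constant, and n is the release exponent characterizing the transport mechanism.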

Keywords: Liposome, NTA, resveratrol, pegylation, cholesterol.

12630 Stability Verification for Bilateral Teleoperation System with Variable Time Delay

Authors: M. Sallam, A. Ramadan, M. Fanni, M. Abdellatif

Abstract:

Time delay in a bilateral teleoperation system is a sufficient reason to make the system unstable or to significantly degrade its performance. In this paper, simulation and experimental results of implementing a P-like control scheme, under different ranges of variable time delay, are presented to verify a criterion that guarantees system stability and position tracking. The system consists of two Phantom Premium 1.5A devices. One of them acts as a master and the other acts as a slave. The study includes deriving the Phantom kinematic and dynamic model, establishing the link between the two Phantoms over Simulink in MATLAB, and verifying the stability criterion with simulations and real experiments.

Keywords: bilateral teleoperation, Phantom premium 1.5, varying time delay

12629 Searching for Forensic Evidence in a Compromised Virtual Web Server against SQL Injection Attacks and PHP Web Shell

Authors: Gigih Supriyatno

Abstract:

SQL injection is one of the most common types of attacks and has a very critical impact on web servers. In the worst case, an attacker can perform post-exploitation after a successful SQL injection attack. In web server forensics, the analysis is closely related to log file analysis, but sometimes large file sizes and different log types make it difficult for investigators to look for traces of attackers on the server. The purpose of this paper is to help investigators take appropriate steps when a web server is attacked. We use attack scenarios based on SQL injection attacks, including PHP backdoor injection as post-exploitation. We perform post-mortem analysis of web server logs based on Hypertext Transfer Protocol (HTTP) POST and HTTP GET method approaches that are characteristic of SQL injection attacks. In addition, we propose a structured analysis method that relates the web server application log file, the database application log, and other additional logs that exist on the web server. This method gives the investigator a more structured way to analyze the log files so as to produce evidence of the attack within an acceptable time. There is also the possibility that other attack techniques can be detected with this method. On the other side, it can help web administrators prepare their systems for forensic readiness.
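
A minimal sketch of the kind of log screening this implies: scanning an access log for request strings characteristic of SQL injection or a PHP web shell (the patterns and the log path are assumed examples, not the paper's method or scenario data):

```python
import re

# Indicative request patterns (raw or URL-encoded) often seen in SQL injection and web-shell use.
PATTERNS = [
    r"(?i)union(\s|%20|\+)+select",     # UNION-based SQL injection
    r"(?i)or(\s|%20|\+)+1=1",           # tautology-based injection
    r"(?i)information_schema",          # schema enumeration
    r"(?i)\.php\?cmd=",                 # simple PHP backdoor command parameter
]

def scan_access_log(path):
    hits = []
    with open(path, errors="replace") as log:
        for lineno, line in enumerate(log, 1):
            if any(re.search(pattern, line) for pattern in PATTERNS):
                hits.append((lineno, line.strip()))
    return hits

for lineno, entry in scan_access_log("access.log"):
    print(lineno, entry)
```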

Keywords: Web forensic, SQL injection, web shell, investigation.

12628 Simulation Modeling for Analysis and Evaluation of the Internal Handling Fleet System at Shahid Rajaee Container Port

Authors: Parham Azimi, Mohammad Reza Ghanbari

Abstract:

The dramatic increase in sea-freight container transportation and the growing trend of using containers in multimodal handling systems by sea, rail and road cause general managers of container terminals to face challenges such as increasing demand, competition, new investments and expansion into new activities, and the need to use new methods to achieve effective operations both along the quayside and within the yard. Among these issues, minimizing the turnaround time of vessels is considered to be the first aim of every container port system. Regarding the complex structure of container ports, this paper presents a simulation model that calculates the number of trucks needed at the Iranian Shahid Rajaee Container Port for handling containers between the berth and the yard. In this research, important criteria such as vessel turnaround time, gantry crane utilization and truck utilization have been considered. By analyzing the results of the model, it has been shown that increasing the number of trucks to 66 units has a significant effect on the performance indices of the port and can increase the loading and unloading capacity by up to 10.8%.

Keywords: Container Terminal, Gantry Crane Utilization, Simulation, Vessel Turnaround Time

12627 Evolving Knowledge Extraction from Online Resources

Authors: Zhibo Xiao, Tharini Nayanika de Silva, Kezhi Mao

Abstract:

In this paper, we present an evolving knowledge extraction system named AKEOS (Automatic Knowledge Extraction from Online Sources). AKEOS consists of two modules: a one-time learning module and an evolving learning module. The one-time learning module takes in a user input query and automatically harvests knowledge from online unstructured resources in an unsupervised way. The output of the one-time learning is a structured vector representing the harvested knowledge. The evolving learning module automatically schedules and performs repeated one-time learning to extract the newest information and track the development of an event. In addition, the evolving learning module summarizes the knowledge learned at different time points to produce a final knowledge vector about the event. With the evolving learning, we are able to visualize the key information of the event, discover trends, and track the development of an event.

Keywords: Evolving learning, knowledge extraction, knowledge graph, text mining.

12626 Improving Taint Analysis of Android Applications Using Finite State Machines

Authors: Assad Maalouf, Lunjin Lu, James Lynott

Abstract:

We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, where all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always leads to a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid.

Keywords: Android, static analysis, string analysis, taint analysis.

12625 Automated ECG Segmentation Using Piecewise Derivative Dynamic Time Warping

Authors: Ali Zifan, Sohrab Saberi, Mohammad Hassan Moradi, Farzad Towhidkhah

Abstract:

Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on Adaptive Piecewise Constant Approximation (APCA) and Piecewise Derivative Dynamic Time Warping (PDDTW). The results are tested on the QT database. We compared our results to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.

Keywords: Adaptive Piecewise Constant Approximation, Dynamic programming, ECG segmentation, Piecewise Derivative Dynamic Time Warping.

12624 Morphological Description of Cervical Cell Images for the Pathological Recognition

Authors: N. Lassouaoui, L. Hamami, N. Nouali

Abstract:

Screening allows the detection of tumor lesions of cervical cancer; it is particularly complex and time-consuming because it consists of seeking a few abnormal cells among a cluster of normal cells. In this paper, we present our proposed computer system for helping doctors in cervical cancer screening. Knowing that the diagnosis of malignancy is based on a set of atypical morphological details of the cells, we present an unsupervised genetic algorithm for the separation of cell components, since the diagnosis is done by analysis of the nucleus and the cytoplasm. We also give the various algorithms used for computing the morphological characteristics of cells (nucleus/cytoplasm ratio, cellular deformity, ...) necessary for the recognition of the illness.

Keywords: Cervical cell, morphological analysis, recognition, segmentation.

12623 Heuristics Analysis for Distributed Scheduling using MONARC Simulation Tool

Authors: Florin Pop

Abstract:

Simulation is a very powerful method for high-performance and high-quality design in distributed systems, and may now be the only one, considering the heterogeneity, complexity and cost of distributed systems. In Grid environments, for example, it is hard and even impossible to perform scheduler performance evaluation in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time consuming. Scalability, reliability and fault-tolerance become important requirements for distributed systems in order to support distributed computation. A distributed system with such characteristics is called dependable. Large environments, like Cloud, offer unique advantages, such as low cost and dependability, and satisfy QoS for all users. Resource management in large environments addresses performant scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed in order to satisfy user constraints while considering, at the same time, the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation. The simulator is MONARC, a powerful tool for large-scale distributed systems simulation. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results could be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various actual situations.

Keywords: Scheduling, Simulation, Performance Evaluation, QoS, Distributed Systems, MONARC

12622 Estimation of Time-Varying Linear Regression with Unknown Time-Volatility via Continuous Generalization of the Akaike Information Criterion

Authors: Elena Ezhova, Vadim Mottl, Olga Krasotkina

Abstract:

The problem of estimating time-varying regression is inevitably concerned with the necessity to choose the appropriate level of model volatility - ranging from the full stationarity of instant regression models to their absolute independence of each other. In the stationary case the number of regression coefficients to be estimated equals that of regressors, whereas the absence of any smoothness assumptions augments the dimension of the unknown vector by the factor of the time-series length. The Akaike Information Criterion is a commonly adopted means of adjusting a model to the given data set within a succession of nested parametric model classes, but its crucial restriction is that the classes are rigidly defined by the growing integer-valued dimension of the unknown vector. To make the Kullback information maximization principle underlying the classical AIC applicable to the problem of time-varying regression estimation, we extend it onto a wider class of data models in which the dimension of the parameter is fixed, but the freedom of its values is softly constrained by a family of continuously nested a priori probability distributions.
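
For context, a generic time-varying regression model and the classical AIC that the paper generalizes (standard textbook notation, assumed rather than quoted from the paper):

```latex
y_t = \mathbf{x}_t^{\top}\boldsymbol{\beta}_t + \varepsilon_t, \quad t = 1,\dots,N,
\qquad
\mathrm{AIC} = 2k - 2\ln\hat{L},
```

where β_t are the time-varying regression coefficients, k is the number of freely estimated parameters, and L̂ is the maximized likelihood; instead of varying the integer k, the paper softly constrains the coefficient values by a family of continuously nested a priori distributions.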

Keywords: Time varying regression, time-volatility of regression coefficients, Akaike Information Criterion (AIC), Kullback information maximization principle.

12621 Development of Maintenance Schedule and Root Cause Analysis Based on Computerized Maintenance Management System for a Fertilizer Plant

Authors: Sanjeev Kumar

Abstract:

This paper deals with the development of a Computerized Maintenance Management System (CMMS) for a fertilizer plant. The software is advanced, easy to use, less complex, less expensive and also less time consuming. It consists of a number of modules, such as detailed information on equipment, maintenance procedures, work orders and employee details. The objectives of the CMMS are to reduce overall downtime, overall yearly maintenance cost and the occurrence of equipment failures, and to obtain a day-by-day maintenance plan and strategy. In this regard, the behavioral chart for the urea prilling unit at the fertilizer plant has been developed in the form of a Root Cause Analysis (RCA). Besides this, a maintenance program has also been proposed and used for maintenance planning of the urea prilling unit. The output of the software has been reviewed with the concerned plant personnel and found to be extremely favorable for improving the performance level of the plant.

Keywords: Computerized maintenance management system, root cause analysis, maintenance schedule, urea prilling system.

12620 The Possibility of Solving a 3x3 Rubik’s Cube under 3 Seconds

Authors: Chung To Kong, Siu Ming Yiu

Abstract:

The Rubik's cube was invented in 1974. Since then, speedcubers all over the world have tried their best to break the world record again and again. The newest record is 3.47 seconds. There are many factors that affect the timing, including turns per second (tps), the algorithm, finger tricks, and the hardware of the cube. In this paper, the lower bound of the cube solving time is discussed using convex optimization. Extended analysis of the world records is used to understand how to improve the timing. With an understanding of each part of the solving process, the paper suggests a list of speed improvement techniques. Based on the analysis of the world record, there is a high possibility that the 3-second mark will be broken soon.

Keywords: Rubik’s cube, convex optimization, speed cubing, CFOP.
