Search results for: real time data.
12495 Remote Monitoring and Control System of Potentiostat Based on the Internet of Things
Authors: Liang Zhao, Guangwen Wang, Guichang Liu
Abstract:
The potentiostat is an important component of pipeline anti-corrosion systems in the chemical industry. Based on Internet of Things (IoT) technology, Programmable Logic Controller (PLC) technology and database technology, this paper develops a remote monitoring and management system for potentiostats. Remote monitoring and remote adjustment of the potentiostat's working status are realized. The system provides real-time data display, historical data query, alarm push management and user permission management, and supports both Web access and mobile client application (APP) access. Test results from an actual engineering project show the stability of the system, which can be widely used in cathodic protection systems.
Keywords: Internet of Things, pipe corrosion protection, potentiostat, remote monitoring.
12494 The Application of Real Options to Capital Budgeting
Authors: George Yungchih Wang
Abstract:
Real options theory suggests that managerial flexibility embedded within irreversible investments can account for significant value in project valuation. Although the argument has become the dominant focus of capital investment theory over recent decades, survey literature in capital budgeting indicates that corporate practitioners still do not explicitly apply real options in investment decisions. In this paper, we explore how real options decision criteria can be transformed into equivalent capital budgeting criteria under uncertainty, assuming that the underlying stochastic process follows a geometric Brownian motion (GBM), a mixed diffusion-jump (MX), or a mean-reverting process (MR). These equivalent valuation techniques can be readily decomposed into conventional investment rules and "option impacts", the latter of which describe the effects on optimal investment rules once the option value is considered. Based on numerical analysis and Monte Carlo simulation, three major findings are derived. First, real options can be successfully integrated into the mindset of conventional capital budgeting. Second, the inclusion of option impacts tends to delay investment; the delay effect is most significant under a GBM process and least significant under an MR process. Third, it is optimal to adopt the new capital budgeting criteria in investment decision-making, and adopting a suboptimal investment rule without considering real options could lead to a substantial loss in value.
Keywords: real options, capital budgeting, geometric Brownian motion, mixed diffusion-jump, mean-reverting process.
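For illustration, a minimal Monte Carlo sketch of the GBM assumption used in the first case is given below. The drift, volatility and horizon values are hypothetical, and the code only simulates the project-value process, not the authors' full option-impact valuation.

```python
import numpy as np

def simulate_gbm(v0, mu, sigma, horizon, n_steps, n_paths, seed=0):
    """Simulate project-value paths V_t following dV = mu*V dt + sigma*V dW."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    # Exact discretization: V_{t+dt} = V_t * exp((mu - 0.5*sigma^2) dt + sigma*sqrt(dt)*Z)
    z = rng.standard_normal((n_paths, n_steps))
    increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    log_paths = np.cumsum(increments, axis=1)
    return v0 * np.exp(np.hstack([np.zeros((n_paths, 1)), log_paths]))

# Hypothetical parameters: initial value 100, 5% drift, 20% volatility, 5-year horizon.
paths = simulate_gbm(v0=100.0, mu=0.05, sigma=0.20, horizon=5.0, n_steps=60, n_paths=10_000)
print("mean terminal value:", paths[:, -1].mean())
```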
12493 Weighted Data Replication Strategy for Data Grid Considering Economic Approach
Authors: N. Mansouri, A. Asadi
Abstract:
Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy, called Enhanced Latest Access Largest Weight (ELALW), is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each Grid site is limited; it is therefore important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the expected number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when several sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on a response time determined by the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results obtained with OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage and storage resource usage.
Keywords: Data grid, data replication, simulation, replica selection, replica placement.
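A minimal sketch of the kind of response-time-based replica selection described above is shown here. The field names and the simple additive cost model are assumptions for illustration, not the exact ELALW formulation.

```python
from dataclasses import dataclass

@dataclass
class ReplicaSite:
    name: str
    bandwidth_mbps: float      # available bandwidth to the requesting site
    storage_latency_s: float   # storage access latency
    queued_requests: int       # replica requests waiting in the storage queue
    hops: int                  # network distance to the requesting site

def estimated_response_time(site, file_size_mb, per_request_delay_s=0.05, per_hop_delay_s=0.01):
    """Rough estimate: transfer time + storage latency + queue wait + distance penalty."""
    transfer_time = file_size_mb * 8 / site.bandwidth_mbps
    queue_wait = site.queued_requests * per_request_delay_s
    return transfer_time + site.storage_latency_s + queue_wait + site.hops * per_hop_delay_s

def select_best_replica(sites, file_size_mb):
    """Pick the replica location with the smallest estimated response time."""
    return min(sites, key=lambda s: estimated_response_time(s, file_size_mb))

sites = [ReplicaSite("A", 100.0, 0.02, 3, 2), ReplicaSite("B", 40.0, 0.01, 0, 5)]
print(select_best_replica(sites, file_size_mb=500).name)
```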
12492 Efficient Scheduling Algorithm for QoS Support in High Speed Downlink Packet Access Networks
Authors: MohammadReza HeidariNezhad, Zuriati Ahmad Zukarnain, Nur Izura Udzir, Mohamed Othman
Abstract:
In this paper, we propose APO, a new packet scheduling scheme with Quality of Service (QoS) support for a hybrid of real-time and non-real-time services in HSDPA networks. The APO scheduling algorithm is based on an effective channel anticipation model. In contrast to traditional schemes, the proposed method is implemented based on a cyclic non-work-conserving discipline. Simulation results indicate that the proposed scheme maximizes channel usage efficiency well compared with existing scheduling methods and demonstrate the effectiveness of the proposed algorithm.
Keywords: Scheduling algorithm, Quality of Service, HSDPA.
12491 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network
Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
We consider a database that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. Models learned from these data that can predict future traffic speed would be beneficial for applications such as car navigation systems, but building a predictive model for every link becomes a nontrivial job if the number of links in a given network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building. However, k-NN takes a long time to make a prediction because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that it may suffice to look only at recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighboring links up- and downstream. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
Keywords: Big data, k-NN, machine learning, traffic speed prediction.
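A minimal k-NN sketch in the spirit of the approach above: features are the recent speeds of the target link (the neighbor-link features and the exact window policy used by the authors are omitted), and the search set is restricted to the most recent records to speed up prediction.

```python
import numpy as np

def knn_predict_speed(history, lag=6, k=5, recent_window=None):
    """Predict the next 5-minute speed of one link from its own past speeds.

    history: 1-D array of past speeds at 5-minute intervals.
    lag: number of past speeds used as the feature vector.
    recent_window: if set, only the last `recent_window` records are searched.
    """
    data = np.asarray(history, dtype=float)
    if recent_window is not None:
        data = data[-recent_window:]
    # Build (feature, target) pairs from the searched portion of the history.
    X = np.array([data[i:i + lag] for i in range(len(data) - lag)])
    y = data[lag:]
    query = np.asarray(history[-lag:], dtype=float)
    dists = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return y[nearest].mean()

speeds = 40 + 10 * np.sin(np.arange(2000) * 2 * np.pi / 288)  # synthetic daily pattern
print(knn_predict_speed(speeds, recent_window=576))            # search only the last two days
```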
12490 Secure and Efficient Transmission of Aggregated Data for Mobile Wireless Sensor Networks
Authors: A. Krishna Veni, R. Geetha
Abstract:
Wireless Sensor Networks (WSNs) are suitable for many scenarios in the real world. The retrieval of data is made efficient by data aggregation techniques. Many data aggregation techniques have been proposed, but most existing schemes are neither energy efficient nor secure. Moreover, the existing techniques use a traditional clustering approach, in which packet transmission is delayed because there is no proper scheduling. The presented system uses the Velocity Energy-efficient and Link-aware Cluster-Tree (VELCT) scheme, in which a Data Collection Tree (DCT) improves the lifetime of the network. The VELCT scheme and the construction of the DCT reduce delay and traffic. The network lifetime can be increased by avoiding frequent changes in cluster topology. Secure and Efficient Transmission of Aggregated data (SETA) improves the security of data transmission via the trust value of the nodes prior to the aggregation of data. Since SETA considers only data from trustworthy nodes for aggregation, it is more secure in transmitting the data, thereby improving the accuracy of the aggregated data.
Keywords: Aggregation, lifetime, network security, wireless sensor network.
12489 Study of the Vertical Handoff in Heterogeneous Networks and Implementation Based on OPNET
Authors: W. Benaatou, A. Latif
Abstract:
In this paper, we study in detail the performance of vertical handover among WLAN, WiMAX and UMTS networks, then examine the vertical handoff procedure, and conclude with simulations highlighting handover performance in heterogeneous networks. The goal of vertical handover is to provide several accesses in real time in heterogeneous networks. This makes it possible for a user to use several networks (such as WLAN, UMTS and WiMAX) in parallel, with the system switching automatically to another base station without disconnecting, as if there were no break and with as little data loss as possible.
Keywords: Vertical handoff, WLAN, UMTS, WiMAX, heterogeneous networks.
12488 A Modified Fuzzy C-Means Algorithm for Natural Data Exploration
Authors: Binu Thomas, Raju G., Sonam Wangmo
Abstract:
In data mining, fuzzy clustering algorithms have demonstrated an advantage over crisp clustering algorithms in dealing with the challenges posed by large collections of vague and uncertain natural data. This paper reviews the concepts of fuzzy logic and fuzzy clustering. The classical fuzzy c-means algorithm is presented and its limitations are highlighted. Based on the study of the fuzzy c-means algorithm and its extensions, we propose a modification to the c-means algorithm to overcome its limitations in calculating the new cluster centers and in finding the membership values with natural data. The efficiency of the new modified method is demonstrated on real data collected for Bhutan's Gross National Happiness (GNH) program.
Keywords: Adaptive fuzzy clustering, clustering, fuzzy logic, fuzzy clustering, c-means.
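For reference, a compact implementation of the classical fuzzy c-means iteration that the paper takes as its starting point (not the proposed modification) might look as follows.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Classical FCM: alternate membership and center updates until convergence."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                      # memberships of each point sum to 1
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # Distances from every point to every center (epsilon avoids division by zero).
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        U_new = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)), axis=1)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
centers, U = fuzzy_c_means(X, c=2)
print(centers)
```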
12487 A Novel Application of Network Equivalencing Method in Time Domain to Precise Calculation of Dead Time in Power Transmission Lines
Authors: J. Moshtagh, L. Eslami
Abstract:
Various studies have shown that about 90% of single-line-to-ground faults occurring on high-voltage transmission lines are transient in nature. This type of fault is cleared by a temporary outage (by the single-phase auto-reclosure). The interval between opening and reclosing of the faulted phase's circuit breakers is called the "dead time" and varies around several hundred milliseconds. For the adjustment of traditional single-phase auto-reclosures, which usually are not intelligent, it is necessary to calculate the dead time precisely in the off-line condition. If the dead time used in the adjustment of the single-phase auto-reclosure is less than the real dead time, the reclosing of the circuit breakers seriously threatens the power system. So, in this paper, a novel approach for precise calculation of dead time in power transmission lines based on network equivalencing in the time domain is presented. This approach has much higher precision than the traditional method based on the Thevenin equivalent circuit. For comparison between the proposed approach and the traditional method, a comprehensive simulation with EMTP-ATP is performed on an extensive power network.
Keywords: Dead Time, Network Equivalencing, High Voltage Transmission Lines, Single Phase Auto-Reclosure.
12486 FPGA-based Systems for Evolvable Hardware
Authors: Cyrille Lambert, Tatiana Kalganova, Emanuele Stomeo
Abstract:
Since 1992, when Hugo de Garis published the first paper on Evolvable Hardware (EHW), a period of intense creativity has followed. EHW has been actively researched, developed and applied to various problems. Different approaches have been proposed, which fall into three main classifications: extrinsic, mixtrinsic and intrinsic EHW. Each of these solutions has real interest. Nevertheless, although extrinsic evolution generates some excellent results, intrinsic systems are not as advanced. This paper suggests three possible solutions to implement a run-time configuration intrinsic EHW system: an FPGA-based run-time configuration system, a JBits-based run-time configuration system and a multi-board functional-level run-time configuration system. The main characteristic of the proposed architectures is that they are implemented on Field Programmable Gate Arrays. A comparison of the proposed solutions demonstrates that multi-board functional-level run-time configuration is superior in terms of scalability, flexibility and ease of implementation.
Keywords: Evolvable hardware, evolutionary computation, FPGA systems.
12485 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring
Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek
Abstract:
In the semiconductor manufacturing process, large amounts of data are collected from various sensors across multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product type, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data to detect out-of-control states of processes. Although the collected data have different characteristics, using them directly as inputs to SQC will increase the variation of the data, require wide control limits, and decrease the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from mixed data for more accurate process control. In this paper, we propose a regression tree using a split algorithm based on the Pearson distribution to handle non-normal distributions in a parametric method. The regression tree finds similar properties of data from different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.
Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.
12484 Development of a Real-Time Brain-Computer Interface for Interactive Robot Therapy: An Exploration of EEG and EMG Features during Hypnosis
Authors: Maryam Alimardani, Kazuo Hiraki
Abstract:
This study presents a framework for the development of a new generation of therapy robots that can interact with users by monitoring their physiological and mental states. Here, we focus on one of the more controversial methods of therapy, hypnotherapy. Hypnosis has been shown to be useful in the treatment of many clinical conditions, and even for healthy people it can be an effective technique for relaxation or for enhancement of memory and concentration. Our aim is to develop a robot that collects information about the user's mental and physical states using electroencephalogram (EEG) and electromyography (EMG) signals and performs cost-effective hypnosis in the comfort of the user's home. The presented framework consists of three main steps: (1) find the EEG correlates of the mind state before, during, and after hypnosis and establish a cognitive model for state changes, (2) develop a system that can track the changes in EEG and EMG activities in real time and determine whether the user is ready for suggestion, and (3) implement our system in a humanoid robot that will talk and conduct hypnosis on users based on their mental states. This paper presents a pilot study regarding the first stage, the detection of EEG and EMG features during hypnosis.
Keywords: Hypnosis, EEG, robotherapy, brain-computer interface.
12483 Impact of Crises on Official Statistics: A Case Study of Environmental Statistics at Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf during the COVID-19 Pandemic
Authors: Ibtihaj Al-Siyabi
Abstract:
The COVID-19 crisis posed enormous challenges to statistical providers. While official statistics were disrupted by the pandemic and related containment measures, there was a growing and pressing need for real-time data and statistics to inform decisions. This paper gives an account of the way the pandemic impacted the operations of National Statistical Offices (NSOs) in general, in terms of data collection and methods used, and of the main challenges they encountered, based on international surveys. It highlights the performance of the Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf, GCC-STAT, and its responsiveness to the pandemic, placing special emphasis on environmental statistics. The paper concludes by confirming GCC-STAT's resilience and success in facing the challenges.
Keywords: NSO, COVID-19, pandemic, National Statistical Offices.
12482 Optical Flow Based System for Cross Traffic Alert
Authors: Giuseppe Spampinato, Salvatore Curti, Ivana Guarneri, Arcangelo Bruna
Abstract:
This document describes an advanced system and methodology for Cross Traffic Alert (CTA), able to detect vehicles that move into the vehicle's driving path from the left or right side. The camera is assumed to be mounted not only on a stationary vehicle, e.g. at a traffic light or at an intersection, but also on one moving slowly, e.g. in a car park. In all of the aforementioned conditions, a driver's short loss of concentration or distraction can easily lead to a serious accident. The proposed system represents a valid support to avoid these kinds of car crashes. It is an extension of our previous work on a clustering system, which only works with fixed cameras. Only a vanishing point calculation and simple optical flow filtering, to eliminate motion vectors due to the car's own movement, are performed, letting the system achieve high performance with different scenarios, cameras and resolutions. The proposed system uses only the optical flow as input, which is implemented in hardware on the proposed platform; since the whole elaboration is fast and low in power consumption, it is inserted directly in the camera framework, allowing all the processing to be executed in real time.
Keywords: Clustering, cross traffic alert, optical flow, real time, vanishing point.
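The following sketch illustrates one plausible way to filter out flow vectors that are consistent with the camera's own forward motion, i.e. vectors roughly radial with respect to the vanishing point. The angular threshold and the radial-consistency test are assumptions for illustration, not the authors' exact filtering rule.

```python
import numpy as np

def filter_ego_motion(points, flows, vanishing_point, angle_thresh_deg=20.0):
    """Keep flow vectors that are NOT radial w.r.t. the vanishing point.

    points: (N, 2) pixel positions where the flow was measured.
    flows: (N, 2) optical-flow vectors at those positions.
    Radial vectors (pointing toward/away from the vanishing point) are treated
    as ego-motion and discarded; the rest are candidate crossing objects.
    """
    radial = points - np.asarray(vanishing_point, dtype=float)
    radial /= np.linalg.norm(radial, axis=1, keepdims=True) + 1e-9
    unit_flow = flows / (np.linalg.norm(flows, axis=1, keepdims=True) + 1e-9)
    # Angle between the flow and the radial direction (sign ignored).
    cos_angle = np.abs(np.sum(unit_flow * radial, axis=1))
    keep = cos_angle < np.cos(np.radians(angle_thresh_deg))
    return points[keep], flows[keep]

pts = np.array([[100.0, 200.0], [400.0, 220.0]])
flo = np.array([[-3.0, -1.5], [5.0, 0.2]])
print(filter_ego_motion(pts, flo, vanishing_point=(320, 240)))
```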
12481 An Iterative Updating Method for Damped Gyroscopic Systems
Authors: Yongxin Yuan
Abstract:
The problem of updating damped gyroscopic systems using measured modal data can be mathematically formulated as the following two problems. Problem I: Given $M_a \in \mathbb{R}^{n \times n}$, $\Lambda = \mathrm{diag}\{\lambda_1, \cdots, \lambda_p\} \in \mathbb{C}^{p \times p}$, $X = [x_1, \cdots, x_p] \in \mathbb{C}^{n \times p}$, where $p < n$ and both $\Lambda$ and $X$ are closed under complex conjugation in the sense that $\lambda_{2j} = \bar{\lambda}_{2j-1} \in \mathbb{C}$, $x_{2j} = \bar{x}_{2j-1} \in \mathbb{C}^n$ for $j = 1, \cdots, l$, and $\lambda_k \in \mathbb{R}$, $x_k \in \mathbb{R}^n$ for $k = 2l+1, \cdots, p$, find real-valued symmetric matrices $D, K$ and a real-valued skew-symmetric matrix $G$ (that is, $G^T = -G$) such that $M_a X \Lambda^2 + (D + G) X \Lambda + K X = 0$. Problem II: Given real-valued symmetric matrices $D_a, K_a \in \mathbb{R}^{n \times n}$ and a real-valued skew-symmetric matrix $G_a$, find $(\hat{D}, \hat{G}, \hat{K}) \in S_E$ such that $\|\hat{D} - D_a\|^2 + \|\hat{G} - G_a\|^2 + \|\hat{K} - K_a\|^2 = \min_{(D,G,K) \in S_E} \left( \|D - D_a\|^2 + \|G - G_a\|^2 + \|K - K_a\|^2 \right)$, where $S_E$ is the solution set of Problem I and $\|\cdot\|$ is the Frobenius norm. This paper presents an iterative algorithm to solve Problem I and Problem II. By using the proposed iterative method, a solution of Problem I can be obtained within finitely many iteration steps in the absence of roundoff errors, and the minimum Frobenius norm solution of Problem I can be obtained by choosing a special kind of initial matrices. Moreover, the optimal approximation solution $(\hat{D}, \hat{G}, \hat{K})$ of Problem II can be obtained by finding the minimum Frobenius norm solution of a modified Problem I. A numerical example shows that the introduced iterative algorithm is quite efficient.
Keywords: Model updating, iterative algorithm, gyroscopic system, partially prescribed spectral data, optimal approximation.
12480 Does Leisure Time Use Contribute to a Wage Increase of the Thai People?
Authors: Siriwan Saksiriruthai
Abstract:
This paper develops models to analyze the relationship between leisure time and wage change. Using Thailand's Time Use Survey and Labor Force Survey data, the estimation of wage changes in response to changes in leisure time indicates that media receiving, personal care, and social participation and volunteer activities are the ones that significantly raise hourly wages. Thus, the findings suggest stimulating time use for media access to enhance knowledge and productivity, for personal care to improve attractiveness and health and thereby raise productivity, and for social activities to develop connections for possible future opportunities, including wage increases. These activities should be promoted for productive leisure time and for welfare improvement.
Keywords: Leisure, wage, time use, Thailand.
12479 Simultaneous Tuning of Static Var Compensator and Power System Stabilizer Employing Real-Coded Genetic Algorithm
Authors: S. Panda, N. P. Patidar, R. Singh
Abstract:
Power system stability enhancement by simultaneous tuning of a Power System Stabilizer (PSS) and a Static Var Compensator (SVC)-based controller is thoroughly investigated in this paper. The coordination among the proposed damping stabilizers and the SVC internal voltage regulators has also been taken into consideration. The design problem is formulated as an optimization problem with a time-domain simulation-based objective function, and a Real-Coded Genetic Algorithm (RCGA) is employed to search for optimal controller parameters. The proposed stabilizers are tested on a weakly connected power system with different disturbances and loading conditions. The nonlinear simulation results are presented to show the effectiveness and robustness of the proposed control schemes over a wide range of loading conditions and disturbances. Further, the proposed design approach is found to be robust and improves stability effectively even under small-disturbance and unbalanced fault conditions.
Keywords: Real-Coded Genetic Algorithm (RCGA), Static Var Compensator (SVC), Power System Stabilizer (PSS), Low Frequency Oscillations, Power System Stability.
12478 Voltage Problem Location Classification Using Performance of Least Squares Support Vector Machine LS-SVM and Learning Vector Quantization LVQ
Authors: Khaled Abduesslam. M, Mohammed Ali, Basher H Alsdai, Muhammad Nizam, Inayati
Abstract:
This paper presents voltage problem location classification using the performance of a Least Squares Support Vector Machine (LS-SVM) and Learning Vector Quantization (LVQ) in an electrical power system, implemented on the IEEE 39-bus New England system. The data were collected from time-domain simulation using the Power System Analysis Toolbox (PSAT). Outputs of the simulation, such as voltage, phase angle, real power and reactive power, were taken as inputs to estimate voltage stability at particular buses based on the Power Transfer Stability Index (PTSI). The simulation was carried out on the IEEE 39-bus test system by considering increased load on the buses. To verify the proposed LS-SVM, its performance was compared with Learning Vector Quantization (LVQ). The results showed that LS-SVM is faster and better than LVQ: LS-SVM achieved 0% misclassification, whereas LVQ had 7.69% misclassification.
Keywords: IEEE 39 bus, Least Squares Support Vector Machine, Learning Vector Quantization, Voltage Collapse.
12477 Designing Transcutaneous Inductive Powering Links for Implanted Micro-System Device
Authors: Saad Mutashar Abbas, M. A. Hannan, S. A. Samad, A. Hussain
Abstract:
This paper presents a proposed design for transcutaneous inductive powering links. The design is used to transfer power and data to implanted devices, such as implanted microsystems, to stimulate and monitor nerves and muscles. The system operates at the low-band frequency of 13.56 MHz, within the industrial-scientific-medical (ISM) band, to avoid tissue heating. For the external part, the modulation index is 13% and the modulation rate 7.3%, with a data rate of 1 Mbit/s assuming Tbit = 1 µs. The system has been designed using 0.35-µm CMOS fabrication technology. The mathematical model is given, and the design is simulated using the OrCAD PSpice 16.2 software tool; for real-time simulation, the Electronics Workbench Multisim 11 has been used. The novel circular planar (pancake) coils were simulated using the Ansoft HFSS software.
Keywords: Implanted devices, ASK techniques, Class-E power amplifier, inductive powering, low-frequency ISM band.
12476 Applying a Noise Reduction Method to Reveal Chaos in the River Flow Time Series
Authors: Mohammad H. Fattahi
Abstract:
Chaotic analysis has been performed on river flow time series before and after applying wavelet-based de-noising techniques in order to investigate the effect of noise content on the chaotic nature of the flow series. In this study, 38 years of monthly runoff data from three gauging stations were used. The gauging stations were located in the Ghar-e-Aghaj river basin, Fars province, Iran. The noise level of the time series was estimated with the aid of the Gaussian kernel algorithm. This step was found to be crucial in preventing removal of vital features such as memory, correlation and trend from the time series, in addition to the noise, during the de-noising process.
Keywords: Chaotic behavior, wavelet, noise reduction, river flow.
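A minimal wavelet-thresholding sketch of the kind of de-noising step referred to above is shown here, using PyWavelets. The wavelet, decomposition level and universal threshold (with a MAD noise estimate) are illustrative choices, not necessarily those used in the study.

```python
import numpy as np
import pywt

def wavelet_denoise(series, wavelet="db4", level=3):
    """Soft-threshold detail coefficients with the universal threshold and reconstruct."""
    x = np.asarray(series, dtype=float)
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Noise level estimated from the finest detail coefficients (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(len(x)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(x)]

rng = np.random.default_rng(0)
flow = np.sin(np.linspace(0, 12 * np.pi, 456)) * 50 + 100 + rng.normal(0, 5, 456)
print(wavelet_denoise(flow)[:5])
```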
12475 A New Implementation of PCA for Fast Face Detection
Authors: Hazem M. El-Bakry
Abstract:
Principal Component Analysis (PCA) has many important applications, especially in pattern detection such as face detection and recognition. For real-time applications, the response time is therefore required to be as small as possible. In this paper, a new implementation of PCA for fast face detection is presented. The new implementation is designed based on cross-correlation in the frequency domain between the input image and the eigenvectors (weights). Simulation results show that the proposed implementation of PCA is faster than the conventional one.
Keywords: Fast face detection, PCA, cross-correlation, frequency domain.
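A small sketch of cross-correlation in the frequency domain, the core operation the paper builds on, is given below. The eigenvector is represented as a 2-D template, and padding and normalization details are simplified for illustration.

```python
import numpy as np

def cross_correlate_fft(image, template):
    """Cross-correlate an image with a template via the FFT (template no larger than the image)."""
    ih, iw = image.shape
    th, tw = template.shape
    # Zero-pad both arrays to the full correlation size and multiply in the frequency domain.
    fshape = (ih + th - 1, iw + tw - 1)
    F_image = np.fft.fft2(image, fshape)
    F_templ = np.fft.fft2(template, fshape)
    return np.real(np.fft.ifft2(F_image * np.conj(F_templ)))

image = np.random.rand(64, 64)
eigenface = np.random.rand(16, 16)   # stand-in for one PCA eigenvector reshaped to 2-D
corr = cross_correlate_fft(image, eigenface)
print(corr.shape, corr.argmax())
```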
12474 Encryption Efficiency Analysis and Security Evaluation of RC6 Block Cipher for Digital Images
Authors: Hossam El-din H. Ahmed, Hamdy M. Kalash, Osama S. Farag Allah
Abstract:
This paper investigates the encryption efficiency of the RC6 block cipher applied to digital images, providing a new mathematical measure of encryption efficiency, which we call the encryption quality, as an alternative to visual inspection. The encryption quality of the RC6 block cipher is investigated across several design parameters, such as word size, number of rounds, and secret key length, and the optimal choices for these design parameters are given. The security of the RC6 block cipher for digital images is also analyzed from a strict cryptographic viewpoint. Its security against brute-force, statistical and differential attacks is estimated, and experiments are made to test it against all the aforementioned types of attacks. The experiments and results verify that the RC6 block cipher is highly secure for real-time image encryption from a cryptographic viewpoint. Thorough experimental tests are carried out with detailed analysis, demonstrating the high security of the RC6 block cipher algorithm. RC6 can therefore be considered a secure real-time symmetric encryption scheme for digital images.
Keywords: Block cipher, Image encryption, Encryption quality, and Security analysis.
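As an illustration of a histogram-based encryption quality measure of the kind described (the paper's exact definition is not reproduced here), one common formulation averages the absolute change in gray-level occurrence counts between the plain and cipher images:

```python
import numpy as np

def encryption_quality(plain, cipher, levels=256):
    """Average absolute deviation between gray-level histograms of plain and cipher images."""
    h_plain, _ = np.histogram(plain, bins=levels, range=(0, levels))
    h_cipher, _ = np.histogram(cipher, bins=levels, range=(0, levels))
    return np.abs(h_cipher - h_plain).sum() / levels

plain = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
cipher = np.random.randint(0, 256, (128, 128), dtype=np.uint8)  # stand-in for RC6 output
print(encryption_quality(plain, cipher))
```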
12473 Hardware Centric Machine Vision for High Precision Center of Gravity Calculation
Authors: Xin Cheng, Benny Thörnberg, Abdul Waheed Malik, Najeem Lawal
Abstract:
We present a hardware-oriented method for real-time measurement of an object's position in video. The targeted application area is light spots used as references for robotic navigation. Different algorithms for dynamic thresholding are explored in combination with component labeling and Center Of Gravity (COG) computation for the highest possible precision versus Signal-to-Noise Ratio (SNR). The method was developed with low hardware cost in focus, requiring only one convolution operation for preprocessing of the data.
Keywords: Dynamic thresholding, segmentation, position measurement, sub-pixel precision, center of gravity.
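A compact sketch of the segmentation-plus-COG step described above, using a fixed threshold in place of the dynamic-thresholding variants explored in the paper:

```python
import numpy as np
from scipy import ndimage

def light_spot_centers(frame, threshold):
    """Threshold, label connected components, and return intensity-weighted centers of gravity."""
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    # Intensity-weighted COG gives sub-pixel precision compared to a plain binary centroid.
    return ndimage.center_of_mass(frame, labels, index=range(1, n + 1))

frame = np.zeros((32, 32))
frame[10:13, 20:23] = [[10, 30, 10], [30, 90, 30], [10, 30, 10]]   # one synthetic light spot
print(light_spot_centers(frame, threshold=5))
```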
12472 Forecasting Enrollment Model Based on First-Order Fuzzy Time Series
Authors: Melike Şah, Konstantin Y. Degtiarev
Abstract:
This paper proposes a novel improvement of a forecasting approach based on time-invariant fuzzy time series. In contrast to traditional forecasting methods, fuzzy time series can also be applied to problems in which the historical data are linguistic values. It is shown that the proposed time-invariant method improves the performance of the forecasting process. Further, the effect of using different numbers of fuzzy sets is tested as well. As in most of the cited papers, the historical enrollment of the University of Alabama is used in this study to illustrate the forecasting process. Subsequently, the performance of the proposed method is compared with existing time-invariant fuzzy time series models in terms of forecasting accuracy. The comparison reveals a certain performance superiority of the proposed method over the methods described in the literature.
Keywords: Forecasting, fuzzy time series, linguistic values, student enrollment, time-invariant model.
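To make the general workflow concrete, a compact Chen-style first-order fuzzy time series forecaster is sketched below; it uses equal-width intervals and midpoint defuzzification and is not necessarily identical to the time-invariant variant proposed in the paper. The input values are illustrative enrollment-like numbers.

```python
import numpy as np
from collections import defaultdict

def fts_forecast(history, n_intervals=7):
    """First-order fuzzy time series forecast (Chen-style) for the next value."""
    x = np.asarray(history, dtype=float)
    lo, hi = x.min() - 1, x.max() + 1
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    # Fuzzify: assign each observation to the interval (fuzzy set) it falls into.
    states = np.clip(np.digitize(x, edges) - 1, 0, n_intervals - 1)
    # Build fuzzy logical relationship groups A_i -> {A_j, ...}.
    groups = defaultdict(set)
    for a, b in zip(states[:-1], states[1:]):
        groups[a].add(b)
    nxt = groups.get(states[-1])
    # Defuzzify: average of consequent interval midpoints (or the current midpoint if unseen).
    return mids[list(nxt)].mean() if nxt else mids[states[-1]]

enrollment = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807, 16919]
print(fts_forecast(enrollment))
```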
12471 Sampling of Variables in Discrete-Event Simulation using the Example of Inventory Evolutions in Job-Shop-Systems Based on Deterministic and Non-Deterministic Data
Authors: Bernd Scholz-Reiter, Christian Toonen, Jan Topi Tervo, Dennis Lappe
Abstract:
Time series analysis often requires data that represent the evolution of an observed variable in equidistant time steps. In order to collect such data, sampling is applied. While continuous signals may be sampled, analyzed and reconstructed by applying Shannon's sampling theorem, time-discrete signals have to be dealt with differently. In this article, we consider the discrete-event simulation (DES) of job-shop systems and study the effects of different sampling rates on data quality regarding completeness and accuracy of reconstructed inventory evolutions. In doing so, we discuss deterministic as well as non-deterministic behavior of system variables. Error curves are deployed to illustrate and discuss the sampling rate's impact and to derive recommendations for its well-founded choice.
Keywords: discrete-event simulation, job-shop-system, sampling rate.
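A small sketch of the sampling step discussed above: an event-based inventory trajectory (time-stamped changes) is converted to equidistant samples with a zero-order hold. The event format is assumed for illustration.

```python
import numpy as np

def sample_inventory(event_times, event_levels, dt, t_end):
    """Zero-order-hold sampling of a piecewise-constant inventory evolution.

    event_times: sorted times at which the inventory level changes.
    event_levels: inventory level valid from the corresponding event time onward.
    """
    sample_times = np.arange(0.0, t_end + 1e-9, dt)
    # For each sample instant, take the level set by the most recent event.
    idx = np.searchsorted(event_times, sample_times, side="right") - 1
    idx = np.clip(idx, 0, len(event_levels) - 1)
    return sample_times, np.asarray(event_levels)[idx]

times = [0.0, 2.3, 4.1, 7.8]        # event time stamps from the DES run
levels = [5, 8, 3, 6]               # inventory level after each event
t, q = sample_inventory(times, levels, dt=1.0, t_end=10.0)
print(list(zip(t, q)))
```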
12470 Improving Fault Resilience and Reconstruction of Overlay Multicast Tree Using Leaving Time of Participants
Authors: Bhed Bahadur Bista
Abstract:
Network-layer multicast, i.e. IP multicast, even after many years of research, development and standardization, is not deployed at large scale due to both technical (e.g. upgrading of routers) and political (e.g. policy making and negotiation) issues. Researchers have looked for alternatives and proposed application/overlay multicast, where multicast functions are handled by end hosts rather than network-layer routers. Member hosts wishing to receive multicast data form a multicast delivery tree. The intermediate hosts in the tree also act as routers, i.e. they forward data to the hosts below them in the tree. Unlike IP multicast, where a router cannot leave the tree until all members below it leave, in overlay multicast any member can leave the tree at any time, thus disjoining the tree and disrupting data dissemination. All the disrupted hosts then have to rejoin the tree. This characteristic of overlay multicast makes the multicast tree unstable and causes data loss and rejoin overhead. In this paper, we propose that each node set its leaving time from the tree and send join requests to a number of nodes in the tree. The nodes in the tree reject a request if their leaving time is earlier than that of the requesting node; otherwise, they accept the request. The node can then join at one of the accepting nodes. This makes the tree more stable, as nodes join the tree according to their leaving time, with the earliest-leaving nodes at the leaves of the tree. Some intermediate nodes may not honor their leaving time and leave earlier, thus disrupting the tree. For this case, we propose a proactive recovery mechanism so that disrupted nodes can rejoin the tree at predetermined nodes immediately. We have shown by simulation that there is less overhead when joining the multicast tree and that the recovery time of the disrupted nodes is much less than in previous works.
Keywords: Network layer multicast, fault resilience, IP multicast.
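The accept/reject rule described above can be sketched as follows; the node structure, request format and parent-selection policy are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class TreeNode:
    name: str
    leaving_time: float               # announced time at which this node will leave the tree
    children: list = field(default_factory=list)

    def handle_join(self, requester_leaving_time):
        """Accept a join request only if this node stays in the tree longer than the requester."""
        return self.leaving_time > requester_leaving_time

def join_tree(candidates, new_name, new_leaving_time):
    """Attach the new node under an accepting candidate; earlier leavers end up nearer the leaves."""
    accepting = [n for n in candidates if n.handle_join(new_leaving_time)]
    if not accepting:
        return None
    # Simple policy for the sketch: pick the accepting parent that leaves latest.
    parent = max(accepting, key=lambda n: n.leaving_time)
    parent.children.append(TreeNode(new_name, new_leaving_time))
    return parent

nodes = [TreeNode("A", leaving_time=500.0), TreeNode("B", leaving_time=120.0)]
print(join_tree(nodes, "C", new_leaving_time=300.0).name)   # joins under A, which leaves later
```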
12469 Survival Model for Partly Interval-Censored Data with Application to Anti D in Rhesus D Negative Studies
Authors: F. A. M. Elfaki, Amar Abobakar, M. Azram, M. Usman
Abstract:
This paper discusses regression analysis of partly interval-censored failure time data, which occur in many fields including demographical, epidemiological, financial, medical and sociological studies. For this problem, we focus on the situation where the survival time of interest can be described by the additive hazards model in the presence of partly interval-censored data. A major advantage of the approach is its simplicity, and it can be easily implemented using the R software. Simulation studies are conducted and indicate that the approach performs well in practical situations and is comparable to existing methods. The methodology is applied to a set of partly interval-censored failure time data arising from anti D in Rhesus D negative studies.
Keywords: Anti D in Rhesus D negative, Cox’s model, EM algorithm.
12468 An Intelligent Nondestructive Testing System of Ultrasonic Infrared Thermal Imaging Based on Embedded Linux
Authors: Hao Mi, Ming Yang, Tian-yue Yang
Abstract:
Ultrasonic infrared nondestructive testing is a testing method with high speed, accuracy and localization capability. However, some problems remain: detection requires manual real-time field judgment, and the methods for storing and viewing results are still primitive. An intelligent nondestructive detection system based on embedded Linux is put forward in this paper. The hardware part of the detection system is based on an ARM (Advanced RISC Machine) core, and an embedded Linux system is built to realize image processing and defect detection in thermal images. The CLAHE algorithm and the Butterworth filter are used to process the thermal image, and then the Boa server and CGI (Common Gateway Interface) technology are used to transmit the test results to the display terminal through the network for real-time and remote monitoring. The system also saves labor and removes the obstacle of manual judgment. According to the experimental results, the system provides a convenient and quick solution for industrial nondestructive testing.
Keywords: Remote monitoring, nondestructive testing, embedded Linux system, image processing.
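A minimal sketch of the two image-processing steps named above, using OpenCV's CLAHE and a frequency-domain Butterworth low-pass filter; the clip limit, tile size, cutoff and order are illustrative values, not the system's actual settings.

```python
import cv2
import numpy as np

def enhance_thermal_image(img, clip_limit=2.0, tile=(8, 8), cutoff=30.0, order=2):
    """CLAHE contrast enhancement followed by a Butterworth low-pass filter in the frequency domain."""
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile)
    enhanced = clahe.apply(img.astype(np.uint8))
    # Build the Butterworth transfer function H(u, v) = 1 / (1 + (D / D0)^(2n)).
    rows, cols = enhanced.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    H = 1.0 / (1.0 + (D / cutoff) ** (2 * order))
    F = np.fft.fftshift(np.fft.fft2(enhanced.astype(float)))
    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
    return np.clip(filtered, 0, 255).astype(np.uint8)

thermal = (np.random.rand(128, 128) * 255).astype(np.uint8)   # stand-in for a thermal frame
print(enhance_thermal_image(thermal).shape)
```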
12467 Biometrics Authorize Me!
Authors: João Nóbrega Brites Moita
Abstract:
Can biometrics do what everyone is expecting it will? And, more importantly, should it be doing it? Biometrics is the buzzword on everyone's lips, and many are trying to use this technology in a variety of applications. But all this hype about biometrics can be dangerous without a careful evaluation of the real needs of each application. In this paper, I will focus on the dangers of using the right technology at the right time in the wrong place.
Keywords: Authentication, Authorization, Biometrics.
12466 Study on the Presence of Protozoal Coinfections among Patients with Pneumocystis jirovecii Pneumonia in Bulgaria
Authors: N. Tsvetkova, R. Harizanov, A. Ivanova, I. Rainova, N. Yancheva-Petrova, D. Strashimirov, R. Enikova, M. Videnova, E. Kaneva, I. Kaftandjiev, V. Levterova, I. Simeonovski, N. Yanev, G. Hinkov
Abstract:
Pneumocystis jirovecii (P. jirovecii) and protozoa of the genera Acanthamoeba and Cryptosporidium, as well as Toxoplasma gondii, are opportunistic pathogens that can cause life-threatening infections in immunocompromised patients. The aim of the study was to evaluate the coinfection rate with opportunistic protozoal agents among Bulgarian patients diagnosed with P. jirovecii pneumonia. 38 pulmonary samples were collected from 38 patients (28 HIV-infected) with P. jirovecii infection. P. jirovecii DNA was detected by real-time PCR targeting the large mitochondrial subunit ribosomal RNA gene. Acanthamoeba was detected by a genus-specific conventional PCR assay. Real-time PCR was used for the detection of Toxoplasma gondii and Cryptosporidium DNA fragments. Pneumocystis DNA was detected in all 38 specimens; 28 (73.7%) were from HIV-infected patients. Three (10.7%) of them were coinfected with T. gondii and 1 (3.6%) with Cryptosporidium. In the group of non-HIV-infected patients (n = 10), Cryptosporidium DNA was detected in an infant (10%). Acanthamoeba DNA was not found in the tested samples. The current study showed a relatively low rate of coinfection with Cryptosporidium spp./T. gondii and P. jirovecii in the Bulgarian patients studied.
Keywords: Coinfection, opportunistic protozoal agents, Pneumocystis jirovecii, pulmonary infections.