Search results for: Continuous query processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2452

472 The Use of Rice Husk Ash as a Stabilizing Agent in Lateritic Clay Soil

Authors: J. O. Akinyele, R. W. Salim, K. O. Oikelome, O. T. Olateju

Abstract:

Rice husk (RH) is the major byproduct of paddy rice processing. Managing this waste has become a big challenge for some rice producers: some of the husks are left in open dumps while others are burned in the open, and both practices contribute to environmental pollution. This study evaluates an alternative use of this agricultural waste as a civil engineering material. The RH was burned in a controlled environment to form rice husk ash (RHA). The RHA was mixed with lateritic clay at 0, 2, 4, 6, 8, and 10% proportions by weight. Chemical tests were conducted on the open-burned and controlled-burned RHA and on the lateritic clay. Physical tests such as particle size distribution, Atterberg limits, and density tests were carried out on the mixed material. The chemical composition obtained for the RHA showed that the total percentage composition of Fe2O3, SiO2 and Al2O3 was above 70% (class "F" pozzolan), which qualifies it as a very good pozzolan. The coefficient of uniformity (Cu) was 8 and the coefficient of curvature (Cc) was 2 for the soil sample. The plasticity index (PI) for the 0, 2, 4, 6, 8 and 10% mixes was 21.0, 18.8, 16.7, 14.4, 12.4 and 10.7 respectively. The work concluded that RHA can be used effectively in hydraulic barriers and as a stabilizing agent in soil stabilization.

Keywords: Rice husk ash, pozzolans, paddy rice, lateritic clay.

471 A Materialized View Approach to Support Aggregation Operations over Long Periods in Sensor Networks

Authors: Minsoo Lee, Julee Choi, Sookyung Song

Abstract:

The increasing interest in processing data created by sensor networks has evolved into approaches that implement sensor networks as databases. The aggregation operator, which calculates a value from a large group of data, such as an average or a sum, is an essential function that needs to be provided when implementing such sensor network databases. This work proposes adding a DURING clause to TinySQL to calculate values over a specific long period and suggests a way to implement the aggregation service in sensor networks by applying the materialized view and incremental view maintenance techniques that are used in data warehouses. In sensor networks, data values are passed from child nodes to parent nodes and an aggregation value is computed at the root node. Since such root nodes need to be memory-efficient and low-powered, recomputing aggregate values from all past and current data becomes a problem. Applying incremental view maintenance techniques can therefore reduce memory consumption and support fast computation of aggregate values.
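As a rough illustration of the incremental view maintenance idea described above (a hypothetical sketch, not the authors' TinySQL implementation), a root node can keep a tiny materialized view of a running average as a (sum, count) pair and update it from each new reading instead of recomputing over all data received during a long DURING period:

```python
class RunningAverageView:
    """Tiny 'materialized view' of an average, maintained incrementally.

    Illustrative sketch: instead of storing every reading received during a
    long period, the root node keeps only (sum, count) and updates them as
    readings arrive from child nodes.
    """

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def insert(self, value: float) -> None:
        # Incremental maintenance: O(1) memory and time per new reading.
        self.total += value
        self.count += 1

    def average(self) -> float:
        return self.total / self.count if self.count else float("nan")


view = RunningAverageView()
for reading in [21.5, 22.0, 21.8, 23.1]:   # readings forwarded by child nodes
    view.insert(reading)
print(view.average())                      # aggregate over the whole period
```

The same insertion-only pattern extends to other simple aggregates such as SUM, COUNT, MIN and MAX.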

Keywords: Aggregation, Incremental View Maintenance, Materialized view, Sensor Network.

470 Removing Ocular Artifacts from EEG Signals using Adaptive Filtering and ARMAX Modeling

Authors: Parisa Shooshtari, Gelareh Mohamadi, Behnam Molaee Ardekani, Mohammad Bagher Shamsollahi

Abstract:

The EEG signal is one of the oldest measures of brain activity and has been used widely for clinical diagnosis and biomedical research. However, EEG signals are highly contaminated with various artifacts, both from the subject and from equipment interference. Among these artifacts, ocular noise is the most important one. Since many applications such as BCI require online, real-time processing of the EEG signal, it is ideal if artifact removal is performed in an online fashion. Recently, some methods for online ocular artifact removal have been proposed. One of these methods is ARMAX modeling of the EEG signal. This method assumes that the recorded EEG signal is a combination of EOG artifacts and the background EEG; the background EEG is then estimated via estimation of the ARMAX parameters. The other recently proposed method is based on adaptive filtering. This method uses the EOG signal as the reference input and subtracts the EOG artifacts from the recorded EEG signals. In this paper we investigate the efficiency of each method for removing EOG artifacts and compare the two. Our conclusion from this comparison is that the adaptive filtering method gives better results than ARMAX modeling.
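As a rough illustration of the adaptive-filtering approach compared above (a generic LMS sketch under assumed filter order and step size, not the authors' exact configuration), the recorded EEG is the primary input, the EOG channel is the reference, and the filter output is subtracted to estimate the background EEG:

```python
import numpy as np

def lms_remove_eog(eeg, eog, order=4, mu=0.01):
    """Subtract the EOG contribution estimated by an LMS adaptive filter.

    eeg : recorded EEG (background EEG + ocular artifact), 1-D array
    eog : simultaneously recorded EOG reference, 1-D array
    Returns the cleaned EEG estimate. Order and step size mu are illustrative.
    """
    w = np.zeros(order)                 # adaptive filter weights
    cleaned = np.zeros_like(eeg)
    for n in range(order, len(eeg)):
        x = eog[n - order:n][::-1]      # most recent reference samples
        artifact_est = w @ x            # estimated ocular artifact
        e = eeg[n] - artifact_est       # error = background EEG estimate
        w += 2 * mu * e * x             # LMS weight update
        cleaned[n] = e
    return cleaned

# toy usage with synthetic signals sampled at 250 Hz
t = np.arange(2000) / 250.0
eog = np.sin(2 * np.pi * 0.5 * t)                    # slow ocular activity
eeg = 0.2 * np.random.randn(t.size) + 0.8 * eog      # contaminated EEG
print(lms_remove_eog(eeg, eog)[:5])
```

In practice the filter order and step size would be tuned to the EOG bandwidth and the sampling rate of the recording.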

Keywords: Ocular Artifacts, EEG, Adaptive Filtering, ARMAX

469 The Effect of Drying Conditions on the Presence of Volatile Compounds in Cranberries

Authors: Karina Ruse, Martins Sabovics, Tatjana Rakcejeva, Lija Dukalska, Ruta Galoburda, Laima Berzina

Abstract:

The research was carried out on fresh wild cranberries growing in Latvia and on cranberry cultivars. The aim of the study was to evaluate the effect of the pre-treatment method and drying conditions on the composition of volatile compounds in cranberries. The berry pre-treatment methods were perforation, halving and steam-blanching. Before drying in a cabinet drier the berries were pre-treated using all three methods; before drying in a microwave vacuum drier, steam-blanching and halving were used. Volatile compounds in cranberries were analysed by GC-MS of extracts obtained by SPME. In the present research 21 different volatile compounds were detected in fresh cranberries: 15 in the cultivar 'Steven', 13 in 'Bergman' and 'Early Black', 11 in 'Ben Lear' and 'Pilgrim', and 14 in wild cranberries. In dried cranberries 20 volatile compounds were detected. Mathematical data processing shows that cranberry cultivar, pre-treatment method and drying conditions have a significant influence on the volatile compounds in the berries and on the formation of new volatile compounds.

Keywords: volatile compounds, cranberries, convective drier, microwave-vacuum drier

468 Energy Fields as Alternative Cures for Viral Diseases

Authors: S. Amirhassan Monadjemi, Narges Zarrabi, Naser Neamatbakhsh

Abstract:

As days go by, we hear more and more about HIV, Ebola, bird flu and other dreadful viruses that were unknown a few decades ago. In both detecting and fighting viral diseases, ordinary methods have come across some basic and important difficulties. Vaccination is, in a sense, the introduction of the virus to the immune system before a real infection occurs. It is very successful against some viruses (e.g. poliomyelitis), while totally ineffective against others (e.g. HIV or hepatitis C). On the other hand, antiviral drugs are mostly tools to control, not to cure, a viral disease. This is a good motivation to try alternative treatments. In this study, some key features of possible physics-based alternative treatments for viral diseases are presented. Electrification of body parts or fluids (especially blood) with micro electric signals of adjusted current or frequency is also studied. The main approach of this study is to find a suitable energy field, with appropriate parameters, that is able to kill or deactivate viruses. This would be a lengthy, multi-disciplinary research effort which needs the contribution of virology, physics, and signal processing experts. It should be mentioned that all claims made by alternative-cure researchers must be tested carefully and are not advisable at the present time.

Keywords: Alternative Cure, Viral disease, HIV, signals, energy field.

467 Development of the Maturity Sensor Prototype and Method of Its Placement in the Structure

Authors: Ye. B. Utepov, A. S. Tulebekova, A. B. Kazkeyev

Abstract:

Maturity sensors are used to determine concrete strength by a non-destructive method. The method of placement of the maturity sensors determines how many are required for a given frame of a monolithic building. This paper proposes a cheap prototype of an embedded wireless sensor for monitoring concrete structures, as well as an alternative strategy for placing sensors based on the transitional boundaries of the temperature distribution during concrete curing, which were determined by building a heat map of the temperature distribution in which unknown values are calculated by the inverse distance weighting method. The developed prototype can simultaneously measure temperature and relative humidity over a smartphone-controlled time interval. It implements the maturity method to assess the in-situ strength of concrete, which is considered an alternative to the traditional shock impulse and compression testing methods used in Kazakhstan. The prototype was tested in laboratory and field conditions. The tests were aimed at studying the effect of internal and external temperature and relative humidity on the strength gain of concrete. Based on an experimentally poured concrete slab with randomly integrated maturity sensors, it was determined that the transition boundaries form ellipses. The temperature distribution over the largest diameter of the ellipses was plotted, resulting in upright and inverted parabolas. As a result, the distance between the closest opposite crossing points of the parabolas is accepted as the maximum permissible step for placing the maturity sensors. The proposed placement strategy can be applied to sensors that measure other continuous phenomena such as relative humidity. Prototype testing also revealed the inconvenience of Bluetooth due to its weak signal and the inability to access multiple prototypes simultaneously. For this reason, further prototype upgrades are planned in future work.
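A small sketch of the inverse distance weighting step used to fill the unknown points of the temperature heat map (illustrative only; grid size, power parameter and sensor layout are assumptions, not the authors' values):

```python
import numpy as np

def idw(sensor_xy, sensor_temp, query_xy, power=2.0):
    """Inverse distance weighting: estimate temperature at query points
    from the randomly placed maturity sensors."""
    d = np.linalg.norm(query_xy[:, None, :] - sensor_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                  # avoid division by zero at a sensor
    w = 1.0 / d**power
    return (w @ sensor_temp) / w.sum(axis=1)

sensors = np.array([[0.2, 0.3], [1.5, 0.8], [2.8, 0.4], [1.0, 1.9]])  # m
temps = np.array([34.1, 36.5, 33.8, 35.2])                            # deg C
# regular grid over a 3 m x 2 m slab for the heat map
gx, gy = np.meshgrid(np.linspace(0, 3, 31), np.linspace(0, 2, 21))
grid = np.column_stack([gx.ravel(), gy.ravel()])
heat_map = idw(sensors, temps, grid).reshape(gx.shape)
print(heat_map.shape, heat_map.min().round(2), heat_map.max().round(2))
```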

Keywords: Heat map, placement strategy, temperature and relative humidity, wireless embedded sensor.

466 Development of a Simple laser-based 2D Compensating System for the Contouring Accuracy of Machine Tools

Authors: Wen-Yuh Jywe, Bor-Jeng Lin, Jing-Chung Shen, Jeng-Dao Lee, Hsueh-Liang Huang, Ming-Chen Cho

Abstract:

The dynamic contouring error is a critical element for the accuracy of machine tools. The contouring error is defined as the difference between the actual processed path and the commanded path, which is implemented by following the command curves from the feed drive system of the machine tool. The contouring error results from various factors, such as external loads, friction, moment of inertia, feed rate, speed control, servo control, etc. This study therefore proposes a 2D compensating system for the contouring accuracy of machine tools. An optical method is adopted, using a stable-frequency laser diode and a high-precision position sensor detector (PSD) to perform non-contact measurement. Results show that the accuracy of the position sensor detector (PSD) of the 2D contouring accuracy compensating system was ±1.5 μm for a calculated range of ±3 mm, and the accuracy improvement is over 80% at high-speed feed rates.

Keywords: Position sensor detector, laser diode, contouring accuracy, machine tool.

465 Multiple Job Shop-Scheduling using Hybrid Heuristic Algorithm

Authors: R. A. Mahdavinejad

Abstract:

In this paper, multi-processor job shop scheduling problems are solved by a heuristic algorithm based on a hybrid of priority dispatching rules guided by an ant colony optimization algorithm. The objective function is to minimize the makespan, i.e. the total completion time, and the simultaneous presence of various kinds of pheromones is allowed. By using a suitable hybrid of priority dispatching rules, the process of finding the best solution is improved. The ant colony optimization algorithm not only promotes the ability of the proposed algorithm, but also decreases the total working time by reducing setup times and modifying the production line, so that similar work shares the same production lines. Another advantage of this algorithm is that similar (not identical) machines can be considered, so these machines are able to process a job with different processing and setup times. To evaluate this capability and the algorithm itself, a number of test problems are solved and the associated results are analyzed. The results show a significant decrease in throughput time. They also show that the algorithm is able to recognize the bottleneck machine and to schedule jobs in an efficient way.
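For readers unfamiliar with priority dispatching rules, the sketch below shows a single rule (shortest processing time) driving a greedy single-machine schedule; in the proposed method several such rules are hybridised and weighted by ant colony pheromone values, which is not reproduced here:

```python
def spt_schedule(jobs):
    """Greedy schedule by the Shortest Processing Time dispatching rule.

    jobs: dict mapping job name -> processing time on a single machine.
    Returns the job order and the total completion time (flow time).
    Illustrative sketch only; the paper hybridises several rules via ACO.
    """
    order = sorted(jobs, key=jobs.get)         # SPT priority
    t, total_completion = 0, 0
    for job in order:
        t += jobs[job]                         # job finishes at time t
        total_completion += t
    return order, total_completion

order, total = spt_schedule({"J1": 7, "J2": 3, "J3": 5, "J4": 2})
print(order, total)   # ['J4', 'J2', 'J3', 'J1'], 2 + 5 + 10 + 17 = 34
```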

Keywords: Job shop scheduling, Priority dispatching rules, Makespan, Hybrid heuristic algorithm.

464 Multi-models Approach for Describing and Verifying Constraints Based Interactive Systems

Authors: Mamoun Sqali, Mohamed Wassim Trojet

Abstract:

Requirements analysis, modeling, and simulation have consistently been among the main challenges during the development of complex systems. Scenarios and state machines are two successful models for describing the behavior of an interactive system. Scenarios represent examples of system execution in the form of sequences of messages exchanged between objects and give a partial view of the system. In contrast, state machines can represent the overall system behavior. Automating the synthesis of scenarios into state machines provides answers to various problems such as system behavior validation and scenario consistency checking. In this paper, we propose a method for translating scenarios into state machines represented with the Discrete EVent system Specification (DEVS) formalism, together with a procedure to detect implied scenarios. Each induced DEVS model represents the behavior of one object of the system. The global system behavior is described by coupling the atomic DEVS models and is validated through simulation. We improve the validation process by integrating formal methods to eliminate logical inconsistencies in the global model. To that end, we use the Z notation.

Keywords: Scenarios, DEVS, synthesis, validation and verification, simulation, formal verification, z notation.

463 Novel Security Strategy for Real Time Digital Videos

Authors: Prakash Devale, R. S. Prasad, Amol Dhumane, Pritesh Patil

Abstract:

Nowadays, video data embedding is a very challenging and interesting approach to keeping real-time video data secure, and the technique can be implemented and used in high-level applications. The rate-distortion behaviour of an image is not fixed, because the gain provided by accurate image frame segmentation is balanced by the inefficiency of coding objects of arbitrary shape, with many factors, such as losses, that depend on both the coding scheme and the object structure. By using a rate controller in association with the encoder, the target bitrate can be adjusted dynamically. This paper discusses how to keep videos secure by mixing signature data into the original video with negligible distortion, while keeping the steganographic video as close as possible to the quality of the original video. We propose a method for embedding the signature data into separate video frames using the block Discrete Cosine Transform (DCT). These frames are then encoded using real-time H.264 encoding concepts. Recovery of the original video and the signature data at the receiver end is also proposed.
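A minimal sketch of hiding one signature bit in a mid-frequency coefficient of an 8x8 block DCT (illustrative only; the actual coefficient choice, embedding strength, quantisation and H.264 integration used in the paper are not reproduced):

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, coeff=(4, 3), strength=20.0):
    """Embed one signature bit in an 8x8 luminance block via the block DCT.

    The chosen mid-frequency coefficient is forced positive for a 1 bit and
    negative for a 0 bit. Coefficient index and strength are illustrative.
    """
    c = dctn(block.astype(float), norm="ortho")
    c[coeff] = strength if bit else -strength
    return idctn(c, norm="ortho")

def extract_bit(block, coeff=(4, 3)):
    return int(dctn(block.astype(float), norm="ortho")[coeff] > 0)

block = np.random.randint(0, 256, (8, 8))   # one 8x8 pixel block
marked = embed_bit(block, 1)
print(extract_bit(marked))                  # -> 1, with small pixel distortion
```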

Keywords: Data Hiding, Digital Watermarking, video coding H.264, Rate Control, Block DCT.

462 Advanced Materials Based on Ethylene-Propylene-Diene Terpolymers and Organically Modified Montmorillonite

Authors: M. D. Stelescu, E. Manaila, G. Pelin, M. Georgescu, M. Sonmez

Abstract:

This paper presents studies on the development and characterization of nanocomposites based on ethylene-propylene-diene terpolymer rubber (EPDM), chlorobutyl rubber (IIR-Cl) and organically modified montmorillonite (OMMT). Mixtures containing 0, 3 and 6 phr (parts per 100 parts of rubber) OMMT were prepared. They were obtained by melt intercalation in an internal mixer (Plasti-Corder Brabender) using suitable blending parameters, at high temperature for 11 minutes. Curing agents were incorporated on a laboratory roller at 70-100 ºC, friction ratio 1:1.1, processing time 5 minutes. Rubber specimens were obtained by compression, using a hydraulic press at 165 ºC and a pressing force of 300 kN. The curing time, determined using a Monsanto rheometer, decreases with increasing amount of OMMT in the mixtures. At the same time, it was noticed that mixtures containing OMMT show improvements in physical-mechanical properties. These types of nanocomposites may be used to obtain rubber seals for space applications or for other areas of application.

Keywords: Chlorobutyl rubber, ethylene-propylene-diene terpolymers, montmorillonite, rubber seals, space application.

461 Optimization of Petroleum Refinery Configuration Design with Logic Propositions

Authors: Cheng Seong Khor, Xiao Qi Yeoh

Abstract:

This work concerns the topological optimization problem for determining the optimal petroleum refinery configuration. We are interested in further investigating and hopefully advancing the existing optimization approaches and strategies employing logic propositions to conceptual process synthesis problems. In particular, we seek to contribute to this increasingly exciting area of chemical process modeling by addressing the following potentially important issues: (a) how the formulation of design specifications in a mixed-logical-and-integer optimization model can be employed in a synthesis problem to enrich the problem representation by incorporating past design experience, engineering knowledge, and heuristics; and (b) how structural specifications on the interconnectivity relationships by space (states) and by function (tasks) in a superstructure should be properly formulated within a mixed-integer linear programming (MILP) model. The proposed modeling technique is illustrated on a case study involving the alternative processing routes of naphtha, in which significant improvement in the solution quality is obtained.
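As a generic illustration of issue (a) above, and not an example taken from the paper, consider a design heuristic stated as the proposition "if processing unit 1 is selected, then unit 2 must also be selected", i.e. Y1 implies Y2. This is logically equivalent to (not Y1) or Y2, and is enforced in the MILP by the linear constraint (1 - y1) + y2 >= 1, which simplifies to y1 <= y2 on the binary selection variables y1, y2 in {0, 1}. Richer propositions built from and/or/not combinations are converted into linear constraints in the same systematic way.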

Keywords: Mixed-integer linear programming (MILP), petroleum refinery, process synthesis, superstructure.

460 Cooperative Sensing for Wireless Sensor Networks

Authors: Julien Romieux, Fabio Verdicchio

Abstract:

Wireless Sensor Networks (WSNs), which sense environmental data with battery-powered nodes, require multi-hop communication. This power-demanding task adds an extra workload that is unfairly distributed across the network. As a result, nodes run out of battery at different times: this requires an impractical individual node maintenance scheme. Therefore we investigate a new Cooperative Sensing approach that extends the WSN operational life and allows a more practical network maintenance scheme (where all nodes deplete their batteries almost at the same time). We propose a novel cooperative algorithm that derives a piecewise representation of the sensed signal while controlling approximation accuracy. Simulations show that our algorithm increases WSN operational life and spreads communication workload evenly. Results convey a counterintuitive conclusion: distributing workload fairly amongst nodes may not decrease the network power consumption and yet extend the WSN operational life. This is achieved as our cooperative approach decreases the workload of the most burdened cluster in the network.
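A rough sketch of the piecewise idea (a simple piecewise-constant variant with a hard error bound; the authors' cooperative, in-network version and its accuracy control are not reproduced here):

```python
def piecewise_constant(samples, eps):
    """Greedy piecewise-constant approximation with bounded error.

    A new segment is started whenever no single value could still represent
    every sample of the current segment within the tolerance eps. Returns a
    list of (start_index, value) pairs, which is what a node would transmit
    instead of every raw sample. Illustrative sketch only.
    """
    segments = []
    seg_start, seg_min, seg_max = 0, samples[0], samples[0]
    for i, s in enumerate(samples[1:], start=1):
        lo, hi = min(seg_min, s), max(seg_max, s)
        if hi - lo > 2 * eps:                       # midpoint can no longer cover all
            segments.append((seg_start, (seg_min + seg_max) / 2))
            seg_start, seg_min, seg_max = i, s, s
        else:
            seg_min, seg_max = lo, hi
    segments.append((seg_start, (seg_min + seg_max) / 2))
    return segments

readings = [20.1, 20.2, 20.0, 23.5, 23.6, 23.4, 23.7, 19.9]
print(piecewise_constant(readings, eps=0.5))   # [(0, 20.1), (3, 23.55), (7, 19.9)]
```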

Keywords: Cooperative signal processing, power management, signal representation, signal approximation, wireless sensor networks.

459 Computational Assistance of the Research, Using Dynamic Vector Logistics of Processes for Critical Infrastructure Subjects Continuity

Authors: J. Urbánek Jiří, Krahulec Josef, Johanidesová Jitka, F. Urbánek Jiří

Abstract:

This paper deals with the use of the widely available MS Office environment (SmartArt, etc.) for building mathematical models with the DYVELOP (Dynamic Vector Logistics of Processes) method. The method serves for the investigation and modelling of crisis situations within critical infrastructure organizations. In the first part of the paper, the entities, operators, and actors of the DYVELOP method are introduced. The method uses just three operators of Boolean algebra and four types of entities: the Environments, the Process Systems, the Cases, and the Controlling. The Process Systems (PrS) have five "brothers": Management PrS, Transformation PrS, Logistic PrS, Event PrS and Operation PrS. The Cases have three "sisters": Process Cell Case, Use Case, and Activity Case. They all need special Ctrl actors for the controlling of their functions, except the Environment, which can do without Ctrl. The model maps are named Blazons, and they can express mathematically and graphically the relationships among entities, actors and processes. In the second part of this paper, the rich blazons of the DYVELOP method are used for discovering and modelling cycling cases and their phases. The blazons are best followed in a live PowerPoint presentation for better comprehension of this paper's mission. The crisis management of an energy critical infrastructure organization must use such cycles to cope successfully with crisis situations. Repeated cycling through these cases is a necessary condition for handling emergency events and mitigating the organization's damages. An uninterrupted and continuous cycling process brings fruitfulness to crisis management, and it is a good indicator and controlling actor of organizational continuity and of advanced possibilities for its sustainable development. Reliable rules are derived for the safe and reliable continuity of an energy critical infrastructure organization in a crisis situation.

Keywords: Blazons, computational assistance, DYVELOP method, critical infrastructure.

458 The Reason of Principles of Construction Engineering and Management Being Necessary for Contracting Firms and Their Project Managers

Authors: Mamoon Mousa Atout

Abstract:

The construction industry is growing continuously, not only in the Middle East region but almost all over the world. Over the last fifteen years, a large expansion and an increase in different types of projects have been observed. Many infrastructure projects have been developed: high-rise buildings, big shopping malls, power sub-stations, roads, bridges, schools, universities, and new cities with full and complete facilities. The growth of these projects has been accomplished through many international and local contracting organizations. The senior management of these organizations depends on qualified and experienced teams who are aware of the implications of project management, construction management, engineering management and resource management from tendering to final completion of the project. This research aims to find out why the principles of construction engineering and management are necessary for contracting firms and their project managers. Principles of construction management help contracting organizations to accomplish and deliver projects without delay. This can be maintained by establishing detailed guidelines for updating the construction management system they have adopted, through qualified and experienced project managers. The research focuses on the benefits of other essential skills in project planning, monitoring and control. Defining the roles and responsibilities of the contractor's project managers during tendering and execution is part of the investigated factors that will be analyzed. Other skills, such as optimizing and utilizing the available project resources to deliver the project within time, cost and quality, will also be investigated to find out how these factors affect the performance of contracting firms, project managers and projects. The conclusions of the research will help senior management teams and contractors' project managers to understand the benefits of implementing a construction management system, its effect on performance and on knowledge of contract values, and the optimal profit margin of the firm.

Keywords: Construction management, contracting firms, project managers, planning processes, roles and responsibilities.

457 A Structural Support Vector Machine Approach for Biometric Recognition

Authors: Vishal Awasthi, Atul Kumar Agnihotri

Abstract:

The face is a strong, non-intrusive biometric for distinguishing genuine faces from dummy faces produced by different artificial means. Face recognition is extremely important in the contexts of computer vision, psychology, surveillance, pattern recognition, neural networks, and content-based video processing. The availability of a widespread face database is crucial to test the performance of face recognition algorithms. The openly available face databases include face images with a wide range of poses, illumination, gestures and face occlusions, but there is no dummy face database accessible in the public domain. This paper presents a face detection algorithm based on image segmentation in terms of distance from a fixed point and on template matching methods. The proposed work uses the most appropriate number of nodal points, resulting in the most appropriate outcomes in terms of face recognition and detection. The time taken to identify and extract distinctive facial features is improved to the range of 90 to 110 s, with an efficiency increase of 3%.

Keywords: Face recognition, Principal Component Analysis, PCA, Linear Discriminant Analysis, LDA, Improved Support Vector Machine, iSVM, elastic bunch mapping technique.

456 Field Programmable Gate Array Based Infinite Impulse Response Filter Using Multipliers

Authors: Rajesh Mehra, Bharti Thakur

Abstract:

In this paper, an Infinite Impulse Response (IIR) filter has been designed and simulated on a Field Programmable Gate Array (FPGA). The implementation is based on the Multiply and Accumulate (MAC) algorithm, which uses multiply operations for the design implementation. A parallel pipelined structure is used to implement the proposed IIR filter, taking optimal advantage of the look-up tables of the target device. The designed filter has been synthesized on a DSP-slice based FPGA, whose DSP slices perform the multiplier function of the MAC unit and help to enhance the speed performance. The proposed design is simulated with Matlab, synthesized with the Xilinx Synthesis Tool, and implemented on FPGA devices. The Virtex 5 FPGA based design can operate at an estimated frequency of 81.5 MHz, as compared to 40.5 MHz for the Spartan 3 ADSP based design. The Virtex 5 based implementation also consumes fewer slices and slice flip-flops of the target FPGA than the Spartan 3 ADSP based implementation, providing a cost-effective solution for signal processing applications.
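The MAC-based structure amounts to evaluating the IIR difference equation with repeated multiply-accumulate operations; a plain software sketch is given below (the coefficients are arbitrary placeholders, not the Butterworth design synthesised in the paper):

```python
def iir_filter(x, b, a):
    """Direct-form IIR filter: y[n] = sum_k b[k] x[n-k] - sum_k a[k] y[n-k].

    Each term of the two sums is one multiply-accumulate (MAC) operation,
    which is what the FPGA DSP slices execute in hardware. a[0] is assumed 1.
    """
    y = [0.0] * len(x)
    for n in range(len(x)):
        acc = 0.0
        for k in range(len(b)):                 # feed-forward MACs
            if n - k >= 0:
                acc += b[k] * x[n - k]
        for k in range(1, len(a)):              # feedback MACs
            if n - k >= 0:
                acc -= a[k] * y[n - k]
        y[n] = acc
    return y

# placeholder first-order low-pass coefficients (not the paper's design)
print(iir_filter([1.0, 0.0, 0.0, 0.0], b=[0.2], a=[1.0, -0.8]))
# impulse response: 0.2, 0.16, 0.128, 0.1024
```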

Keywords: Butterworth, DSP, IIR, MAC, FPGA.

455 The Guideline of Overall Competitive Advantage Promotion with Key Success Paths

Authors: M. F. Wu, F. T. Cheng, C. S. Wu, M. C. Tan

Abstract:

In the global precision machinery market, it is a critical time to upgrade technology and increase added value by developing manufacturing skills and management strategies that highly satisfy customer needs. In recent years, precision machinery manufacturers in every country have been facing price-reduction pressure from the demand side, which pushes high-end precision machinery manufacturers to adopt a low-cost, high-quality strategy to recapture the market. Because of this global market trend, manufacturers must adopt price-reduction strategies and upgrade the technology of low-end machinery for differentiation in order to consolidate the market. Using six key success factors (KSFs), namely customer perceived value, customer satisfaction, customer service, product design, product effectiveness and machine structure quality, as causal conditions, this research explores their impact on the competitive advantage of the enterprise, such as overall profitability and product pricing power. The research uses the key success paths (KSPs) approach and fs/QCA software to explore various combinations of causal relationships, so as to fully understand the performance levels of the KSFs and the business objectives needed to achieve competitive advantage. In this study, the combinations of causal relationships are called key success paths (KSPs). The key success paths guide the enterprise to achieve specific business outcomes. The findings of this study indicate that there are thirteen KSPs that achieve overall profitability, sixteen KSPs that achieve product pricing power, and seventeen KSPs that achieve both overall profitability and pricing power for the enterprise. The KSPs provide directions for resource integration and allocation and improve the utilization efficiency of limited resources to realize the continuing vision of the enterprise.

Keywords: Precision Machinery Industry, Key Success Factors (KSFs), Key Success Paths (KSPs), Overall Profitability, Product Pricing Power, Competitive Advantages.

454 A Review on Comparative Analysis of Path Planning and Collision Avoidance Algorithms

Authors: Divya Agarwal, Pushpendra S. Bharti

Abstract:

Autonomous mobile robots (AMRs) are expected to serve as smart tools for operations in every automation industry. Path planning and obstacle avoidance are the backbone of an AMR, as the robot has to reach its goal location while avoiding obstacles and traversing an optimized path defined according to criteria such as distance, time or energy. Path planning can be classified into global and local path planning, where the environmental information is known and unknown/partially known, respectively. A number of sensors are used for data collection. Algorithms such as the artificial potential field (APF), rapidly exploring random trees (RRT), bidirectional RRT, fuzzy approaches, pure pursuit, the A* algorithm, the vector field histogram (VFH) and modified local path planning algorithms have been used over the last three decades for path planning and obstacle avoidance in AMRs. This paper attempts to review some of the path planning and obstacle avoidance algorithms used in the field of AMRs. The review includes a comparative analysis of simulations and mathematical computations of path planning and obstacle avoidance algorithms using MATLAB 2018a. From the review, it can be concluded that different algorithms may complete the same task (i.e. with a different set of instructions) in less or more time, space and effort.
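As an example of the first family of algorithms listed above, the sketch below computes one artificial potential field (APF) update for a point robot (gains, goal and obstacle positions are made-up values, not taken from the reviewed papers):

```python
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=10.0, d0=2.0, step=0.05):
    """One artificial potential field update for a point robot.

    The attractive force pulls toward the goal; repulsive forces push away
    from obstacles closer than the influence distance d0. Illustrative only.
    """
    force = k_att * (goal - pos)                       # attractive term
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-9 < d < d0:                              # inside influence region
            force += k_rep * (1.0 / d - 1.0 / d0) / d**3 * diff
    return pos + step * force / (np.linalg.norm(force) + 1e-9)

pos, goal = np.array([0.0, 0.0]), np.array([5.0, 5.0])
obstacles = [np.array([3.0, 2.0])]
for _ in range(300):
    pos = apf_step(pos, goal, obstacles)
print(np.round(pos, 2))   # ends close to the goal (5, 5) in this configuration
```

As several of the reviewed papers note, this greedy descent can get trapped in local minima, which is one motivation for sampling-based planners such as RRT.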

Keywords: Autonomous mobile robots, obstacle avoidance, path planning, and processing time.

453 Effect Comparison of Speckle Noise Reduction Filters on 2D-Echocardiographic Images

Authors: Faten A. Dawood, Rahmita W. Rahmat, Suhaini B. Kadiman, Lili N. Abdullah, Mohd D. Zamrin

Abstract:

Echocardiography is one of the most common diagnostic tests and is widely used for assessing abnormalities of regional heart ventricle function. The main goal of the image enhancement task in 2D echocardiography (2DE) is to address two major problems: speckle noise and low image quality. Speckle noise reduction is therefore one of the important pre-processing steps used to reduce distortion effects in 2DE image segmentation. In this paper, we present the common filters based on low-pass spatial smoothing, such as the mean, Gaussian, and median filters. The Laplacian filter was used as a high-pass sharpening filter. A comparative analysis is presented to test the effectiveness of these filters after they were applied to original 2DE images of 4-chamber and 2-chamber views. Three statistical quality measures, root mean square error (RMSE), peak signal-to-noise ratio (PSNR) and signal-to-noise ratio (SNR), are used to evaluate the filter performance quantitatively on the output enhanced image.
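A quick sketch of the three quality measures used for the quantitative comparison (assuming 8-bit images, so the PSNR peak value is 255; the images below are synthetic placeholders):

```python
import numpy as np

def quality_metrics(original, filtered, peak=255.0):
    """RMSE, PSNR and SNR between an original and a filtered 2DE image."""
    original = original.astype(float)
    filtered = filtered.astype(float)
    mse = np.mean((original - filtered) ** 2)
    rmse = np.sqrt(mse)
    psnr = 10 * np.log10(peak**2 / mse)                  # dB
    snr = 10 * np.log10(np.mean(original**2) / mse)      # dB
    return rmse, psnr, snr

original = np.random.randint(0, 256, (64, 64))                      # toy image
filtered = np.clip(original + np.random.randn(64, 64) * 5, 0, 255)  # toy output
print([round(v, 2) for v in quality_metrics(original, filtered)])
```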

Keywords: Gaussian operator, median filter, speckle texture, peak signal-to-noise ratio.

452 Stability Analysis and Controller Design of Further Development of MIMOS II for Space Applications with Focus on the Extended Lyapunov Method: Part I

Authors: Mohammad Beyki, Justus Pawlak, Robert Patzke, Franz Renz

Abstract:

In the context of planetary exploration, the MIMOS II (miniaturized Mössbauer spectrometer) serves as a proven and reliable measuring instrument. The transmission behaviour of the electronics in Mössbauer spectroscopy is further developed and optimized. For this purpose, the overall electronics is split into three parts. This work deals exclusively with the first part of the signal chain for the evaluation of photons in experiments with gamma radiation. In parallel to the analysis of the electronics, an additional method for analysing the stability of linear and non-linear systems is presented: the extended method of Lyapunov's stability criteria. The design helps to weigh advantages and disadvantages against other simulated circuits in order to optimize the MIMOS II for terrestrial and extraterrestrial measurement. Finally, after the stability analysis, the controller design according to Ackermann is performed, achieving the best possible optimization of the output variable through a skillful pole assignment.

Keywords: Controller design for MIMOS II, stability analysis, Mössbauer spectroscopy, electronic signal amplifier, light processing technology, photocurrent, transimpedance amplifier, extended Lyapunov method.

451 PAPR Reduction Method for OFDM Signal by Using Dummy Sub-carriers

Authors: Pisit Boonsrimuang, Arjin Numsomran, Tawil Paungma, Hideo Kobayashi

Abstract:

One of the disadvantages of using OFDM is the large peak-to-average power ratio (PAPR) of its time-domain signal. A signal with a large PAPR would cause severe degradation of the bit error rate (BER) performance due to inter-modulation noise in a non-linear channel. This paper proposes an improved DSI (Dummy Sequence Insertion) method, which can achieve better PAPR and BER performances. The feature of the proposed method is to optimize the phase of each dummy sub-carrier so as to reduce the PAPR by changing all predetermined phase coefficients in the time-domain signal, which is calculated for the data sub-carriers and the dummy sub-carriers separately. To achieve better PAPR performance, this paper also proposes to employ a time-frequency domain swapping algorithm for fine adjustment of the phase coefficients of the dummy sub-carriers, which achieves lower processing complexity and better PAPR and BER performances than the conventional DSI method. The paper presents various computer simulation results to verify the effectiveness of the proposed method compared with conventional methods in a non-linear channel.
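A small sketch of how the PAPR of an OFDM time-domain symbol is measured (random QPSK data with assumed parameters; the dummy sub-carriers and phase optimisation of the proposed method are not included):

```python
import numpy as np

def papr_db(freq_symbols, oversample=4):
    """PAPR of one OFDM symbol: peak power over average power, in dB."""
    n = len(freq_symbols)
    padded = np.concatenate([freq_symbols, np.zeros((oversample - 1) * n)])
    x = np.fft.ifft(padded)                  # oversampled time-domain signal
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
qpsk = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
print(round(papr_db(qpsk), 2))   # well above the PAPR of a single-carrier signal
```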

Keywords: OFDM, PAPR, dummy sub-carriers, non-linear

450 Speech Recognition Using Scaly Neural Networks

Authors: Akram M. Othman, May H. Riadh

Abstract:

This research work is aimed at speech recognition using scaly neural networks. A small vocabulary of 11 words was established first; these words are "word, file, open, print, exit, edit, cut, copy, paste, doc1, doc2". The chosen words are associated with executing computer functions such as opening a file, printing a text document, cutting, copying, pasting, editing and exiting. The speech was introduced to the computer and then subjected to a feature extraction process using LPC (linear prediction coefficients). These features are used as the input to an artificial neural network in speaker-dependent mode. Half of the words are used for training the artificial neural network and the other half are used for testing the system and for information retrieval. The system consists of three parts: speech processing and feature extraction, training and testing using neural networks, and information retrieval. The retrieval process proved to be 79.5-88% successful, which is quite acceptable considering variations in the surroundings, the state of the speaker, and the microphone type.
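A condensed sketch of the LPC feature-extraction stage (autocorrelation method solved as a plain linear system; the frame length, model order and toy signal are assumptions, not the authors' settings):

```python
import numpy as np

def lpc_coefficients(frame, order=10):
    """Linear prediction coefficients of one speech frame (autocorrelation method).

    Solves the Yule-Walker system R a = r for the predictor a, where R is the
    Toeplitz autocorrelation matrix. Frame length and order are illustrative.
    """
    frame = frame * np.hamming(len(frame))
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

# one 30 ms frame (240 samples at 8 kHz) of a toy voiced-like signal
rng = np.random.default_rng(1)
t = np.arange(240) / 8000.0
frame = np.sin(2 * np.pi * 300 * t) + 0.3 * np.sin(2 * np.pi * 1200 * t)
frame += 0.01 * rng.standard_normal(240)
features = lpc_coefficients(frame)
print(features.round(3))          # 10 LPC features per frame fed to the ANN
```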

Keywords: Feature extraction, Linear prediction coefficients, neural network, Speech Recognition, Scaly ANN.

449 A Real-Time Specific Weed Recognition System Using Statistical Methods

Authors: Imran Ahmed, Muhammad Islam, Syed Inayat Ali Shah, Awais Adnan

Abstract:

The identification and classification of weeds are of major technical and economic importance in the agricultural industry. To automate these activities, a weed control system based on features such as shape, color and texture is feasible. The goal of this paper is to build a real-time, machine-vision weed control system that can detect weed locations. In order to accomplish this objective, a real-time robotic system is developed to identify and locate outdoor plants using machine vision technology and pattern recognition. The algorithm is developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed algorithm has been tested on weeds at various locations, and the tests have shown the algorithm to be very effective in weed identification. Furthermore, the results show very reliable performance on weeds under varying field conditions. The analysis of the results shows over 90 percent classification accuracy over 140 sample images (broad and narrow) with 70 samples from each category of weeds.

Keywords: Weed detection, Image Processing, real-time recognition, Standard Deviation.

448 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables, and past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when applied to large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

447 Hydrologic Balance and Surface Water Resources of the Cheliff-Zahrez Basin

Authors: Mehaiguene Madjid, Touhari Fadhila, Meddi Mohamed

Abstract:

The Cheliff basin is a good hydrological example for study, because several of its aspects and hydraulic installations remain insufficiently understood. Our study of the Cheliff basin is therefore divided into two principal parts. The first is the spatial evaluation of precipitation: understanding how the water resource is replenished presupposes a good knowledge of the structure of the precipitation fields over the studied area. With the goal of a good knowledge of the water resources and their integrated management, it was judged necessary to establish a precipitation map of the Cheliff basin, both for a good understanding of the evolution of the water resource in the basin and to serve as a basis for any hydraulic planning study in the basin. The precipitation map of the Cheliff basin thus answers a direct need of researchers working on the region for a reference document that will be completed and updated. The second part is the hydrological study: based on statistical processing of the hydrometric data, it allows us to specify the terms of the hydrological balance and to clarify the fundamental aspects of the annual, seasonal and extreme flows, of their variability, and of the surface water resources.

Keywords: Hydrological assessment, surface water resources, Cheliff, Algeria.

446 MHD Non-Newtonian Nanofluid Flow over a Permeable Stretching Sheet with Heat Generation and Velocity Slip

Authors: Rama Bhargava, Mania Goyal

Abstract:

The problem of magnetohydrodynamic boundary layer flow and heat transfer over a permeable stretching surface in a second-grade nanofluid under the effects of heat generation and partial slip is studied theoretically. The Brownian motion and thermophoresis effects are also considered. The boundary layer equations, governed by PDEs, are transformed into a set of ODEs with the help of local similarity transformations. The differential equations are solved by a variational finite element method. The effects of the different controlling parameters on the flow field and heat transfer characteristics are examined. Numerical results for the dimensionless velocity, temperature and nanoparticle volume fraction, as well as the reduced Nusselt and Sherwood numbers, are presented graphically, and the comparison confirmed excellent agreement. The present study is of great interest for coatings and suspensions, cooling of metallic plates, oils and grease, paper production, coal-water or coal-oil slurries, heat exchanger technology, and materials processing.

Keywords: Viscoelastic nanofluid, partial slip, stretching sheet, heat generation/absorption, MHD flow, FEM.

445 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers

Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice

Abstract:

In recent years, the telecommunications sector has been changing and developing continuously in the global market. In this sector, churn analysis techniques are commonly used to analyse why some customers terminate their service subscriptions prematurely. Customer churn is of utmost significance in this sector since it causes important business losses, and many companies carry out various studies in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed with the aim of obtaining the feature reducts needed to predict customer churn. The framework is a cost-based optional pre-processing stage that removes redundant features for churn management. This cost-based feature selection algorithm is applied in a telecommunication company in Turkey, and the results obtained with the algorithm are presented.

Keywords: Churn prediction, data mining, decision-theoretic rough set, feature selection.

444 Alignment of Emission Gamma Ray Sources with NaI(Tl) Scintillation Detectors by Two Laser Beams to Pre-Operation using Alternating Minimization Technique

Authors: Abbas Ali Mahmood Karwi

Abstract:

Accurate timing alignment and stability are important to maximize the true counts and minimize the random counts in positron emission tomography. Signals output from the detectors must therefore be centred with respect to the two isotopes before operation and fed into four pulse-processing units, each of which can accept up to eight inputs. The dual-source computed tomography setup consists of two units on the left for the 15 detector signals of the Cs-137 isotope and two units on the right for the 15 detector signals of the Co-60 isotope. The gamma spectrum consists of either single or multiple photopeaks. This allows the use of energy-discrimination electronic hardware associated with the data acquisition system to acquire photon count data at a specific energy, even if detectors with poor energy resolution are used. It also helps to avoid counting Compton scatter events, especially if a single discrete gamma photopeak is emitted by the source, as in the case of Cs-137. In this study, the polyenergetic version of the alternating minimization algorithm is applied to the dual-energy gamma computed tomography problem.

Keywords: Alignment, Spectrum, Laser, Detectors, Image

443 Influence of Ti, B, and Sr on Microstructure, Mechanical and Tribological Properties of as Cast, Cast Aged, and Forge Aged A356 Alloy – A Comparative Study

Authors: R. V. Kurahatti, D. G. Mallapur, K. Rajendra Udupa

Abstract:

In the present work, a comparative study of the microstructure and mechanical properties of as-cast, cast-aged and forge-aged A356 alloy has been carried out. The study reveals that the mechanical properties of A356 alloy are highly influenced by melt treatment and solid-state processing. Cast-aged alloys achieve the highest strength and hardness compared to as-cast and forge-aged ones, and those treated with the combined addition of grain refiners and modifiers achieve the maximum strength and hardness. Cast-aged A356 alloy also possesses higher wear resistance compared to as-cast and forge-aged alloys. Forging improves both the strength and the ductility of the alloys over as-cast ones; however, the improvement in ductility is perceptible only for properly grain-refined and modified alloys. Alloys refined with 0.65% Al-3Ti show the highest improvement in ductility, while those treated with 0.20% Al-10Sr exhibit less improvement in ductility.

Keywords: Forged A356 alloy, Grain refinement, Modification, Wear
