Search results for: vegetative filter strip modeling system
20484 Modelling the Photovoltaic Pump Output Using Empirical Data from Local Conditions in the Vhembe District
Authors: C. Matasane, C. Dwarika, R. Naidoo
Abstract:
The mathematical analysis of the radiation received and the development of solar photovoltaic (PV) array groundwater pumping are needed in the rural areas of Thohoyandou, Limpopo Province, for sizing and power-performance assessment under the climate conditions of the area. A simple methodology is developed for the direct-coupled solar array, controller and submersible groundwater pump system. The system consists of a PV array, pump controller, submerged pump, battery backup and charge controller. For this reason, the theoretical solar radiation is obtained for optimal prediction of system performance under different design and operating parameters. The PV module is examined in a direct current (DC) configuration to obtain the maximum solar power available for water pumping. In this paper, a simple and efficient photovoltaic water pumping system is presented, together with its theoretical study and the mathematical modeling of the PV system.
Keywords: renewable energy sources, solar groundwater pumping, theoretical and mathematical analysis of photovoltaic (PV) system, theoretical solar radiation
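As a rough, hedged illustration of the kind of sizing calculation such a model supports, the sketch below balances the PV array output against the hydraulic power demand of a submersible pump; all values, function names and efficiencies are illustrative assumptions, not data from the paper.

```python
# Minimal sizing sketch for a directly coupled PV water pump.
# All numbers below are illustrative assumptions, not data from the paper.
RHO_G = 1000 * 9.81  # water density [kg/m^3] * gravity [m/s^2]

def pv_array_power(irradiance_w_m2, area_m2, pv_eff):
    """Electrical power delivered by the PV array [W]."""
    return irradiance_w_m2 * area_m2 * pv_eff

def required_pump_power(flow_m3_per_day, head_m, pump_eff, sun_hours=6.0):
    """Electrical power the pump needs to lift the daily volume [W]."""
    flow_m3_s = flow_m3_per_day / (sun_hours * 3600.0)
    hydraulic_power = RHO_G * flow_m3_s * head_m
    return hydraulic_power / pump_eff

p_pv = pv_array_power(irradiance_w_m2=850, area_m2=4.0, pv_eff=0.15)
p_req = required_pump_power(flow_m3_per_day=20.0, head_m=40.0, pump_eff=0.45)
print(f"PV output ~{p_pv:.0f} W, pump demand ~{p_req:.0f} W -> "
      f"{'sufficient' if p_pv >= p_req else 'array undersized'}")
```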
Procedia PDF Downloads 379
20483 Review of the Model-Based Supply Chain Management Research in the Construction Industry
Authors: Aspasia Koutsokosta, Stefanos Katsavounis
Abstract:
This paper reviews the model-based qualitative and quantitative Operations Management research in the context of Construction Supply Chain Management (CSCM). The construction industry has traditionally been blamed for low productivity, cost and time overruns, waste, high fragmentation and adversarial relationships. It has been slower than other industries to employ the Supply Chain Management (SCM) concept and to develop models that support decision-making and planning. Over the last decade, however, there has been a distinct shift from a project-based to a supply-based approach to construction management. CSCM has emerged as a promising management tool for construction operations that improves the performance of construction projects in terms of cost, time and quality. Modeling the Construction Supply Chain (CSC) offers the means to reap the benefits of SCM, make informed decisions and gain competitive advantage. Different modeling approaches and methodologies have been applied in the multi-disciplinary and heterogeneous research field of CSCM. The literature review reveals that a considerable percentage of CSC modeling consists of conceptual or process models which discuss general management frameworks and do not relate to acknowledged soft OR methods. We particularly focus on the model-based quantitative research and categorize the CSCM models depending on their scope, mathematical formulation, structure, objectives, solution approach, software used and decision level. Although over the last few years there has clearly been an increase in research papers on quantitative CSC models, we find that the relevant literature is very fragmented, with limited applications of simulation, mathematical programming and simulation-based optimization. Most applications are project-specific or study only parts of the supply system. Thus, some complex interdependencies within construction are neglected and the implementation of integrated supply chain management is hindered. We conclude this paper by giving future research directions and emphasizing the need to develop robust mathematical optimization models for the CSC. We stress that CSC modeling needs a multi-dimensional, system-wide and long-term perspective. Finally, prior applications of SCM to other industries have to be taken into account in order to model CSCs, but not without the consequent reform of generic concepts to match the unique characteristics of the construction industry.
Keywords: construction supply chain management, modeling, operations research, optimization, simulation
Procedia PDF Downloads 506
20482 Using Morlet Wavelet Filter to Denoise the Geoelectric Map of Moroccan Phosphate Deposit 'Disturbances'
Authors: Saad Bakkali
Abstract:
Morocco is a major producer of phosphate, with an annual output of 19 million tons and reserves in excess of 35 billion cubic meters. This represents more than 75% of world reserves. Resistivity surveys have been successfully used in the Oulad Abdoun phosphate basin. A Schlumberger resistivity survey over an area of 50 hectares was carried out. A new field procedure based on the analytic signal response of resistivity data was tested to deal with the presence of phosphate deposit disturbances. A resistivity map was expected to allow the electrical resistivity signal to be imaged in 2D. The 2D wavelet is a standard tool in the interpretation of geophysical potential field data. The wavelet transform is particularly suitable for denoising, filtering and analyzing singularities in geophysical data. Wavelet transform tools are applied to the analysis of Moroccan phosphate deposit 'disturbances'. The wavelet approach applied to modeling surface phosphate 'disturbances' was found to be consistently useful.
Keywords: resistivity, Schlumberger, phosphate, wavelet, Morocco
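For readers unfamiliar with the tool, the following minimal sketch shows a continuous Morlet wavelet analysis of a single synthetic resistivity profile with PyWavelets; the data, scales and threshold are assumptions for illustration only, not the survey data.

```python
# Continuous Morlet wavelet analysis of one synthetic resistivity profile.
import numpy as np
import pywt

x = np.linspace(0, 500, 512)                          # station positions [m]
profile = 120 + 30 * np.exp(-((x - 250) / 40) ** 2)   # smooth regional anomaly
profile += np.random.normal(0, 5, x.size)             # acquisition noise

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(profile, scales, 'morl')     # shape: (n_scales, n_samples)

# Small scales capture high-frequency, "disturbance"-like singularities,
# large scales capture the regional trend; flag positions with strong
# small-scale energy as candidate noise/disturbance locations.
noise_energy = np.abs(coeffs[:8]).mean(axis=0)
flagged = np.where(noise_energy > noise_energy.mean() + 2 * noise_energy.std())[0]
print("positions flagged as noisy:", flagged[:10], "...")
```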
Procedia PDF Downloads 424
20481 Developing Laser Spot Position Determination and PRF Code Detection with Quadrant Detector
Authors: Mohamed Fathy Heweage, Xiao Wen, Ayman Mokhtar, Ahmed Eldamarawy
Abstract:
In this paper, we are interested in the modeling, simulation and measurement of the laser spot position with a quadrant detector. We enhance the detection and tracking of a semi-active laser weapon decoding system based on a microcontroller. The system receives the reflected pulse through the quadrant detector and processes the laser pulses through a processing circuit, with a microcontroller decoding the laser pulses reflected by the target. The decoding system enhances the seeker accuracy, the laser detection time based on the number of received pulses is reduced, and a gate is used to limit the laser pulse width. The model is implemented based on the Pulse Repetition Frequency (PRF) technique with two microcontroller units (MCU). MCU1 generates laser pulses with different codes. MCU2 decodes the laser code and locks the system at the specific code. The codes are selected using the two selector switches. The system is implemented and tested in Proteus ISIS software. The full position-determination circuit with the detector is implemented. The general spot-position determination system was tested with the laser PRF as the incident radiation and with the mechanical system for adjusting the setup at different angles. The system test results show that the system can detect the laser code with only three received pulses based on the narrow gate signal, and good agreement between simulated and measured system performance is obtained.
Keywords: four quadrant detector, pulse code detection, laser guided weapons, pulse repetition frequency (PRF), ATmega32 microcontrollers
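The two signal-processing ideas in the abstract, spot-position estimation from the four quadrant photocurrents and PRF code locking after a small number of pulses, can be sketched as below; the quadrant labelling, gate width and timestamps are assumptions, and the common quadrant-sum formulas stand in for the authors' circuit.

```python
# Spot position from quadrant photocurrents and a simple PRF lock check.
def spot_position(a, b, c, d):
    """Normalized spot offset; quadrant labelling A..D is an assumed convention."""
    total = a + b + c + d
    x = ((a + d) - (b + c)) / total   # horizontal error signal
    y = ((a + b) - (c + d)) / total   # vertical error signal
    return x, y

def prf_lock(timestamps_us, expected_period_us, gate_us=20.0, pulses_needed=3):
    """Lock when enough consecutive pulse intervals fall inside the gate."""
    hits = 0
    for t0, t1 in zip(timestamps_us, timestamps_us[1:]):
        if abs((t1 - t0) - expected_period_us) <= gate_us:
            hits += 1
            if hits >= pulses_needed - 1:   # N pulses give N-1 intervals
                return True
        else:
            hits = 0
    return False

print(spot_position(1.2, 0.9, 0.8, 1.1))               # spot shifted toward the A/D side
print(prf_lock([0, 50000, 100010, 149995], 50000))     # locks after three pulses
```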
Procedia PDF Downloads 397
20480 An Enhanced SAR-Based Tsunami Detection System
Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah
Abstract:
Tsunami early detection and warning systems have proved to be of utmost importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any risk of a tsunami and of the degree of its danger in order to make the right decision and notify the public of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the tsunami warning system is simulated using the data acquisition toolbox of Matlab, with measurements acquired from specified internet pages owing to the lack of the required real-life seismic and hydrologic sensors, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region that are masked by speckle noise. This enables us to conduct a post-tsunami damage extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through a migration to IP-based networks and fiber optic links.
Keywords: detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, Wiener filter
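The keyword list mentions the Wiener filter; the hedged sketch below shows such a despeckling step and a crude percentage-damage estimate on synthetic data, with scipy standing in for the authors' filtering schemes and all thresholds being assumptions.

```python
# Speckle suppression on a synthetic SAR amplitude image plus a crude
# damage-extent estimate; thresholds and data are illustrative assumptions.
import numpy as np
from scipy.signal import wiener

sar = np.random.gamma(shape=4.0, scale=50.0, size=(256, 256))  # speckled scene
filtered = wiener(sar, mysize=5)                               # local adaptive Wiener filter

# Crude stand-in for the damage-extent study: fraction of pixels whose
# backscatter falls below an assumed threshold after filtering.
damaged = filtered < np.percentile(filtered, 20)
print(f"estimated affected area: {100 * damaged.mean():.1f} % of the scene")
```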
Procedia PDF Downloads 398
20479 Study and Conservation of Cultural and Natural Heritages with the Use of Laser Scanner and Processing System for 3D Modeling Spatial Data
Authors: Julia Desiree Velastegui Caceres, Luis Alejandro Velastegui Caceres, Oswaldo Padilla, Eduardo Kirby, Francisco Guerrero, Theofilos Toulkeridis
Abstract:
It is fundamental to conserve sites of natural and cultural heritage with any available technique or existing methodology of preservation in order to sustain them for the following generations. We propose a further approach to protect the actual appearance of such sites, in which high-technology instrumentation allows us to digitally preserve natural and cultural heritage, applied here in Ecuador. In this project, the use of laser technology is presented for three-dimensional models with high accuracy in a relatively short period of time. So far, there are no records in Ecuador of the use and processing of data obtained by this new technological trend. The importance of the project lies in the description of the methodology of the laser scanner system using the Faro Laser Scanner Focus 3D 120, the method for 3D modeling of geospatial data and the development of virtual environments in the areas of cultural and natural heritage. In order to inform users about this technological trend, in which three-dimensional models are generated, such tools have been developed so that the models can be displayed in all kinds of digital formats. The obtained 3D models demonstrate that this technology is extremely useful in these areas, while also indicating that each data campaign needs a slightly different individual procedure, from data capture and processing through to the chosen virtual environments.
Keywords: laser scanner system, 3D model, cultural heritage, natural heritage
Procedia PDF Downloads 312
20478 Finite Element Modeling of Integral Abutment Bridge for Lateral Displacement
Authors: M. Naji, A. R. Khalim, M. Naji
Abstract:
Integral Abutment Bridges (IAB) are defined as simple or multiple span bridges in which the bridge deck is cast monolithically with the abutment walls. This kind of bridge is becoming very popular due to different aspects such as good response under seismic loading, low initial costs, elimination of bearings and less maintenance. However, the main issue in the analysis of this type of structure is dealing with the soil-structure interaction of the abutment walls and the supporting piles. A two-dimensional, non-linear finite element (FE) model of an integral abutment bridge has been developed to study the effect of lateral time-history displacement loading on the soil system.
Keywords: integral abutment bridge, soil structure interaction, finite element modeling, soil-pile interaction
Procedia PDF Downloads 293
20477 Bipolar Impulse Noise Removal and Edge Preservation in Color Images and Video Using Improved Kuwahara Filter
Authors: Reji Thankachan, Varsha PS
Abstract:
Both image capturing devices and human visual systems are nonlinear. Hence, nonlinear filtering methods outperform their linear counterparts in many applications. Linear methods are unable to remove impulsive noise from images while preserving their edges and fine details. In addition, linear algorithms are unable to remove signal-dependent or multiplicative noise from images. This paper presents an approach to denoise and smoothen images and videos corrupted by bipolar impulse noise using an improved Kuwahara filter. It involves a two-stage algorithm consisting of noise detection followed by filtering. Numerous simulations demonstrate that the proposed method outperforms the existing method by eliminating the painting-like flattening effect along the local feature direction while preserving edges, with improvements in PSNR and MSE.
Keywords: bipolar impulse noise, Kuwahara, PSNR, MSE, PDF
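As a point of reference for the filter named above, the sketch below implements the classic two-stage idea, impulse detection followed by Kuwahara smoothing of only the flagged pixels, on synthetic data; it illustrates the standard filter, not the authors' improved variant.

```python
# Two-stage sketch: bipolar impulse detection, then Kuwahara smoothing of
# the flagged pixels only (grayscale, synthetic data).
import numpy as np

def kuwahara_pixel(img, r, c, k=2):
    """Mean of the least-variant of the four overlapping quadrants around (r, c)."""
    quads = [img[r - k:r + 1, c - k:c + 1], img[r - k:r + 1, c:c + k + 1],
             img[r:r + k + 1, c - k:c + 1], img[r:r + k + 1, c:c + k + 1]]
    return min(quads, key=lambda q: q.var()).mean()

def remove_bipolar_impulses(img, k=2, lo=0, hi=255):
    out = img.astype(float).copy()
    noisy_rows, noisy_cols = np.where((img == lo) | (img == hi))   # stage 1: detection
    for r, c in zip(noisy_rows, noisy_cols):                        # stage 2: filtering
        if k <= r < img.shape[0] - k and k <= c < img.shape[1] - k:
            out[r, c] = kuwahara_pixel(out, r, c, k)
    return out

clean = np.clip(np.random.normal(128, 15, (64, 64)), 1, 254)
noisy = clean.copy()
mask = np.random.rand(64, 64) < 0.05
noisy[mask] = np.random.choice([0, 255], mask.sum())               # bipolar impulse noise
restored = remove_bipolar_impulses(noisy.astype(np.uint8))
print("MSE noisy:", round(float(((noisy - clean) ** 2).mean()), 1),
      "MSE restored:", round(float(((restored - clean) ** 2).mean()), 1))
```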
Procedia PDF Downloads 500
20476 Mitigating Denial of Service Attacks in Information Centric Networking
Authors: Bander Alzahrani
Abstract:
Information-centric networking (ICN) using architectures such as the Publish-Subscribe Internet Routing Paradigm (PSIRP) is one of the promising candidates for a future Internet and has recently been under the spotlight of the research community, which is investigating the possibility of redesigning the current Internet architecture to solve many issues such as routing scalability, security and quality of service. Bloom filter-based forwarding is a source-routing approach used in the PSIRP architecture. This mechanism is vulnerable to brute-force attacks, which may lead to denial-of-service (DoS) attacks. In this work, we present a new forwarding approach that keeps the advantages of Bloom filter-based forwarding while mitigating attacks on the forwarding mechanism. In practice, we introduce a special type of forwarding node, called Edge-FW, to be placed at the edge of the network. The role of these nodes is to add an extra security layer by validating and inspecting packets at the edge of the network against brute-force attacks and checking whether a packet contains a legitimate forwarding identifier (FId) or not. We leverage a Certificateless Aggregate Signature (CLAS) scheme with a small size of 64 bits, which is used to sign the FId. Hence, this signature becomes bound to a specific FId. Therefore, malicious nodes that inject packets with random FIds will be easily detected and dropped at the Edge-FW node when the signature verification fails. Our preliminary security analysis suggests that, with the proposed approach, the forwarding plane is able to resist attacks such as DoS with very high probability.
Keywords: bloom filter, certificateless aggregate signature, denial-of-service, information centric network
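A minimal sketch of Bloom-filter-based forwarding with an edge check is given below; HMAC is used purely as a stand-in for the 64-bit CLAS signature, and the bit widths and link names are assumptions.

```python
# Bloom-filter forwarding sketch with an edge verification step.
import hashlib, hmac, os

FID_BITS = 256

def link_id(name: str, k: int = 5) -> int:
    """Map a link name to a sparse bit mask with k bits set (a Link ID)."""
    mask = 0
    for i in range(k):
        h = int.from_bytes(hashlib.sha256(f"{name}:{i}".encode()).digest(), "big")
        mask |= 1 << (h % FID_BITS)
    return mask

def build_fid(path):
    """Forwarding identifier = OR of the Link IDs along the delivery path."""
    fid = 0
    for link in path:
        fid |= link_id(link)
    return fid

def forwards(fid, link):
    """A node forwards on a link iff all of that link's bits are set in the FId."""
    lid = link_id(link)
    return fid & lid == lid

KEY = os.urandom(16)
def sign_fid(fid):  # Edge-FW check: a short tag bound to the FId (HMAC as a stand-in for CLAS)
    return hmac.new(KEY, fid.to_bytes(FID_BITS // 8, "big"), hashlib.sha256).digest()[:8]

fid = build_fid(["A-B", "B-C", "C-D"])
tag = sign_fid(fid)
print(forwards(fid, "B-C"), forwards(fid, "X-Y"))        # True, (almost surely) False
print(hmac.compare_digest(tag, sign_fid(fid)))           # Edge-FW accepts a legitimate FId
```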
Procedia PDF Downloads 199
20475 Liver and Liver Lesion Segmentation From Abdominal CT Scans
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in the computer-aided diagnosis of liver diseases. Precise liver segmentation in abdominal CT images is one of the most important steps for the computer-aided diagnosis of liver pathology. In this paper, a semi-automated method is presented for liver and liver lesion segmentation using mathematical morphology. Our algorithm proceeds in two parts. In the first, we determine the region of interest by applying morphological filters to extract the liver. The second step consists of detecting the liver lesions. For this task, we propose a new method developed for the semi-automatic segmentation of the liver and hepatic lesions. Our proposed method is based on anatomical information and mathematical morphology tools used in the image processing field. First, we improve the quality of the original image and its gradient by applying a spatial filter followed by morphological filters. The second step consists of calculating the internal and external markers of the liver and hepatic lesions. Thereafter, we proceed to the liver and hepatic lesion segmentation by the watershed transform controlled by markers. The validation of the developed algorithm is carried out using several images. The obtained results show the good performance of our proposed algorithm.
Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm
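A hedged sketch of a marker-controlled watershed pipeline on a synthetic slice is shown below, with scikit-image standing in for the authors' implementation; thresholds and structuring-element sizes are assumptions.

```python
# Marker-controlled watershed segmentation sketch on a synthetic CT-like slice.
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, morphology, segmentation

slice_ct = np.random.normal(0.3, 0.05, (256, 256))
slice_ct[80:180, 60:170] = np.random.normal(0.6, 0.05, (100, 110))  # bright "liver"-like region

smoothed = filters.gaussian(slice_ct, sigma=2)            # spatial pre-filter
gradient = filters.sobel(smoothed)                        # edge/gradient image

# Internal and external markers from intensity thresholds plus morphology.
internal = morphology.opening(smoothed > 0.5, morphology.disk(5))
external = morphology.opening(smoothed < 0.35, morphology.disk(5))
markers, _ = ndi.label(internal)
markers[external] = markers.max() + 1                     # background marker

labels = segmentation.watershed(gradient, markers)        # watershed controlled by markers
print("liver-candidate pixels:", int((labels != labels.max()).sum()))
```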
Procedia PDF Downloads 454
20474 Atmospheric Transport Modeling of Radio-Xenon Detections Possibly Related to the Announced Nuclear Test in North Korea on February 12, 2013
Authors: Kobi Kutsher
Abstract:
On February 12th, 2013, monitoring stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) detected a seismic event with explosion-like underground characteristics in the Democratic People's Republic of Korea (DPRK). The location was found to be in the vicinity of the two previous announced nuclear tests in 2006 and 2009. The nuclear test was also announced by the government of the DPRK. After an underground nuclear explosion, radioactive fission products (mostly noble gases) can seep through layers of rock and sediment until they escape into the atmosphere. The fission products are dispersed in the atmosphere and may be detected thousands of kilometers downwind from the test site. Indeed, more than 7 weeks after the explosion, unusual detections of noble gases were reported at the radionuclide station in Takasaki, Japan. The radionuclide station is part of the International Monitoring System, operated to verify the CTBT. This study provides an estimate of the possible source region and the total radioactivity of the release using atmospheric transport modeling.
Keywords: atmospheric transport modeling, CTBTO, nuclear tests, radioactive fission products
Procedia PDF Downloads 428
20473 Fast and Robust Long-term Tracking with Effective Searching Model
Authors: Thang V. Kieu, Long P. Nguyen
Abstract:
Kernelized Correlation Filter (KCF) based trackers have recently gained a lot of attention because of their accuracy and fast calculation speed. However, this algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion or leaving the field of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning by analyzing the response map of each frame, and a classification algorithm based on random ferns for a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the experimental results indicate that the precision and success rate of the proposed algorithm were 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.
Keywords: correlation filter, long-term tracking, random fern, real-time tracking
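One common way to analyze the response map for target-loss warning is the peak-to-sidelobe ratio (PSR); the sketch below illustrates that idea on synthetic data, though the paper's own anomaly criterion may differ.

```python
# Peak-to-sidelobe ratio (PSR) of a correlation response map as a loss indicator.
import numpy as np

def psr(response, exclude=11):
    """PSR = (peak - mean of sidelobe) / std of sidelobe."""
    peak = response.max()
    r, c = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    half = exclude // 2
    mask[max(0, r - half):r + half + 1, max(0, c - half):c + half + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

response = np.random.rand(64, 64) * 0.1
response[30, 40] = 1.0                              # confident detection
print(f"PSR = {psr(response):.1f}; target lost: {psr(response) < 5.0}")
```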
Procedia PDF Downloads 142
20472 Optimization of the Energy Consumption of the Pottery Kilns by the Use of Heat Exchanger as Recovery System and Modeling of Heat Transfer by Conduction Through the Walls of the Furnace
Authors: Maha Bakakri, Rachid Tadili, Fatiha Lemmini
Abstract:
Morocco is one of the few countries that have kept their traditional crafts, despite the competition of modern industry and its impact on manual labor. The optimization of energy consumption therefore becomes an obligation, and this is the purpose of this work. We present some characteristics of the furnace studied, its operating principle and the experimental measurements of the evolution of the temperatures inside and outside the walls of the furnace, values which are used later in the calculation of its thermal losses. In order to determine the major source of the thermal losses of the furnace, we established its heat balance. The energy consumed, the useful energy and the thermal losses through the walls and the chimney of the furnace are calculated from the experimental measurements carried out for several firings. The results show that the energy consumption of this type of furnace is very high and that the main source of energy loss is the heat carried away by the combustion gases that escape from the furnace through the chimney, while the losses through the walls are relatively small. We have therefore opted for energy recovery as a solution, in which part of the lost heat is recovered with a heat exchanger system using a double tube introduced into the flue gas exhaust stack. The study of the heat recovery system is presented and the heat balance inside the exchanger is established. In this paper, we also present the numerical modeling of heat transfer by conduction through the walls of the furnace. A numerical model has been established based on the finite volume method and the double scan method. It makes it possible to determine the temperature profile of the furnace, to calculate the thermal losses of its walls and to deduce the thermal losses due to the combustion gases. The model is validated using the experimental measurements carried out on the furnace. The results obtained in this work, relating to the energy consumed during the operation of the furnace, are important and form part of the energy efficiency framework that has become a key element in global energy policies, as it is the fastest and cheapest way to address energy, environmental and economic security problems.
Keywords: energy consumption, energy recovery, modeling, energy efficiency
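As a hedged, one-dimensional illustration of the wall-conduction part of such a model, the sketch below uses a simple explicit scheme with assumed material properties and temperatures; the paper itself uses a finite-volume model solved by the double scan method.

```python
# 1-D transient conduction through a kiln wall (explicit scheme, assumed data).
import numpy as np

L, N = 0.20, 41                     # wall thickness [m], number of nodes
k, rho, cp = 0.8, 1800.0, 900.0     # assumed brick conductivity, density, heat capacity
alpha = k / (rho * cp)
dx = L / (N - 1)
dt = 0.4 * dx * dx / alpha          # stable explicit time step (Fourier number 0.4)

T = np.full(N, 25.0)                # initial wall temperature [°C]
T_in, T_out = 900.0, 25.0           # firing chamber / ambient surface temperatures

for _ in range(20000):
    T[0], T[-1] = T_in, T_out       # imposed surface temperatures
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

flux_loss = k * (T[0] - T[1]) / dx  # conductive loss through the inner face [W/m^2]
print(f"wall heat-flux loss ~ {flux_loss:.0f} W per m2 of wall")
```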
Procedia PDF Downloads 76
20471 Differential Diagnosis of Malaria and Dengue Fever on the Basis of Clinical Findings and Laboratory Investigations
Authors: Aman Ullah Khan, Muhammad Younus, Aqil Ijaz, Muti-Ur-Rehman Khan, Sayyed Aun Muhammad, Asif Idrees, Sanan Raza, Amar Nasir
Abstract:
Dengue fever and malaria are important vector-borne diseases of public health significance, affecting millions of people around the globe. Dengue fever is caused by the dengue virus, while malaria is caused by Plasmodium protozoa. Generally, the consequences of malaria are less severe compared to dengue fever. This study was designed to differentiate dengue fever and malaria on the basis of clinical and laboratory findings and to compare the changes in the two diseases, which have different causative agents transmitted by mosquito vectors. A total of 200 patients with dengue viral infection (120 males, 80 females) were included in this prospective descriptive study. The blood samples of the individuals were first screened for malaria by blood smear examination, and the negative samples were then tested with an anti-dengue IgM strip. The strip-positive cases were further screened by IgM capture ELISA, and their complete blood count, including hemoglobin estimation (Hb), total and differential leukocyte counts (TLC and DLC), erythrocyte sedimentation rate (ESR) and platelet counts, was performed. On the basis of the severity of signs and symptoms, the dengue virus infected patients were subdivided into dengue fever (DF) and dengue hemorrhagic fever (DHF) groups comprising 70 and 100 confirmed patients, respectively. On the other hand, 30 patients were found to be infected with malaria, while overall 120 patients showed thrombocytopenia. The patients with DHF were found to have more pronounced leucopenia, raised hemoglobin levels and thrombocytopenia below 50,000/µl compared to the patients with DF and malaria. On the basis of the outcomes of the study, it was concluded that patients affected by DF were at a lower risk of haematological disturbance than those suffering from DHF, while the patients infected with malaria showed no significant change in their blood components.
Keywords: dengue fever, blood, serum, malaria, ELISA
Procedia PDF Downloads 396
20470 Study of the Behavior of Bolted Joints with and Without Reinforcement
Authors: Karim Akkouche
Abstract:
Many methods have been developed for characterizing the behavior of bolted joints. However, in the presence of certain types of stiffeners, no guidance has been given regarding their modeling. As a result, a multitude of gross errors can arise in reproducing the propagation of forces and in representing the deformation modes. Considering these particularities, a numerical investigation was carried out in our laboratory. In this paper, we present a comparative study of three types of assemblies. Non-linear 3D modeling was chosen, given that it takes into consideration geometric and material non-linearity, using the finite element code ABAQUS. Initially, we evaluated the influence of the presence of each stiffener on the 'global' behavior of the assemblies by analyzing their moment-rotation curves and by referring to the classification system proposed by NF EN 1993-1-8, which is based on the design moment resistance Mj,Rd and the initial stiffness Sj,ini. In a second step, we evaluated the 'local' behavior of their components by referring to their stress-strain curves.
Keywords: assembly, post-beam, end plate, nonlinearity
Procedia PDF Downloads 78
20469 Towards Safety-Oriented System Design: Preventing Operator Errors by Scenario-Based Models
Authors: Avi Harel
Abstract:
Most accidents are commonly attributed in hindsight to human errors, yet most safety methodologies focus on technical issues. According to the Black Swan theory, this paradox is due to insufficient data about the ways systems fail. The article presents a study of the sources of errors and proposes a methodology for utility-oriented design, comprising methods for coping with each of the sources identified. Accident analysis indicates that errors typically result from difficulties of operating in exceptional conditions. Therefore, following STAMP, the focus should be on preventing exceptions. Exception analysis indicates that they typically involve an improper account of the operational scenario, due to deficiencies in the system integration. The methodology proposes a model, which is a formal definition of the system operation, as well as principles and guidelines for safety-oriented system integration. The article calls for the development and integration of tools for recording and analyzing system activity during operation, which are required to implement and validate the model.
Keywords: accidents, complexity, errors, exceptions, interaction, modeling, resilience, risks
Procedia PDF Downloads 199
20468 Optimal Design of Propellant Grain Shape Based on Structural Strength Analysis
Authors: Chen Xiong, Tong Xin, Li Hao, Xu Jin-Sheng
Abstract:
Experimental and simulation research on the structural integrity of propellant grain with a high volumetric fraction in a solid rocket motor (SRM) was conducted. First, using SRM parametric modeling functions developed with Python, the secondary development tool of ABAQUS, three-dimensional parameterized modeling programs for star-shaped, wheel-shaped and wing-cylindrical grains were created. Then, the mechanical properties under different loads for the star-shaped grain were obtained by applying the automatically established finite element model in ABAQUS. Next, several optimization algorithms were introduced to optimize the star-shaped, wheel-shaped and wing-cylindrical grains. After meeting the requirements on burning surface evolution and volumetric fraction, the optimal three-dimensional shapes of the grains were obtained. Finally, by means of the parametric modeling functions, pressure data from the SRM's cold pressurization test were directly applied to the simulation of the grain's mechanical performance. The results verify the reliability and practicality of the parameterized modeling program for SRMs.
Keywords: cold pressurization test, parametric modeling, structural integrity, propellant grain, SRM
Procedia PDF Downloads 366
20467 Calculation of the Normalized Difference Vegetation Index and the Spectral Signature of Coffee Crops: Benefits of Image Filtering on Mixed Crops
Authors: Catalina Albornoz, Giacomo Barbieri
Abstract:
Crop monitoring has been shown to reduce vulnerability to spreading plagues and pathologies in crops. Remote sensing with Unmanned Aerial Vehicles (UAVs) has made crop monitoring more precise, cost-efficient and accessible. Nowadays, remote monitoring involves calculating maps of vegetation indices using software that takes either true-color (RGB) or multispectral images as input. These maps are then used to segment the crop into management zones. Finally, the spectral signature of a crop (the reflected radiation as a function of wavelength) can be used as an input for decision-making and crop characterization. The calculation of vegetation indices using software such as Pix4D is highly precise for monoculture plantations. However, this paper shows that using such software on mixed crops may lead to errors resulting in an incorrect segmentation of the field. Within this work, the authors propose to filter out all elements different from the main crop before the calculation of vegetation indices and the spectral signature. A filter based on the Sobel method for border detection is used for filtering a coffee crop. Results show that the segmentation into management zones changes with respect to the traditional situation in which no filter is applied. In particular, it is shown that the values of the spectral signature change by up to 17% per spectral band. Future work will quantify the benefits of filtering through a comparison between in situ measurements and the vegetation indices obtained through remote sensing.
Keywords: coffee, filtering, mixed crop, precision agriculture, remote sensing, spectral signature
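The NDVI computation and the Sobel-based filtering step can be sketched as follows on synthetic red/NIR bands; the thresholds and band statistics are illustrative assumptions, not values from the coffee plots.

```python
# NDVI from red/NIR bands plus a Sobel edge mask used to exclude
# non-crop elements before computing zone statistics (synthetic data).
import numpy as np
from scipy import ndimage as ndi

red = np.random.uniform(0.05, 0.15, (200, 200))
nir = np.random.uniform(0.4, 0.6, (200, 200))
nir[:, 100:] = np.random.uniform(0.1, 0.2, (200, 100))   # shade trees / bare-soil strip

ndvi = (nir - red) / (nir + red + 1e-9)

edges = np.hypot(ndi.sobel(ndvi, axis=0), ndi.sobel(ndvi, axis=1))
keep = (edges < 0.5) & (ndvi > 0.5)          # thresholds are illustrative assumptions

print("mean NDVI, whole scene :", round(float(ndvi.mean()), 2))
print("mean NDVI, crop only   :", round(float(ndvi[keep].mean()), 2))
```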
Procedia PDF Downloads 391
20466 Extended Boolean Petri Nets Generating N-Ary Trees
Authors: Riddhi Jangid, Gajendra Pratap Singh
Abstract:
Petri nets, a mathematical tool, are used for modeling in different areas of computer science, biological networks, chemical systems and many other disciplines. A Petri net model of a given system is created by a graphical representation that describes the properties and behavior of the system. When studying the behavior of a system, 1-safe Petri nets are of particular interest in many applications. Boolean Petri nets are the class of 1-safe Petri nets that generate all binary n-vectors in their reachability analysis. We study this class by changing different parameters, such as the token counts in the places, and by examining how the structure of the tree changes in the reachability analysis. We discuss here an extended class of Boolean Petri nets that generates n-ary trees in their reachability-based analysis.
Keywords: marking vector, n-vector, petri nets, reachability
Procedia PDF Downloads 87
20465 Designing a Robust Controller for a 6 Linkage Robot
Authors: G. Khamooshian
Abstract:
One of the main issues in the application of serial and parallel mechanisms is their control. The control of these and similar mechanisms has always been a focus of researchers. On the other hand, modeling the behavior of such a system is difficult due to its large number of parameters, which leads to complex equations that are difficult to solve and, eventually, to control. In this paper, a six-linkage robot is presented that could be used in different areas, such as medical robotics. Using these robots requires robust control. The system equations are first derived, and then the system transfer function is written. A new controller has been designed for this robot which could be applied to other parallel robots and could be very useful. Parallel robots are important in robotics because of their stability, so methods for their control are important, and a robust controller, especially for parallel robots, makes sense.
Keywords: 3-RRS, 6 linkage, parallel robot, control
Procedia PDF Downloads 162
20464 Comparative Performance Analysis for Selected Behavioral Learning Systems versus Ant Colony System Performance: Neural Network Approach
Authors: Hassan M. H. Mustafa
Abstract:
This piece of research addresses an interesting comparative analytical study that considers two diverse algorithmic computational intelligence approaches, tightly related to neural and non-neural systems. The first approach is concerned with the practical results observed from the activities of three neural (animal) learning systems: Pavlov's and Thorndike's experimental work, and a mouse's trials while moving inside a figure-of-eight maze to reach an optimal solution to a reconstruction problem. Conversely, the second approach originates from the observed activity of the non-neural Ant Colony System (ACS), with results obtained after reaching an optimal solution while solving the Traveling Salesman Problem (TSP). Interestingly, the effect of increasing the number of agents (either neurons or ants) on learning performance is shown to be similar for both systems. Finally, the performance of both intelligent learning paradigms is shown to be in agreement with the learning convergence process of the least mean square error (LMS) algorithm when applied to the training of some Artificial Neural Network (ANN) models. Accordingly, the adopted ANN modeling is a relevant and realistic tool to investigate observations and analyze the performance of both selected computational intelligence (biological behavioral learning) systems.
Keywords: artificial neural network modeling, animal learning, ant colony system, traveling salesman problem, computational biology
Procedia PDF Downloads 475
20463 Energy Consumption and GHG Production in Railway and Road Passenger Regional Transport
Authors: Martin Kendra, Tomas Skrucany, Jozef Gnap, Jan Ponicky
Abstract:
This paper deals with the modeling and simulation of the energy consumption and GHG production of two different modes of regional passenger transport – road and railway. These two transport modes use the same type of fuel – diesel. Modeling and simulation of energy consumption in transport are often used due to their satisfactory accuracy and cost efficiency. The calculation is based on EN standards, technical information collected from vehicle producers and the characteristics of the tracks. The calculation includes the maximum theoretical capacity of the bus and the train as well as real passenger counts measured in operation. The final energy consumption and GHG production are calculated using software simulation and evaluated on a 'well-to-wheel' basis.
Keywords: bus, energy consumption, GHG, production, simulation, train
Procedia PDF Downloads 447
20462 Fast Tumor Extraction Method Based on Nl-Means Filter and Expectation Maximization
Authors: Sandabad Sara, Sayd Tahri Yassine, Hammouch Ahmed
Abstract:
The development of science has allowed computer scientists to reach into medicine and bring aid to radiologists, as we present in this article. Our work focuses on the detection and localization of tumor areas in the human brain; this is completely automatic, without any human intervention. Faced with the huge volume of MRI scans to be processed per day, the radiologist can spend hours and hours of tremendous effort. This burden becomes lighter with the automation of this step. In this article, we present an automatic and effective tumor detection method that consists of two steps: the first is image filtering using the NL-means filter, followed by the application of the expectation-maximization (EM) algorithm to retrieve the tumor mask from the brain MRI; the tumor area is then extracted using the mask obtained in the second step. To prove the effectiveness of this method, multiple evaluation criteria are used so that we can compare our method to extraction methods frequently used in the literature.
Keywords: MRI, EM algorithm, brain, tumor, NL-means
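A hedged sketch of the two steps, NL-means denoising followed by EM-based intensity clustering, is given below on a synthetic slice; scikit-image and scikit-learn stand in for the authors' own implementation, and all parameters are assumptions.

```python
# NL-means denoising followed by EM (Gaussian mixture) clustering of intensities.
import numpy as np
from skimage.restoration import denoise_nl_means, estimate_sigma
from sklearn.mixture import GaussianMixture

slice_mri = np.random.normal(0.3, 0.05, (128, 128))
slice_mri[40:70, 50:85] = np.random.normal(0.8, 0.05, (30, 35))   # bright "tumor"-like region

sigma = float(np.mean(estimate_sigma(slice_mri)))
denoised = denoise_nl_means(slice_mri, h=1.15 * sigma, patch_size=5,
                            patch_distance=6, fast_mode=True)

# EM fits a 2-component Gaussian mixture to the intensities; the component
# with the highest mean is taken as the tumor class (an assumed criterion).
gmm = GaussianMixture(n_components=2, random_state=0).fit(denoised.reshape(-1, 1))
tumor_label = int(np.argmax(gmm.means_))
mask = gmm.predict(denoised.reshape(-1, 1)).reshape(denoised.shape) == tumor_label
print("tumor mask size:", int(mask.sum()), "pixels")
```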
Procedia PDF Downloads 341
20461 Study on Safety Management of Deep Foundation Pit Construction Site Based on Building Information Modeling
Authors: Xuewei Li, Jingfeng Yuan, Jianliang Zhou
Abstract:
The 21st century has been called the century of human exploitation of underground space. Due to the large quantities, tight schedules, low safety reserves and high uncertainty of deep foundation pit engineering, accidents frequently occur, causing huge economic losses and casualties. With the successful application of information technology in the construction industry, building information modeling has become a research hotspot in the field of architectural engineering. Therefore, the application of building information modeling (BIM) and other information and communication technologies (ICTs) in construction safety management is of great significance for improving the level of safety management. This research summarizes the mechanism of deep foundation pit engineering accidents through fault tree analysis in order to find the control factors of deep foundation pit safety management and the deficiencies of traditional construction site safety management. According to the accident causation mechanism and the specific process of deep foundation pit construction, the hazard information of the construction site was identified and a hazard list, including early-warning information, was obtained. After that, the system framework was constructed by analyzing the early-warning information and function requirements of the safety management system for deep foundation pits. Finally, a BIM-based safety management system for deep foundation pit construction sites was developed by combining a database with Web-BIM technology, so as to realize three functions: real-time positioning of construction site personnel, automatic warning on entering a dangerous area, and real-time monitoring of deep foundation pit structural deformation with automatic warning. This study can improve the current state of safety management on deep foundation pit construction sites. Additionally, active control before the occurrence of deep foundation pit accidents and dynamic control throughout the construction process can be realized, so as to prevent and control the occurrence of safety accidents in deep foundation pit construction.
Keywords: Web-BIM, safety management, deep foundation pit, construction
Procedia PDF Downloads 157
20460 Automatic and Highly Precise Modeling for System Optimization
Authors: Stephanie Chen, Mitja Echim, Christof Büskens
Abstract:
To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Therefore, such models might not be applicable to the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for every system detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables and not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of the correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods are illustrated with different examples.
Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization
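In the spirit of the series-expansion idea described above, the following minimal sketch fits a data-driven model that also includes products of variables, using a sparse regression to keep only the terms actually present; the data and library choices are illustrative assumptions, not the authors' implementation.

```python
# Data-driven model with single-variable and product terms via sparse regression.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (500, 3))                   # sensor measurements x1, x2, x3
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] * X[:, 2] + rng.normal(0, 0.01, 500)

poly = PolynomialFeatures(degree=2, include_bias=False)
Z = poly.fit_transform(X)                          # adds x1*x2, x2*x3, x1^2, ...
model = Lasso(alpha=1e-3).fit(Z, y)                # sparse fit keeps only the real terms

for name, coef in zip(poly.get_feature_names_out(["x1", "x2", "x3"]), model.coef_):
    if abs(coef) > 1e-2:
        print(f"{name}: {coef:+.3f}")              # recovers x1 and the x2*x3 product
```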
Procedia PDF Downloads 410
20459 Diagnosis of the Lubrication System of a Gas Turbine Using the Adaptive Neuro-Fuzzy Inference System
Authors: H. Mahdjoub, B. Hamaidi, B. Zerouali, S. Rouabhia
Abstract:
The issue of fault detection and diagnosis (FDD) has gained widespread industrial interest in process condition monitoring applications. Accordingly, the use of neuro-fuzzy techniques seems very promising. This paper deals with the diagnosis modeling of a strategic piece of equipment in an industrial installation. We propose a diagnostic tool based on the adaptive neuro-fuzzy inference system (ANFIS). The neuro-fuzzy network provides an abductive diagnosis. Moreover, it takes into account the uncertainties in the maintenance knowledge by giving a fuzzy characterization of each cause. This work was carried out with real data from the lubrication circuit of a gas turbine. The machine of interest is a gas turbine placed in a gas compressor station at the South Industrial Centre (SIC Hassi Messaoud, Ouargla, Algeria). We defined the zones of good and bad functioning, and the results are presented to demonstrate the advantages of the proposed method.
Keywords: fault detection and diagnosis, lubrication system, turbine, ANFIS, training, pattern recognition
Procedia PDF Downloads 493
20458 Design and Analysis of Semi-Active Isolation System in Low Frequency Excitation Region for Vehicle Seat to Reduce Discomfort
Authors: Andrea Tonoli, Nicola Amati, Maria Cavatorta, Reza Mirsanei, Behzad Mozaffari, Hamed Ahani, Akbar Karamihafshejani, Mohammad Ghazivakili, Mohammad Abuabiah
Abstract:
The vibrations transmitted to drivers and passengers through the vehicle seat seriously affect their level of attention, fatigue and physical health, and reduce the comfort and efficiency of the occupants. Recently, some researchers have focused on vibrations at low excitation frequencies (0.5-5 Hz), which are considered to be the main risk factor for the lumbar part of the backbone, but their solutions were not applicable to A- and B-segment cars with regard to size and weight. A semi-active system with two symmetric negative stiffness structures (NSS) in parallel with a positive stiffness structure and actuators is proposed to attenuate low-frequency excitation and to make the system adaptable to different passenger weights, which makes it applicable to A- and B-segment cars. Here, a three-degree-of-freedom system is considered, its dynamic equations are clearly presented and then simulated in MATLAB in order to analyze the performance of the system. The design procedure is derived so that the resonance peak of the frequency-response curve shifts to the left, the isolation range is increased and, especially, the peak of the frequency-response curve is minimized. In accordance with the ISO standard, different classes of road profile are applied as input to the system to evaluate its performance. To evaluate comfort, we extract the RMS value of the vertical acceleration acting on the passenger's body and then apply a band-pass filter which takes into account human sensitivity to acceleration. According to ISO, this weighted acceleration is lower than 0.315 m/s², so the ride is considered comfortable.
Keywords: low frequency excitation, negative stiffness, seat vehicle, vibration isolation
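The comfort check described at the end can be sketched as below, where a simple Butterworth band-pass filter stands in for the full ISO frequency weighting and the acceleration signal is synthetic.

```python
# Band-pass filtering of the vertical seat acceleration and RMS comfort check
# against the 0.315 m/s^2 limit (the filter is a stand-in for the ISO weighting).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 200.0                                   # sampling rate [Hz]
t = np.arange(0, 60, 1 / fs)
acc = 0.2 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)

b, a = butter(4, [0.5, 80.0], btype="band", fs=fs)
acc_w = filtfilt(b, a, acc)

rms = np.sqrt(np.mean(acc_w ** 2))
print(f"weighted RMS = {rms:.3f} m/s^2 ->",
      "comfortable" if rms < 0.315 else "uncomfortable")
```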
Procedia PDF Downloads 442
20457 A Survey on Lossless Compression of Bayer Color Filter Array Images
Authors: Alina Trifan, António J. R. Neves
Abstract:
Although most digital cameras acquire images in a raw format, based on a Color Filter Array that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient: by performing a lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power-efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors that increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage allows an improvement of more than 15% in prediction-based methods.
Keywords: bayer image, CFA, lossless compression, image coding standards
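The channel-splitting pre-processing step can be illustrated as follows, with zlib standing in for the lossless coders surveyed in the paper; the RGGB layout and the synthetic mosaic are assumptions.

```python
# Compare lossless compression of the interleaved Bayer mosaic versus the
# same data split into per-colour planes (synthetic RGGB mosaic, zlib coder).
import zlib
import numpy as np

rng = np.random.default_rng(1)
bayer = np.clip(rng.normal(128, 10, (512, 512)), 0, 180).astype(np.uint8)
bayer[0::2, 1::2] += 60                                   # green sites, even rows
bayer[1::2, 0::2] += 60                                   # green sites, odd rows

r = bayer[0::2, 0::2]
g = np.hstack((bayer[0::2, 1::2], bayer[1::2, 0::2]))     # both green planes together
b = bayer[1::2, 1::2]

whole = len(zlib.compress(bayer.tobytes(), 9))
split = sum(len(zlib.compress(p.tobytes(), 9)) for p in (r, g, b))
print(f"raw mosaic: {whole} bytes, split R/G/B planes: {split} bytes")
```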
Procedia PDF Downloads 326
20456 Paper-Like and Battery Free Sensor Patches for Wound Monitoring
Authors: Xiaodi Su, Xin Ting Zheng, Laura Sutarlie, Nur Asinah binte Mohamed Salleh, Yong Yu
Abstract:
Wound healing is a dynamic process with multiple phases. Rapid profiling and quantitative characterization of inflammation and infection remain challenging. We have developed paper-like, battery-free, multiplexed sensors for holistic wound assessment via quantitative detection of multiple inflammation and infection markers. In one of the designs, the sensor patch consists of a wax-printed paper panel with five colorimetric sensor channels arranged in a pattern resembling a five-petaled flower (denoted a 'Petal' sensor). The five sensors are for temperature, pH, trimethylamine, uric acid and moisture. The sensor patch is sandwiched between a top transparent silicone layer and a bottom adhesive wound contact layer. In the second design, a palm-shaped paper strip is fabricated with a paper-cutter printer (denoted a 'Palm' sensor). This sensor strip carries five sensor regions connected by a stem sampling entrance that enables rapid colorimetric detection of multiple bacterial metabolites (aldehyde, lactate, moisture, trimethylamine, tryptophan) from wound exudate. For both the 'Petal' and 'Palm' sensors, color images can be captured by a mobile phone. From the color changes, one can quantify the concentrations of the biomarkers and then determine wound healing status and identify/quantify bacterial species in infected wounds. The 'Petal' and 'Palm' sensors are validated with in situ animal and ex situ skin wound models, respectively. These sensors have the potential for integration with wound dressings to allow early warning of adverse events without frequent removal of the plasters. Such in situ and early detection of non-healing conditions can trigger immediate clinical intervention to facilitate wound care management.
Keywords: wound infection, colorimetric sensor, paper fluidic sensor, wound care
Procedia PDF Downloads 85
20455 Study of Evaluation Model Based on Information System Success Model and Flow Theory Using Web-scale Discovery System
Authors: June-Jei Kuo, Yi-Chuan Hsieh
Abstract:
Because of the rapid growth of information technology, more and more libraries are introducing new information retrieval systems to enhance the user experience, improve retrieval efficiency and increase the applicability of library resources. Nevertheless, few studies have discussed usability from the users' perspective. The aims of this study are to understand the scenario of information retrieval system utilization and to learn why users are willing to continue using the web-scale discovery system, in order to improve the system and promote the use of university libraries. In addition to questionnaires, observations and interviews, this study employs both the Information System Success Model introduced by DeLone and McLean in 2003 and flow theory to evaluate the system quality, information quality, service quality, use, user satisfaction, flow and continued use of the web-scale discovery system by students of National Chung Hsing University. The results are analyzed through descriptive statistics and structural equation modeling using AMOS. The results reveal that, for the web-scale discovery system, users' evaluations of system quality, information quality and service quality are positively related to use and satisfaction; however, service quality only affects user satisfaction. User satisfaction and flow show a significant impact on continued use. Moreover, user satisfaction has a significant impact on user flow. Based on the results of this study, maintaining the stability of the information retrieval system, improving the quality of the information content and strengthening the relationship between subject librarians and students are recommended for academic libraries. Meanwhile, improving the system user interface, minimizing the number of layers at the system level, strengthening data accuracy and relevance, modifying the sorting criteria of the data and supporting an auto-correct function are required of the system provider. Finally, establishing better communication with librarians is recommended for all users.
Keywords: web-scale discovery system, discovery system, information system success model, flow theory, academic library
Procedia PDF Downloads 108