Search results for: third party monitoring software
7187 Condition Monitoring of a 3-Ø Induction Motor by Vibration Spectrum Analysis Using FFT Analyzer, a Case Study
Authors: Adinarayana S., Sudhakar I.
Abstract:
Energy conversion is an inevitable part of any industry. It involves conversion of mechanical energy into electrical energy or vice versa; the latter conversion, from electrical to mechanical, underlines the need for motors. Statistics reveal that about 8% of industries' annual turnover is spent on maintenance. Substantial effort is therefore required to minimize the expenditure incurred on breakdown maintenance. Condition monitoring is one such vibration-based technique, widely used to recognize premature failures, and it paves the way to minimizing the disruption involved in machinery breakdowns. The present investigation is a case study in which a squirrel cage induction motor (one of the most frequently used electrical machines) was chosen for condition monitoring to predict its soundness on the basis of FFT analyser results. An accelerometer, whose acceleration signal the FFT analyser converts into vibration and time spectra, was located at various positions on the motor under different conditions. The results obtained from the FFT analyser were compared with ISO standard vibration severity charts to assess the condition of the machinery for preventive purposes. Initial inspection of the motor indicated that stator faults, broken end rings in the rotor, eccentricity faults and misalignment between bearings were the troubleshooting areas for the present investigation. From the results of the shaft frequencies, it can be perceived that there is a misalignment between the bearings at both ends. The higher-order harmonics of the FTF show the presence of incipient cracks on the bearing races at both ends. Replacement of the bearings at both the drive end (6306) and non-drive end (6206) and an alignment check between the bearings on the shaft are suggested as constructive measures towards preventive maintenance of the squirrel cage induction motor under consideration. Keywords: FFT analyser, condition monitoring, vibration spectrum, time waveform
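As a rough illustration of the FFT-based approach described above, the Python sketch below turns a synthetic accelerometer record into a single-sided vibration spectrum and compares the overall level against an ISO-style severity limit. The sampling rate, shaft speed, signal amplitudes and the 4.5 mm/s limit are all assumed for illustration and are not taken from the case study.

```python
import numpy as np

# Illustrative sketch only: a synthetic vibration velocity signal (treated as mm/s),
# not the measurements from the case study.
fs = 5120.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)   # 1-second record
# Assume a 24.75 Hz shaft speed (1485 rpm) with a 2x harmonic typical of misalignment
signal = 2.0 * np.sin(2 * np.pi * 24.75 * t) + 1.2 * np.sin(2 * np.pi * 49.5 * t)
signal += 0.1 * np.random.randn(t.size)           # measurement noise

# Vibration spectrum: single-sided FFT magnitude
spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

# Compare the overall level against an assumed ISO-style severity limit (4.5 mm/s RMS)
severity_limit = 4.5
peak_freq = freqs[np.argmax(spectrum)]
rms = np.sqrt(np.mean(signal ** 2))
print(f"Dominant component: {peak_freq:.1f} Hz, overall RMS: {rms:.2f} mm/s")
print("Condition:", "unsatisfactory" if rms > severity_limit else "acceptable")
```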
Procedia PDF Downloads 388
7186 Long-Term Field Performance of Paving Fabric Interlayer Systems to Reduce Reflective Cracking
Authors: Farshad Amini, Kejun Wen
Abstract:
The formation of reflective cracking in pavement overlays has confronted highway engineers for many years. Stress-relieving interlayers, such as paving fabrics, have been used in an attempt to reduce or delay reflective cracking. The effectiveness of paving fabrics in reducing reflective cracking is related to joint or crack movement in the underlying pavement, crack width, overlay thickness, subgrade conditions, climate, and traffic volume. The nonwoven geotextiles are installed between the old and new asphalt layers. Paving fabrics enhance performance through two mechanisms: stress relief and waterproofing. Several factors affect performance, including proper installation, remedial work performed before overlay, overlay thickness, variability of pavement strength, existing pavement condition, base/subgrade support condition, and traffic volume. The primary objective of this study was to conduct long-term monitoring of paving fabric interlayer systems to evaluate their effectiveness and performance. A comprehensive testing, monitoring, and analysis program was undertaken, in which twelve 500-ft pavement sections of a four-lane highway were rehabilitated and then monitored for seven years. A comparison between the performance of the paving fabric treatment systems and the control sections is reported, and the lessons learned and the various contributing factors are discussed. Keywords: monitoring, paving fabrics, performance, reflective cracking
Procedia PDF Downloads 333
7185 Drape Simulation by Commercial Software and Subjective Assessment of Virtual Drape
Authors: Evrim Buyukaslan, Simona Jevsnik, Fatma Kalaoglu
Abstract:
Simulation of fabrics is more difficult than any other simulation due to the complex mechanics of fabrics. Most virtual garment simulation software uses a mass-spring model and incorporates fabric mechanics into the simulation model. The accuracy and fidelity of this virtual garment simulation software, however, remain open questions. Drape is a subjective phenomenon, and the evaluation of drape has been studied since the 1950s; fabric and garment simulation, on the other hand, is relatively new. Understanding how subjects perceive drape when looking at fabric simulations is critical as virtual try-on becomes increasingly important with the growth of online apparel sales. The projected future of online apparel retailing is that users will view their avatars and try garments on them in a virtual environment. It is well known that users will not be eager to accept this innovative technology unless it is realistic enough. Therefore, it is essential to understand what users see when fabrics are displayed in a virtual environment: are they able to distinguish the differences between various fabrics? The purpose of this study is to investigate human perception when looking at a virtual fabric and to determine the most visually noticeable drape parameter. To this end, five different fabrics are mechanically tested, and their drape simulations are generated by commercial garment simulation software (Optitex®). The simulation images are processed by image analysis software to calculate drape parameters, namely the drape coefficient, node severity, and peak angles. A questionnaire is developed to evaluate drape properties subjectively in a virtual environment. The drape simulation images are shown to 27 subjects, who are asked to rank the samples according to the drape property in question. The answers are compared with the calculated drape parameters. The results show that subjects are quite sensitive to changes in the drape coefficient, while they are not very sensitive to changes in node dimensions and node distributions. Keywords: drape simulation, drape evaluation, fabric mechanics, virtual fabric
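For readers unfamiliar with the drape coefficient used above, the following Python sketch computes it from a binarised top-view image of a draped specimen (Cusick-style definition: the projected shadow area beyond the supporting disk, expressed as a percentage of the flat-specimen annulus). The image, radii and calibration are invented for illustration; the study's image-analysis software also extracts node severity and peak angles, which are not reproduced here.

```python
import numpy as np

def drape_coefficient(projected_image, r_disk_px, r_fabric_px):
    """Drape coefficient (%) from a binarised top-view image of the draped sample.

    projected_image : 2-D array of 0/1 pixels (1 = draped fabric shadow)
    r_disk_px       : radius of the supporting disk in pixels
    r_fabric_px     : radius of the undraped circular specimen in pixels
    (Assumed definitions; thresholds and calibration depend on the actual set-up.)
    """
    area_draped = projected_image.sum()            # projected shadow area in pixels
    area_disk = np.pi * r_disk_px ** 2
    area_flat = np.pi * r_fabric_px ** 2
    return 100.0 * (area_draped - area_disk) / (area_flat - area_disk)

# Toy example: a synthetic 400x400 circular shadow
img = np.zeros((400, 400), dtype=int)
yy, xx = np.ogrid[:400, :400]
img[(yy - 200) ** 2 + (xx - 200) ** 2 <= 150 ** 2] = 1
print(f"Drape coefficient: {drape_coefficient(img, r_disk_px=90, r_fabric_px=180):.1f} %")
```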
Procedia PDF Downloads 338
7184 Monitoring Spatial Distribution of Blue-Green Algae Blooms with Underwater Drones
Authors: R. L. P. De Lima, F. C. B. Boogaard, R. E. De Graaf-Van Dinther
Abstract:
Blue-green algae blooms (cyanobacteria) are currently a relevant ecological problem that is being addressed by most water authorities in the Netherlands. They can affect recreation areas by producing unpleasant smells and toxins that can poison humans and animals (e.g. fish, ducks, dogs). Contamination events usually take place during the summer months, and their frequency is increasing with climate change. Traditional monitoring of these bacteria is expensive, labor-intensive and provides only limited (point-sampling) information about the spatial distribution of algae concentrations. Recently, a novel handheld sensor has allowed water authorities to speed up their algae surveying and alarm systems. This study converted that algae sensor into a mobile platform by combining it with an underwater remotely operated vehicle (also equipped with other sensors and cameras). This provides a spatial visualization (mapping) of variations in algae concentration within the area covered by the drone, and also with depth. Measurements took place in different locations in the Netherlands: i) a lake with thick silt layers at the bottom, a very eutrophic former seabed and a frequent/intense mowing regime; ii) a wastewater outlet into a large reservoir; iii) an urban canal system. The results made it possible to identify probable dominant causes of blooms (i), provide recommendations for the placement of an outlet and reveal day-night differences in algae behavior (ii), and pinpoint areas with higher algae concentrations (iii). Although further research is still needed to fully characterize these processes and to optimize the measuring tool (underwater drone developments/improvements), the method presented here can already provide valuable information about algae behavior and spatial/temporal variability and shows potential as an efficient monitoring system. Keywords: blue-green algae, cyanobacteria, underwater drones / ROV / AUV, water quality monitoring
Procedia PDF Downloads 207
7183 An Efficient Digital Baseband ASIC for Wireless Biomedical Signals Monitoring
Authors: Kah-Hyong Chang, Xin Liu, Jia Hao Cheong, Saisundar Sankaranarayanan, Dexing Pang, Hongzhao Zheng
Abstract:
A digital baseband Application-Specific Integrated Circuit (ASIC) is developed for a microchip transponder to transmit signals and temperature levels from biomedical monitoring devices. The transmission protocol is adapted from the ISO/IEC 11784/85 standard. The module has a decimation filter that employs only a single adder-subtractor in its datapath. The filtered output is coded with cyclic redundancy check and transmitted through backscattering Load Shift Keying (LSK) modulation to a reader. Fabricated using the 0.18-μm CMOS technology, the module occupies 0.116 mm² in chip area (digital baseband: 0.060 mm², decimation filter: 0.056 mm²), and consumes a total of less than 0.9 μW of power (digital baseband: 0.75 μW, decimation filter: 0.14 μW).Keywords: biomedical sensor, decimation filter, radio frequency integrated circuit (RFIC) baseband, temperature sensor
Procedia PDF Downloads 397
7181 Monitoring and Evaluation of the Water Quality of Taal Lake, Talisay, Batangas, Philippines
Authors: Felipe B. Martinez, Imelda C. Galera
Abstract:
This paper presents an update on the physico-chemical properties of Taal Lake for local government officials and representatives of non-government organizations, based on monitoring and evaluating a total of nine (9) water quality parameters. The study shows that Taal Lake's surface temperature, pH, total dissolved solids, total suspended solids, color, and dissolved oxygen content conform to the standards set by the Department of Environment and Natural Resources (DENR), while phosphate, chlorine, and 5-day 20°C BOD are below the standard. Likewise, the t-test result shows no significant difference in the overall averages of the two sites at Taal Lake (P > 0.05). Based on the data, the lake is safe for primary contact recreation such as bathing, swimming and skin diving, and can be used for aquaculture purposes. Keywords: cool dry season, hot dry season, rainy season, Taal Lake, water quality
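A minimal sketch of the statistical comparison mentioned above (a two-sample t-test between the two monitoring sites) is given below in Python; the dissolved-oxygen values are invented placeholders, not the reported Taal Lake measurements.

```python
import numpy as np
from scipy import stats

# Illustrative dissolved-oxygen readings (mg/L) for two hypothetical sampling sites
site_1 = np.array([6.8, 7.1, 6.5, 7.0, 6.9, 6.7])
site_2 = np.array([6.6, 7.2, 6.8, 6.9, 7.0, 6.5])

t_stat, p_value = stats.ttest_ind(site_1, site_2)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No significant difference between the two sites (P > 0.05)")
else:
    print("Significant difference between the two sites (P <= 0.05)")
```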
Procedia PDF Downloads 308
7180 [Keynote Talk]: The Challenges and Solutions for Developing Mobile Apps in a Small University
Authors: Greg Turner, Bin Lu, Cheer-Sun Yang
Abstract:
As computing technology advances, smartphone applications can assist student learning in a pervasive way. For example, using a mobile app for PA Common Trees, Pests, and Pathogens as an in-the-field reference tool allows middle school students to learn about trees and associated pests/pathogens without carrying a textbook. In the past, some research has studied the Mobile Application Software Development Life Cycle (MADLC), including traditional models such as the waterfall model and more recent agile methods; other work has studied issues related to the software development process. Very little research addresses the simultaneous development of three heterogeneous mobile systems in a small university where the availability of developers is an issue. In this paper, we propose a hybrid of the waterfall model and the agile model, known in practice as the Relay Race Methodology (RRM), to reflect the concept of racing and relaying in scheduling. Based on the development project, we observe that the modeling of the transition between any two phases manifests naturally. Thus, we claim that the RRM model can provide a de facto rather than a de jure basis for the core concept in the MADLC. In this paper, the background of the project is introduced first, then the challenges are pointed out, followed by our solutions. Finally, the lessons learned and future work are presented. Keywords: agile methods, mobile apps, software process model, waterfall model
Procedia PDF Downloads 409
7179 Unknown Groundwater Pollution Source Characterization in Contaminated Mine Sites Using Optimal Monitoring Network Design
Authors: H. K. Esfahani, B. Datta
Abstract:
Groundwater is one of the most important natural resources in many parts of the world; however it is widely polluted due to human activities. Currently, effective and reliable groundwater management and remediation strategies are obtained using characterization of groundwater pollution sources, where the measured data in monitoring locations are utilized to estimate the unknown pollutant source location and magnitude. However, accurately identifying characteristics of contaminant sources is a challenging task due to uncertainties in terms of predicting source flux injection, hydro-geological and geo-chemical parameters, and the concentration field measurement. Reactive transport of chemical species in contaminated groundwater systems, especially with multiple species, is a complex and highly non-linear geochemical process. Although sufficient concentration measurement data is essential to accurately identify sources characteristics, available data are often sparse and limited in quantity. Therefore, this inverse problem-solving method for characterizing unknown groundwater pollution sources is often considered ill-posed, complex and non- unique. Different methods have been utilized to identify pollution sources; however, the linked simulation-optimization approach is one effective method to obtain acceptable results under uncertainties in complex real life scenarios. With this approach, the numerical flow and contaminant transport simulation models are externally linked to an optimization algorithm, with the objective of minimizing the difference between measured concentration and estimated pollutant concentration at observation locations. Concentration measurement data are very important to accurately estimate pollution source properties; therefore, optimal design of the monitoring network is essential to gather adequate measured data at desired times and locations. Due to budget and physical restrictions, an efficient and effective approach for groundwater pollutant source characterization is to design an optimal monitoring network, especially when only inadequate and arbitrary concentration measurement data are initially available. In this approach, preliminary concentration observation data are utilized for preliminary source location, magnitude and duration of source activity identification, and these results are utilized for monitoring network design. Further, feedback information from the monitoring network is used as inputs for sequential monitoring network design, to improve the identification of unknown source characteristics. To design an effective monitoring network of observation wells, optimization and interpolation techniques are used. A simulation model should be utilized to accurately describe the aquifer properties in terms of hydro-geochemical parameters and boundary conditions. However, the simulation of the transport processes becomes complex when the pollutants are chemically reactive. Three dimensional transient flow and reactive contaminant transport process is considered. The proposed methodology uses HYDROGEOCHEM 5.0 (HGCH) as the simulation model for flow and transport processes with chemically multiple reactive species. Adaptive Simulated Annealing (ASA) is used as optimization algorithm in linked simulation-optimization methodology to identify the unknown source characteristics. 
Therefore, the aim of the present study is to develop a methodology to optimally design an effective monitoring network for pollution source characterization with reactive species in polluted aquifers. The performance of the developed methodology will be evaluated for an illustrative polluted aquifer site, for example, an abandoned mine site in Queensland, Australia. Keywords: monitoring network design, source characterization, chemical reactive transport process, contaminated mine site
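The following Python sketch illustrates the linked simulation-optimization idea in miniature: a toy analytical "transport model" stands in for the numerical simulator, and a simulated-annealing optimizer searches for the source location and flux that minimize the misfit at the monitoring wells. The well coordinates, decay length and the use of SciPy's dual_annealing (rather than HYDROGEOCHEM 5.0 and ASA) are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import dual_annealing

# Assumed monitoring-well coordinates (m); any resemblance to a real site is coincidental.
wells = np.array([[120.0, 40.0], [200.0, 90.0], [260.0, 30.0]])

def simulate(source_x, source_y, flux):
    """Crude surrogate for the flow/transport simulator: concentration decays with distance."""
    d = np.hypot(wells[:, 0] - source_x, wells[:, 1] - source_y)
    return flux * np.exp(-d / 80.0)

# Synthetic "measured" concentrations from a hidden true source at (150, 60) with flux 50
observed = simulate(150.0, 60.0, 50.0)

def objective(params):
    sx, sy, flux = params
    return np.sum((observed - simulate(sx, sy, flux)) ** 2)

bounds = [(0.0, 300.0), (0.0, 120.0), (0.0, 100.0)]
result = dual_annealing(objective, bounds, seed=1)
print("Estimated source x, y, flux:", np.round(result.x, 1))
```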
Procedia PDF Downloads 231
7178 European Commission Radioactivity Environmental Monitoring Database REMdb: A Law (Art. 36 Euratom Treaty) Transformed in Environmental Science Opportunities
Authors: M. Marín-Ferrer, M. A. Hernández, T. Tollefsen, S. Vanzo, E. Nweke, P. V. Tognoli, M. De Cort
Abstract:
Under the terms of Article 36 of the Euratom Treaty, European Union Member States (MSs) shall periodically communicate to the European Commission (EC) information on environmental radioactivity levels. Compilations of the information received have been published by the EC as a series of reports beginning in the early 1960s. The environmental radioactivity results received from the MSs have been introduced into the Radioactivity Environmental Monitoring database (REMdb) of the Institute for Transuranium Elements of the EC Joint Research Centre (JRC) sited in Ispra (Italy) as part of its Directorate General for Energy (DG ENER) support programme. The REMdb brings to the scientific community dealing with environmental radioactivity topics endless of research opportunities to exploit the near 200 millions of records received from MSs containing information of radioactivity levels in milk, water, air and mixed diet. The REM action was created shortly after Chernobyl crisis to support the EC in its responsibilities in providing qualified information to the European Parliament and the MSs on the levels of radioactive contamination of the various compartments of the environment (air, water, soil). Hence, the main line of REM’s activities concerns the improvement of procedures for the collection of environmental radioactivity concentrations for routine and emergency conditions, as well as making this information available to the general public. In this way, REM ensures the availability of tools for the inter-communication and access of users from the Member States and the other European countries to this information. Specific attention is given to further integrate the new MSs with the existing information exchange systems and to assist Candidate Countries in fulfilling these obligations in view of their membership of the EU. Article 36 of the EURATOM treaty requires the competent authorities of each MS to provide regularly the environmental radioactivity monitoring data resulting from their Article 35 obligations to the EC in order to keep EC informed on the levels of radioactivity in the environment (air, water, milk and mixed diet) which could affect population. The REMdb has mainly two objectives: to keep a historical record of the radiological accidents for further scientific study, and to collect the environmental radioactivity data gathered through the national environmental monitoring programs of the MSs to prepare the comprehensive annual monitoring reports (MR). The JRC continues his activity of collecting, assembling, analyzing and providing this information to public and MSs even during emergency situations. In addition, there is a growing concern with the general public about the radioactivity levels in the terrestrial and marine environment, as well about the potential risk of future nuclear accidents. To this context, a clear and transparent communication with the public is needed. EURDEP (European Radiological Data Exchange Platform) is both a standard format for radiological data and a network for the exchange of automatic monitoring data. The latest release of the format is version 2.0, which is in use since the beginning of 2002.Keywords: environmental radioactivity, Euratom, monitoring report, REMdb
Procedia PDF Downloads 443
7177 Machine Learning Approach to Project Control Threshold Reliability Evaluation
Authors: Y. Kim, H. Lee, M. Park, B. Lee
Abstract:
Planning is understood as the determination, within the organization and before execution, of what has to be performed, how, in which sequence, when, what resources are needed, and at what cost. In most construction projects, it is evident that the inherent nature of planning is dynamic, and initial plans are subject to change due to the various uncertain conditions of a construction project. Planners carry out a continuous revision process during the course of a project, up until its very end. However, current practice lacks a reliable, systematic tool for setting variance thresholds to determine when corrective actions should be taken and what they should be; rather, it depends heavily on the experience and knowledge of the planner. Thus, this paper introduces a machine learning approach to evaluate project control threshold reliability incorporating project-specific data and presents a method to automate the process. The results have shown that the model improves the efficiency and accuracy of the monitoring process and serves as an early warning. Keywords: machine learning, project control, project progress monitoring, schedule
Procedia PDF Downloads 244
7176 Neighbor Caring Environment System (NCE) Using Parallel Replication Mechanism
Authors: Ahmad Shukri Mohd Noor, Emma Ahmad Sirajudin, Rabiei Mamat
Abstract:
In marine research, the process of data sampling can take years before a study can be concluded, so the need for a robust backup system for the data is implicit. With recent advances in marine applications, more functionalities and tools are being integrated to assist the work of researchers. This trend is expected to continue as research scope widens and intensifies, keeping pace with current technologies and lifestyles. The convenience of collecting and sharing information these days also applies to work in marine research. Marine system designers should therefore be aware that high availability is a necessary attribute of marine repository applications, as is a robust backup system for the data. In this paper, the approach to high availability relates to both hardware and software, but the focus is mainly on software. We consider a NABTIC repository system that was originally built on a single server and has no replicated components. First, the system is decomposed into separate modules. The modules are placed on multiple servers to create a distributed system. Redundancy is added by placing copies of the modules on different servers using the Neighbor Caring Environment System (NCES) technique, which utilizes a parallel replication mechanism. Background monitoring is established to check servers' heartbeats to confirm their aliveness. At the same time, a critical adaptive threshold is maintained to make sure failures are detected in a timely manner using Adaptive Fault Detection (AFD). A confirmed failure sets the recovery mode, in which a selection process is carried out before a fail-over server is instructed. In effect, the marine repository service continues as the fail-over masks the recent failure. The performance of the new prototype was tested and confirmed to be more highly available; furthermore, downtime is not noticeable because service is restored immediately and automatically. The marine repository system is thus considered to have achieved fault tolerance. Keywords: availability, fault detection, replication, fault tolerance, marine application
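A minimal Python sketch of the heartbeat-based Adaptive Fault Detection idea described above follows: the detector keeps a short history of inter-heartbeat intervals and flags a failure when the silence exceeds an adaptive threshold. The history length, safety factor and floor value are assumed parameters, not those of the NCES prototype.

```python
import time
from collections import deque

class AdaptiveFaultDetector:
    """Illustrative heartbeat-based failure detection with an adaptive threshold."""

    def __init__(self, history=20, safety_factor=3.0, floor=0.5):
        self.arrivals = deque(maxlen=history)   # recent inter-arrival times (s)
        self.safety_factor = safety_factor
        self.floor = floor                      # minimum threshold (s)
        self.last_beat = None

    def record_heartbeat(self, now=None):
        now = time.monotonic() if now is None else now
        if self.last_beat is not None:
            self.arrivals.append(now - self.last_beat)
        self.last_beat = now

    def threshold(self):
        if not self.arrivals:
            return self.floor
        mean = sum(self.arrivals) / len(self.arrivals)
        return max(self.floor, self.safety_factor * mean)

    def is_failed(self, now=None):
        now = time.monotonic() if now is None else now
        return self.last_beat is not None and (now - self.last_beat) > self.threshold()

# Usage sketch: a confirmed failure would trigger selection of a fail-over server.
detector = AdaptiveFaultDetector()
detector.record_heartbeat(now=0.0)
detector.record_heartbeat(now=1.0)
print(detector.is_failed(now=1.5), detector.is_failed(now=10.0))   # False True
```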
Procedia PDF Downloads 321
7175 Stability of a Natural Weak Rock Slope under Rapid Water Drawdowns: Interaction between Guadalfeo Viaduct and Rules Reservoir, Granada, Spain
Authors: Sonia Bautista Carrascosa, Carlos Renedo Sanchez
Abstract:
The effect of a rapid drawdown is a classical scenario to be considered in slope stability under submerged conditions. This situation arises when totally or partially submerged slopes experience a descent of the external water level and is a typical verification to be done in a dam engineering discipline, as reservoir water levels commonly fluctuate noticeably during seasons and due to operational reasons. Although the scenario is well known and predictable in general, site conditions can increase the complexity of its assessment and external factors are not always expected, can cause a reduction in the stability or even a failure in a slope under a rapid drawdown situation. The present paper describes and discusses the interaction between two different infrastructures, a dam and a highway, and the impact on the stability of a natural rock slope overlaid by the north abutment of a viaduct of the A-44 Highway due to the rapid drawdown of the Rules Dam, in the province of Granada (south of Spain). In the year 2011, with both infrastructures, the A-44 Highway and the Rules Dam already constructed, delivered and under operation, some movements start to be recorded in the approximation embankment and north abutment of the Guadalfeo Viaduct, included in the highway and developed to solve the crossing above the tail of the reservoir. The embankment and abutment were founded in a low-angle natural rock slope formed by grey graphic phyllites, distinctly weathered and intensely fractured, with pre-existing fault and weak planes. After the first filling of the reservoir, to a relative level of 243m, three consecutive drawdowns were recorded in the autumns 2010, 2011 and 2012, to relative levels of 234m, 232m and 225m. To understand the effect of these drawdowns in the weak rock mass strength and in its stability, a new geological model was developed, after reviewing all the available ground investigations, updating the geological mapping of the area and supplemented with an additional geotechnical and geophysical investigations survey. Together with all this information, rainfall and reservoir level evolution data have been reviewed in detail to incorporate into the monitoring interpretation. The analysis of the monitoring data and the new geological and geotechnical interpretation, supported by the use of limit equilibrium software Slide2, concludes that the movement follows the same direction as the schistosity of the phyllitic rock mass, coincident as well with the direction of the natural slope, indicating a deep-seated movement of the whole slope towards the reservoir. As part of these conclusions, the solutions considered to reinstate the highway infrastructure to the required FoS will be described, and the geomechanical characterization of these weak rocks discussed, together with the influence of water level variations, not only in the water pressure regime but in its geotechnical behavior, by the modification of the strength parameters and deformability.Keywords: monitoring, rock slope stability, water drawdown, weak rock
Procedia PDF Downloads 160
7174 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE
Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao
Abstract:
For impact monitoring of distributed structures, the traditional positioning methods are based on time differences and include the four-point arc positioning method and the triangulation positioning method. In actual operation, however, both methods have errors. In this paper, the multi-agent blackboard coordination principle is used to combine the two methods. The fusion steps are: (1) the four-point arc locating agent calculates the initial point and records it in the blackboard module; (2) the triangulation agent obtains its initial parameters by accessing the initial point; (3) the triangulation agent continually accesses the blackboard module to update its initial parameters and also logs its calculated point into the blackboard; (4) when the subsequently calculated point and the initial calculated point are within the allowable error, the whole coordination fusion process is finished. This paper presents a multi-agent collaboration method whose agent framework is JADE. The JADE platform consists of several agent containers, with agents running in each container. Because of JADE's management and debugging tools, it is very convenient to deal with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion can reduce the errors of the two methods. Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE
Procedia PDF Downloads 178
7173 4-DOFs Parallel Mechanism for Minimally Invasive Robotic Surgery
Authors: Khalil Ibrahim, Ahmed Ramadan, Mohamed Fanni, Yo Kobayashi, Ahmed Abo-Ismail, Masakatus G. Fujie
Abstract:
This paper deals with the design process and the dynamic control simulation of a new type of 4-DOF parallel mechanism that can be used as an endoscopic surgical manipulator. The proposed mechanism, 2-PUU_2-PUS, is designed based on screw theory and the parallel virtual chain type synthesis method. Based on the structural analysis of the 4-DOF parallel mechanism, the inverse position equation is studied using inverse kinematic analysis. The design and the stress analysis of the mechanism are investigated using SolidWorks software. The virtual prototype of the parallel mechanism is constructed, and the dynamic simulation is performed using ADAMS software. The system model utilizing PID and PI controllers has been built using MATLAB software. A more realistic simulation, in accordance with a given bending angle and point-to-point control, is implemented by using ADAMS and MATLAB together. The simulation results showed that this control method solves the coordinated control of the 4-DOF parallel manipulator so that each output is fed back to the four driving rods, and the desired tracking performance is achieved. Other control techniques, such as intelligent ones, are recommended to further improve the tracking performance and reduce the numerical truncation error. Keywords: parallel mechanisms, medical robotics, trajectory control, virtual chain type synthesis method
Procedia PDF Downloads 468
7172 Modernization of the Economic Price Adjustment Software
Authors: Roger L. Goodwin
Abstract:
The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy, and many social programs and government benefits are indexed to them. In the mid-to-late 1990s, a Congressional Advisory Committee conducted substantial research into changes to the CPI. One thing that can be said from that research is that, aside from the existence of alternative estimators for the CPI, any fundamental change to the CPI will affect many government programs. The purpose of this project is to modernize an existing process. This paper shows the development of a small, visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook does not provide the flexibility to calculate EPAs where the base month and the option month are different, nor does it provide automated error checking. The small, visual software product provides this additional flexibility and error checking. This paper also presents feedback on the project. Keywords: Consumer Price Index, Economic Price Adjustment, contracts, visualization tools, database, reports, forms, event procedures
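As a hedged illustration of the calculation the workbook automates, the Python sketch below adjusts a contract price by the ratio of the option-month CPI to the base-month CPI, allowing the two months to differ. The index values are invented; real EPA clauses specify their own index series, base periods and rounding rules.

```python
# Minimal sketch of an economic price adjustment where the base month and the
# option month may differ; the CPI values and contract price below are hypothetical.
def economic_price_adjustment(base_price, cpi_base_month, cpi_option_month):
    """Adjust a contract price by the ratio of option-month CPI to base-month CPI."""
    if cpi_base_month <= 0:
        raise ValueError("Base-month CPI must be positive")
    return base_price * (cpi_option_month / cpi_base_month)

cpi = {"2019-06": 252.0, "2020-06": 254.5}      # hypothetical index values
adjusted = economic_price_adjustment(100000.0, cpi["2019-06"], cpi["2020-06"])
print(f"Adjusted price: {adjusted:,.2f}")
```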
Procedia PDF Downloads 317
7171 Closed Loop Traffic Control System Using PLC
Authors: Chinmay Shah
Abstract:
The project concerns the development of a closed-loop traffic light control system using a PLC (Programmable Logic Controller). The project is divided into two parts: hardware and software. The hardware part is a model of a four-way traffic light junction. Three indicator lamps (red, yellow and green) are installed in each lane to represent the traffic light signal. This traffic control model is a replica of actuated traffic control. An actuated traffic control system is a closed-loop traffic control system that adjusts the timing of the indicator lamps depending on the traffic flow in a particular lane. To make it autonomous, three IR sensors are placed in each lane to sense the percentage of traffic present in that lane. The IR sensors and indicator lamps are connected to an LG XGB-series PLC. The PLC processes every signal coming from the inputs (IR sensors) in software and drives the outputs (indicator lamps). The default timing for the indicator lamps is 30 seconds per lane, but depending on the percentage of traffic present, the green lamp stays on for 10 seconds at roughly 30-35% traffic, 20 seconds at 65-70% traffic, and the full 30 seconds at 100% traffic. The software that operates the LG PLC is the 'XG 5000' programmer. Using this software, the ladder logic diagram is programmed to control the traffic lights based on the flow chart. At the end of the project, the traffic light system was successfully actuated by the PLC. Keywords: closed loop, IR sensor, PLC, light control system
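The green-time rule described above can be summarised in a few lines of Python, shown here only as a sketch of the decision logic that the ladder logic implements; the mapping for intermediate occupancies and the handling of an empty lane are assumptions.

```python
# Sketch of the actuated green-time rule: three IR sensors per lane give a rough
# occupancy percentage, which maps to the green duration (assumed mapping).
def green_time(sensors_triggered, total_sensors=3, default=30):
    """Return the green-lamp duration (s) for a lane from its IR-sensor occupancy."""
    occupancy = 100.0 * sensors_triggered / total_sensors
    if occupancy <= 35:       # roughly one of three sensors covered (or fewer)
        return 10
    elif occupancy <= 70:     # roughly two of three sensors covered
        return 20
    return default            # lane fully occupied

for lane, triggered in {"North": 1, "East": 2, "South": 3, "West": 0}.items():
    print(lane, green_time(triggered), "s")
```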
Procedia PDF Downloads 571
7170 Free Vibration Analysis of Gabled Frame Considering Elastic Supports and Semi-Rigid Connections
Authors: A. Shooshtari, A. R. Masoodi, S. Heyrani Moghaddam
Abstract:
Free vibration analysis of a gabled frame with elastic supports and semi-rigid connections is performed using a program written for the OpenSees software. Natural frequencies and mode shape details of the frame are obtained for two cases, semi-rigid connections and elastic supports, treated separately. The members of the structure are modeled as prismatic nonlinear beam-column elements in the software. The mass of the structure is represented as two equal lumped masses at the heads of the two columns, acting in the horizontal and vertical directions. Note that three degrees of freedom are allocated to every node. Furthermore, the mode shapes of the frame are obtained. Finally, the effects of connection and support flexibility on the natural frequencies and mode shapes of the structure are investigated. Keywords: natural frequency, mode shape, gabled frame, semi-rigid connection, elastic support, OpenSees software
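For context, the underlying computation is the generalized eigenvalue problem K·φ = ω²·M·φ for the lumped-mass model. The Python sketch below solves it for a simple two-degree-of-freedom idealisation, with two stiffness values standing in for rigid versus semi-rigid connections; the mass and stiffness numbers are arbitrary, and the sketch does not reproduce the OpenSees model.

```python
import numpy as np
from scipy.linalg import eigh

m = 1500.0                                  # lumped mass at each column head (kg, assumed)
k_rigid = 2.0e6                             # lateral stiffness, rigid connections (N/m, assumed)
k_semi_rigid = 1.2e6                        # reduced stiffness for semi-rigid connections (assumed)

def natural_frequencies(k):
    K = np.array([[2 * k, -k], [-k, k]])    # simple 2-DOF shear-frame idealisation
    M = np.diag([m, m])
    eigvals, _ = eigh(K, M)                 # generalized eigenvalue problem K*phi = w^2*M*phi
    return np.sqrt(eigvals) / (2 * np.pi)   # natural frequencies in Hz

print("Rigid connections     :", np.round(natural_frequencies(k_rigid), 2), "Hz")
print("Semi-rigid connections:", np.round(natural_frequencies(k_semi_rigid), 2), "Hz")
```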
Procedia PDF Downloads 407
7169 Long-Term Monitoring and Seasonal Analysis of PM10-Bound Benzo(a)pyrene in the Ambient Air of Northwestern Hungary
Authors: Zs. Csanádi, A. Szabó Nagy, J. Szabó, J. Erdős
Abstract:
Atmospheric aerosols have several important environmental impacts and health effects in terms of air quality. Monitoring PM10-bound polycyclic aromatic hydrocarbons (PAHs) is therefore of considerable environmental and health-protection significance. Benzo(a)pyrene (BaP) is the most relevant indicator of these PAH compounds. In Hungary, the Hungarian Air Quality Network provides air quality monitoring data for several air pollutants including BaP, but these data show only the annual mean concentrations and maximum values. The seasonal variation of BaP concentrations between the heating and non-heating periods can also be important. For this reason, the main objective of this study was to assess the annual concentration and seasonal variation of PM10-bound BaP in the ambient air at seven different sampling sites (six urban and one rural) in northwestern Hungary over the sampling period 2008–2013. A total of 1475 PM10 aerosol samples were collected at the different sampling sites and analyzed for BaP by gas chromatography. The BaP concentrations ranged from undetected to 8 ng/m³, with mean values of 0.50-0.96 ng/m³ across all sampling sites. Relatively higher BaP concentrations were detected in samples collected at each sampling site during the heating seasons compared with the non-heating periods. The annual mean BaP concentrations were comparable with published data from other Hungarian sites. Keywords: air quality, benzo(a)pyrene, PAHs, polycyclic aromatic hydrocarbons
Procedia PDF Downloads 308
7168 Acoustic Emission for Tool-Chip Interface Monitoring during Orthogonal Cutting
Authors: D. O. Ramadan, R. S. Dwyer-Joyce
Abstract:
The measurement of the interface conditions in a cutting tool contact provides essential information for performance monitoring and control. This interface provides the path for the heat flux into the cutting tool, and the resulting rise in cutting tool temperature drives the mechanisms of tool wear, thus affecting the life of the cutting tool and productivity. This zone is represented by the tool-chip interface. Therefore, understanding and monitoring this interface is considered an important issue in machining. In this paper, an acoustic emission (AE) technique was used to find the correlation between AE parameters and the tool-chip interface. For this purpose, a response surface design (RSD) was used to analyse and optimize the machining parameters. The experimental design was based on the face-centered central composite design (CCD) in the Minitab environment. According to this design, a series of orthogonal cutting experiments under different cutting conditions were conducted on a Triumph 2500 lathe to study the sensitivity of the acoustic emission (AE) signal to changes in tool-chip contact length. The cutting parameters investigated were the cutting speed, depth of cut, and feed, and the experiments were performed on 6082-T6 aluminium tube. All the orthogonal cutting experiments were conducted unlubricated. The tool-chip contact area was investigated using a scanning electron microscope (SEM). The results indicate that there is a strong dependence of the root mean square (RMS) of the AE signal on the cutting speed, with the RMS increasing as the cutting speed increases. A dependence on the tool-chip contact length has also been observed. However, no effect of changing the cutting depth or feed on the RMS was observed. These dependencies have been explained in terms of the strain and temperature in the primary and secondary shear zones, the tool-chip sticking and sliding phenomena, and the effect of these mechanical variables on dislocation activity at high strain rates. In conclusion, the acoustic emission technique has the potential to monitor the tool-chip interface in situ during turning and could consequently indicate the approaching end of life of a cutting tool. Keywords: acoustic emission, tool-chip interface, orthogonal cutting, monitoring
Procedia PDF Downloads 487
7167 Procedure for Monitoring the Process of Behavior of Thermal Cracking in Concrete Gravity Dams: A Case Study
Authors: Adriana de Paula Lacerda Santos, Bruna Godke, Mauro Lacerda Santos Filho
Abstract:
Several dams in the world have already collapsed, causing environmental, social and economic damage. The concern to avoid future disasters has stimulated the creation of a great number of laws and rules in many countries. In Brazil, Law 12.334/2010 was created, which establishes the National Policy on Dam Safety. Overall, this policy requires the dam owners to invest in the maintenance of their structures and to improve its monitoring systems in order to provide faster and straightforward responses in the case of an increase of risks. As monitoring tools, visual inspections has provides comprehensive assessment of the structures performance, while auscultation’s instrumentation has added specific information on operational or behavioral changes, providing an alarm when a performance indicator exceeds the acceptable limits. These limits can be set using statistical methods based on the relationship between instruments measures and other variables, such as reservoir level, time of the year or others instruments measuring. Besides the design parameters (uplift of the foundation, displacements, etc.) the dam instrumentation can also be used to monitor the behavior of defects and damage manifestations. Specifically in concrete gravity dams, one of the main causes for the appearance of cracks, are the concrete volumetric changes generated by the thermal origin phenomena, which are associated with the construction process of these structures. Based on this, the goal of this research is to propose a monitoring process of the thermal cracking behavior in concrete gravity dams, through the instrumentation data analysis and the establishment of control values. Therefore, as a case study was selected the Block B-11 of José Richa Governor Dam Power Plant, that presents a cracking process, which was identified even before filling the reservoir in August’ 1998, and where crack meters and surface thermometers were installed for its monitoring. Although these instruments were installed in May 2004, the research was restricted to study the last 4.5 years (June 2010 to November 2014), when all the instruments were calibrated and producing reliable data. The adopted method is based on simple linear correlations procedures to understand the interactions among the instruments time series, verifying the response times between them. The scatter plots were drafted from the best correlations, which supported the definition of the limit control values. Among the conclusions, it is shown that there is a strong or very strong correlation between ambient temperature and the crack meters and flowmeters measurements. Based on the results of the statistical analysis, it was possible to develop a tool for monitoring the behavior of the case study cracks. Thus it was fulfilled the goal of the research to develop a proposal for a monitoring process of the behavior of thermal cracking in concrete gravity dams.Keywords: concrete gravity dam, dams safety, instrumentation, simple linear correlation
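A minimal Python sketch of the statistical procedure described above (correlating crack-meter readings with ambient temperature and deriving control limits from the regression residuals) follows; the temperature and crack-opening series are synthetic, not the José Richa dam monitoring records, and the 3-sigma band is an assumed choice of control limit.

```python
import numpy as np

# Synthetic monitoring series: crack opening shrinks slightly as temperature rises.
rng = np.random.default_rng(0)
temperature = rng.uniform(5.0, 35.0, 200)                              # deg C
crack_opening = 0.8 - 0.012 * temperature + rng.normal(0, 0.02, 200)   # mm

r = np.corrcoef(temperature, crack_opening)[0, 1]
slope, intercept = np.polyfit(temperature, crack_opening, 1)
fitted = slope * temperature + intercept
residuals = crack_opening - fitted
upper = fitted + 3 * residuals.std()        # assumed +/-3-sigma control band
lower = fitted - 3 * residuals.std()

print(f"Pearson r = {r:.2f} (strong/very strong if |r| > 0.7/0.9)")
outside = np.sum((crack_opening > upper) | (crack_opening < lower))
print("Readings outside the control band:", int(outside))
```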
Procedia PDF Downloads 292
7166 Classification of Forest Types Using Remote Sensing and Self-Organizing Maps
Authors: Wanderson Goncalves e Goncalves, José Alberto Silva de Sá
Abstract:
Human actions are a threat to the balance and conservation of the Amazon forest. Therefore, environmental monitoring services play an important role in the preservation and maintenance of this environment. This study classified forest types using data from a forest inventory provided by the 'Florestal e da Biodiversidade do Estado do Pará' (IDEFLOR-BIO) for an area of approximately 600,000 hectares located between the municipalities of Santarém, Juruti and Aveiro, in the state of Pará, Brazil, together with Bands 3, 4 and 5 of a TM-Landsat satellite image and Self-Organizing Maps. The information from the satellite images was extracted using QGIS 2.8.1 Wien software and used as a database for training the neural network. The midpoints of each forest inventory sample were linked to the images, and the digital numbers of the pixels were then extracted, composing the database that fed the training and testing of the classifier. The neural network was trained to classify two forest types, Rain Forest of Lowland Emerging Canopy (Dbe) and Rain Forest of Lowland Emerging Canopy plus Open with palm trees (Dbe + Abp), in the Mamuru Arapiuns glebes of Pará State. The training data set contained 400 examples (200 for each class) and the test data set contained 100 examples (50 for each class), so the total data set consisted of 500 examples. The classifier was built in Orange Data Mining 2.7 software and evaluated in terms of confusion matrix indicators. The results of the classifier were considered satisfactory, with a global accuracy of 89%, a Kappa coefficient of 78% and an F1 score of 0.88. The efficiency of the classifier was also evaluated with the ROC (receiver operating characteristic) plot, which gave results close to ideal, showing it to be a very good classifier and demonstrating the potential of this methodology to support ecosystem services, particularly in anthropogenic areas of the Amazon. Keywords: artificial neural network, computational intelligence, pattern recognition, unsupervised learning
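The evaluation step can be reproduced from the confusion matrix alone. The Python sketch below computes global accuracy, Cohen's kappa and the F1 score for a two-class matrix; the matrix entries are invented so that the indicators come out close to the values reported above.

```python
import numpy as np

# Hypothetical 2-class confusion matrix (rows = actual, columns = predicted):
#                 predicted Dbe   predicted Dbe+Abp
cm = np.array([[45,  5],     # actual Dbe
               [ 6, 44]])    # actual Dbe+Abp

total = cm.sum()
accuracy = np.trace(cm) / total
expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2   # chance agreement
kappa = (accuracy - expected) / (1 - expected)

tp, fp, fn = cm[0, 0], cm[1, 0], cm[0, 1]        # taking Dbe as the positive class
precision, recall = tp / (tp + fp), tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

print(f"Accuracy {accuracy:.2f}, kappa {kappa:.2f}, F1 {f1:.2f}")
```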
Procedia PDF Downloads 361
7165 Monitoring of Potato Rot Nematode (Ditylenchus destructor Thorne, 1945) in Southern Georgia Nematode Fauna Diversity of Rhizosphere
Authors: E. Tskitishvili, L. Jgenti, I. Eliava, T. Tskitishvili, N. Bagathuria, M. Gigolashvili
Abstract:
The nematode fauna of 20 agrocenoses (soil, potato tubers, green parts of plants, roots) was studied in four regions of southern Georgia (Akhaltsikhe, Aspindza, Akhalkalaki, Ninotsminda). In all, 173 forms of free-living and phyto-parasitic nematodes were registered, including 132 forms identified to species level. A few specimens of the potato rot nematode (Ditylenchus destructor) were identified in soil samples taken at the Ninotsminda, Akhalkalaki and Aspindza stations, i.e. the invasion is weak. Based on our data, the potato Ditylenchus was not found in any of the examined tubers, whereas according to data from previous years most tubers were infested. No cysts of 'golden nematodes' were found during the inspection of material for the detection of globoderosis. Keywords: ditylenchus, monitoring, nematoda, potato
Procedia PDF Downloads 357
7164 Monitoring Land Productivity Dynamics of Gombe State, Nigeria
Authors: Ishiyaku Abdulkadir, Satish Kumar J
Abstract:
Land productivity is a measure of the greenness, health and potential gain of above-ground biomass and is not related to agricultural productivity. Monitoring land productivity dynamics is essential to identify, in particular, when and where the trend shows degradation so that mitigation measures can be taken. This research aims to monitor the land productivity trend of Gombe State between 2001 and 2015. QGIS was used to compute the NDVI from AVHRR/MODIS datasets in a cloud-based workflow. The results show that land with improving productivity accounts for 773 sq. km (4.31%), stable productivity for 4,195.6 sq. km (23.40%), stable but stressed productivity for 18.7 sq. km (0.10%), early signs of declining productivity for 5,203.1 sq. km (29%), declining productivity for 7,019.7 sq. km (39.2%), and water bodies for 718.7 sq. km (4%) of the state's area. Keywords: above-ground biomass, dynamics, land productivity, man-environment relationship
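A hedged sketch of the per-pixel computation behind such a productivity trend is shown below in Python: NDVI is computed from red and near-infrared reflectance, and a linear trend is fitted over the years. The reflectance series is synthetic and the classification rule is simplified; it is not the AVHRR/MODIS workflow used in the study.

```python
import numpy as np

def ndvi(red, nir):
    """NDVI from red and near-infrared reflectance (small epsilon avoids divide-by-zero)."""
    red, nir = np.asarray(red, float), np.asarray(nir, float)
    return (nir - red) / (nir + red + 1e-9)

years = np.arange(2001, 2016)
rng = np.random.default_rng(0)
# Synthetic single-pixel reflectance time series (a pixel that is slowly drying out)
red = 0.08 + 0.002 * (years - 2001) + rng.normal(0, 0.003, years.size)
nir = 0.35 - 0.004 * (years - 2001) + rng.normal(0, 0.005, years.size)

series = ndvi(red, nir)
slope = np.polyfit(years, series, 1)[0]
label = "declining" if slope < 0 else "improving/stable"
print(f"NDVI trend: {slope:+.4f} per year -> {label} productivity")
```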
Procedia PDF Downloads 145
7163 Software Verification of Systematic Resampling for Optimization of Particle Filters
Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey
Abstract:
Systematic resampling is the most widely used resampling method in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of possible samples taken of a particle is equal to the floor/ceiling value of the particle's weight divided by the sampling interval, respectively. This allows for the creation of a randomness spectrum within which each resampling method can fall. Methods on the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Although lower randomness allows for error by having a larger bias towards the size of the weight, this bias creates vulnerabilities to noise in the environment, e.g., jamming. In conclusion, this is the first step in characterizing each resampling method. This will allow target-tracking engineers to pick the best resampling method for their environment instead of choosing the most commonly used one. Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking
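The verified property can also be checked numerically. The Python sketch below implements standard systematic resampling and confirms that each particle is drawn between floor(w_i / interval) and ceil(w_i / interval) times, where the interval is 1/N; this is an illustrative check, not the SPARK proof itself.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: one random offset, then evenly spaced pointers."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n        # (u0 + k) / n, k = 0..n-1
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                                  # guard against round-off
    return np.searchsorted(cumulative, positions)         # indices of selected particles

rng = np.random.default_rng(42)
weights = rng.random(10)
weights /= weights.sum()
counts = np.bincount(systematic_resample(weights, rng), minlength=len(weights))

interval = 1.0 / len(weights)                             # sampling interval = 1/N
within = (counts >= np.floor(weights / interval)) & (counts <= np.ceil(weights / interval))
print("all counts within [floor, ceil]:", bool(within.all()))
```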
Procedia PDF Downloads 84
7162 Carbon Based Wearable Patch Devices for Real-Time Electrocardiography Monitoring
Authors: Hachul Jung, Ahee Kim, Sanghoon Lee, Dahye Kwon, Songwoo Yoon, Jinhee Moon
Abstract:
We fabricated a wearable patch device, including a novel patch-type flexible dry electrode based on carbon nanofibers (CNFs) and a silicone-based elastomer (MED 6215), for real-time ECG monitoring. There are many methods to make flexible conductive polymers by mixing in metal or carbon-based nanoparticles. In this study, CNFs were selected as the conductive nanoparticles because carbon nanotubes (CNTs) are more difficult to disperse uniformly in the elastomer than CNFs, and silver nanowires are relatively expensive and easily oxidized in air. The wearable patch is composed of two parts: dry electrode parts for recording the biosignal and sticky patch parts for mounting on the skin. The dry electrode parts were made with a vortexer and baked in a prepared mold; to optimize electrical performance and the uniformity of dispersion, we developed a unique mixing and baking process. The sticky patch parts were made by patterning and detaching from a smooth-surface substrate after spin-coating a soft skin adhesive; in this process, the attachment and detachment strengths of the sticky patch were measured and optimized using a monitoring system. The assembled patch is flexible, stretchable, easily mountable on the skin and directly connectable to the system. To evaluate the electrical characteristics and ECG (electrocardiography) recording performance, the wearable patch was tested with varying CNF concentrations and dry electrode thicknesses. The results showed that the CNF concentration and the thickness of the dry electrodes were important variables for obtaining high-quality ECG signals without incidental disturbances. A cytotoxicity test was conducted to prove biocompatibility, and a long-term wearing test showed no skin reactions such as itching or erythema. To minimize noise from motion artifacts and line noise, we built a customized wireless, lightweight data acquisition system. ECG signals measured with this system were stable and successfully monitored in real time. In summary, the fabricated wearable patch devices can readily be used for real-time ECG monitoring. Keywords: carbon nanofibers, ECG monitoring, flexible dry electrode, wearable patch
Procedia PDF Downloads 185
7161 Meta Model for Optimum Design Objective Function of Steel Frames Subjected to Seismic Loads
Authors: Salah R. Al Zaidee, Ali S. Mahdi
Abstract:
Except for simple problems involving statically determinate structures, optimum design problems in structural engineering have implicit objective functions in which structural analysis and design are required within each search loop. With these implicit functions, the structural engineer is usually forced to write his or her own computer code for analysis, design, and searching for the optimum design among many feasible candidates, and cannot take advantage of available software for structural analysis, design, and optimum searching. The meta-model is a regression model used to transform an implicit objective function into an explicit one, which in turn decouples the structural analysis and design processes from the optimum searching process. With the meta-model, well-known software for structural analysis and design can be used in sequence with optimum searching software. In this paper, the meta-model has been used to develop an explicit objective function for plane steel frames subjected to dead, live, and seismic forces. The frame topology is assumed to be predefined based on architectural and functional requirements. Column and beam sections and different connection details are the main design variables in this study. Columns and beams are grouped to reduce the number of design variables and to make the problem similar to that adopted in engineering practice. Data for the implicit objective function were generated by analyzing and assessing many design proposals with CSI SAP software. These data were then used in SPSS software to develop a pure quadratic nonlinear regression model for the explicit objective function. Good correlations, with a coefficient R² in the range from 0.88 to 0.99, were noted between the original implicit functions and the corresponding explicit functions generated with the meta-model. Keywords: meta-model, objective function, steel frames, seismic analysis, design
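The meta-modelling step itself is ordinary least squares on a pure quadratic basis (intercept, linear and squared terms, no cross terms). The Python sketch below fits such a model to synthetic objective values standing in for the analysis-generated data and reports R²; the data, number of variables and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, size=(60, 3))                   # e.g. grouped section-size variables
y_true = 4.0 + 2.0 * X[:, 0] + 1.5 * X[:, 1] ** 2 - 0.8 * X[:, 2]
y = y_true + rng.normal(0, 0.05, 60)                       # stand-in for analysis/design results

# Design matrix with intercept, linear and squared terms only (pure quadratic form)
A = np.column_stack([np.ones(len(X)), X, X ** 2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ coeffs
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(f"R^2 of the explicit meta-model: {1 - ss_res / ss_tot:.3f}")
```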
Procedia PDF Downloads 243
7160 Study Secondary Particle Production in Carbon Ion Beam Radiotherapy
Authors: Shaikah Alsubayae, Gianluigi Casse, Carlos Chavez, Jon Taylor, Alan Taylor, Mohammad Alsulimane
Abstract:
Ensuring accurate radiotherapy with carbon therapy requires precise monitoring of radiation dose distribution within the patient's body. This monitoring is essential for targeted tumor treatment, minimizing harm to healthy tissues, and improving treatment effectiveness while lowering side effects. In our investigation, we employed a methodological approach to monitor secondary proton doses in carbon therapy using Monte Carlo simulations. Initially, Geant4 simulations were utilized to extract the initial positions of secondary particles formed during interactions between carbon ions and water. These particles included protons, gamma rays, alpha particles, neutrons, and tritons. Subsequently, we studied the relationship between the carbon ion beam and these secondary particles. Interaction Vertex Imaging (IVI) is valuable for monitoring dose distribution in carbon therapy. It provides details about the positions and amounts of secondary particles, particularly protons. The IVI method depends on charged particles produced during ion fragmentation to gather information about the range by reconstructing particle trajectories back to their point of origin, referred to as the vertex. In our simulations regarding carbon ion therapy, we observed a strong correlation between some secondary particles and the range of carbon ions. However, challenges arose due to the target's unique elongated geometry, which hindered the straightforward transmission of forward-generated protons. Consequently, the limited protons that emerged mostly originated from points close to the target entrance. The trajectories of fragments (protons) were approximated as straight lines, and a beam back-projection algorithm, using recorded interaction positions in Si detectors, was developed to reconstruct vertices. The analysis revealed a correlation between the reconstructed and actual positions.Keywords: radiotherapy, carbon therapy, monitoring of radiation dose, interaction vertex imaging
Procedia PDF Downloads 84
7159 Comparing Energy Labelling of Buildings in Spain
Authors: Carolina Aparicio-Fernández, Alejandro Vilar Abad, Mar Cañada Soriano, Jose-Luis Vivancos
Abstract:
The building sector is responsible for 40% of the total energy consumption in the European Union (EU). Thus, implementation of strategies for quantifying and reducing buildings energy consumption is indispensable for reaching the EU’s carbon neutrality and energy efficiency goals. Each Member State has transposed the European Directives according to its own peculiarities: existing technical legislation, constructive solutions, climatic zones, etc. Therefore, in accordance with the Energy Performance of Buildings Directive, Member States have developed different Energy Performance Certificate schemes, using proposed energy simulation software-tool for each national or regional area. Energy Performance Certificates provide a powerful and comprehensive information to predict, analyze and improve the energy demand of new and existing buildings. Energy simulation software and databases allow a better understanding of the current constructive reality of the European building stock. However, Energy Performance Certificates still have to face several issues to consider them as a reliable and global source of information since different calculation tools are used that do not allow the connection between them. In this document, TRNSYS (TRaNsient System Simulation program) software is used to calculate the energy demand of a building, and it is compared with the energy labeling obtained with Spanish Official software-tools. We demonstrate the possibility of using not official software-tools to calculate the Energy Performance Certificate. Thus, this approach could be used throughout the EU and compare the results in all possible cases proposed by the EU Member States. To implement the simulations, an isolated single-family house with different construction solutions is considered. The results are obtained for every climatic zone of the Spanish Technical Building Code.Keywords: energy demand, energy performance certificate EPBD, trnsys, buildings
Procedia PDF Downloads 127
7158 Automatic Fluid-Structure Interaction Modeling and Analysis of Butterfly Valve Using Python Script
Authors: N. Guru Prasath, Sangjin Ma, Chang-Wan Kim
Abstract:
A butterfly valve is a quarter-turn valve used to control the flow of a fluid through a section of pipe. Butterfly valves are generally used in a wide range of applications such as water distribution, sewage, and oil and gas plants. In particular, large-diameter butterfly valves find immense application in hydro power plants to control the fluid flow. Given the cost and size constraints of running a laboratory setup, the analysis of large-diameter valves is mostly carried out by computational methods, which are the most practical and inexpensive solution. For fluid and structural analysis, CFD and FEM software are used, respectively, to perform large-scale valve analyses. To perform such analyses of a butterfly valve, the CAD model has to be recreated and meshed in conventional software for each set of valve dimensions, which is a time-consuming process. To overcome that issue, a Python script was created to produce the complete pre-processing setup automatically in the Salome software. Specifying the model dimensions directly in the Python code lowers the running time and makes the valve analysis easier to perform. Hence, in this paper, an attempt was made to study the fluid-structure interaction (FSI) of butterfly valves by varying the valve angles and dimensions using Python code in the pre-processing software, and the results are presented. Keywords: butterfly valve, flow coefficient, automatic CFD analysis, FSI analysis
Procedia PDF Downloads 241