Search results for: Semantic data integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8171

5861 The Direct Updating of Damping and Gyroscopic Matrices using Incomplete Complex Test Data

Authors: Jiashang Jiang, Yongxin Yuan

Abstract:

In this paper, we develop an efficient numerical method for the finite-element model updating of damped gyroscopic systems based on incomplete complex modal measurement data. It is assumed that the analytical mass and stiffness matrices are correct and only the damping and gyroscopic matrices need to be updated. By solving a constrained optimization problem, the optimal corrected symmetric damping matrix and skew-symmetric gyroscopic matrix that comply with the required eigenvalue equation are found in a weighted Frobenius norm sense.
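A generic statement of this class of updating problem, written as a hedged sketch rather than the authors' exact formulation (the weighting that defines the norm and the notation for the measured modal pair (Λ, X) are assumptions):

```latex
\min_{D = D^{\mathsf T},\; G = -G^{\mathsf T}}
  \left\| D - D_a \right\|_W^2 + \left\| G - G_a \right\|_W^2
\quad \text{subject to} \quad
  M_a X \Lambda^2 + (D + G) X \Lambda + K_a X = 0,
```

where M_a and K_a are the (assumed correct) analytical mass and stiffness matrices, D_a and G_a are the analytical damping and gyroscopic matrices, W defines the weighted Frobenius norm, and (Λ, X) collect the incomplete measured eigenvalues and eigenvectors.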

Keywords: Model updating, damped gyroscopic system, partially prescribed spectral information.

5860 A Low-Power Two-Stage Seismic Sensor Scheme for Earthquake Early Warning System

Authors: Arvind Srivastav, Tarun Kanti Bhattacharyya

Abstract:

The north-eastern, Himalayan, and Eastern Ghats belts of India comprise earthquake-prone, remote, and hilly terrains. Earthquakes have caused enormous damage in these regions in the past. A wireless sensor network based earthquake early warning system (EEWS) is being developed to mitigate the damage caused by earthquakes. It consists of sensor nodes, distributed over the region, that perform majority voting on the output of the seismic sensors in the vicinity and relay a message to a base station to alert the residents when an earthquake is detected. At the heart of the EEWS is a low-power two-stage seismic sensor that continuously tracks seismic events from the incoming three-axis accelerometer signal at the first stage and, in the presence of a seismic event, triggers the second-stage P-wave detector that detects the onset of the P-wave in an earthquake event. The parameters of the P-wave detector have been optimized for minimizing detection time and maximizing the accuracy of detection. The working of the sensor scheme has been verified with data from seven earthquakes retrieved from IRIS. In all test cases, the scheme detected the onset of the P-wave accurately. It has also been established with the test data that the P-wave onset detection time reduces as the sampling rate increases: the detection time for data sampled at 10 Hz was around 2 seconds, which reduced to 0.3 seconds for the data sampled at 100 Hz.
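The first stage described above is essentially an energy-ratio trigger; below is a minimal sketch (not the authors' implementation) of a classical STA/LTA trigger on one accelerometer channel, where the window lengths and threshold are illustrative assumptions:

```python
import numpy as np

def sta_lta_trigger(x, fs, sta_win=1.0, lta_win=10.0, threshold=3.0):
    """Return sample indices where the STA/LTA ratio exceeds the threshold."""
    energy = x ** 2                                    # characteristic function
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    ratio = sta / np.maximum(lta, 1e-12)               # avoid division by zero
    return np.flatnonzero(ratio > threshold)

# Illustrative use: background noise followed by a stronger arrival at t = 30 s
fs = 100.0
t = np.arange(0, 60, 1 / fs)
signal = 0.01 * np.random.randn(t.size)
signal[t >= 30] += 0.2 * np.sin(2 * np.pi * 5 * t[t >= 30])
picks = sta_lta_trigger(signal, fs)
print("first trigger at t =", picks[0] / fs if picks.size else None, "s")
```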

Keywords: Earthquake early warning system, EEWS, STA/LTA, polarization, wavelet, event detector, P-wave detector.

5859 Liquid-Liquid Equilibrium Data for Butan-2-ol - Ethanol - Water, Pentan-1-ol - Ethanol - Water and Toluene - Acetone - Water Systems

Authors: Tinuade Jolaade Afolabi, Theresa Ibibia Edewor

Abstract:

Experimental liquid-liquid equilibria of the butan-2-ol - ethanol - water, pentan-1-ol - ethanol - water and toluene - acetone - water ternary systems were investigated at 25 °C. The reliability of the experimental tie-line data was ascertained using Othmer-Tobias and Hand plots. The distribution coefficients (D) and separation factors (S) of the immiscibility region were evaluated for the three systems.

Keywords: Distribution coefficient, Liquid-liquid equilibrium, separation factors, thermodynamic models

5858 Estimation of Methane from Hydrocarbon Exploration and Production in India

Authors: A. K. Pathak, K. Ojha

Abstract:

Methane is the second most important greenhouse gas (GHG) after carbon dioxide. The amount of methane emitted from the energy sector is increasing with various activities. In the present work, various sources of methane emission from the upstream, midstream and downstream segments of the oil and gas sector are identified and categorised as per the IPCC 2006 guidelines. Data were collected from various oil and gas activities: (i) exploration and production of oil and gas, (ii) supply through pipelines, (iii) refinery throughput and production, (iv) storage and transportation, and (v) usage. Methane emission factors for the various categories were determined by applying the Tier-II and Tier-I approaches to the collected data. Total methane emission from the Indian oil and gas sector was thus estimated for the years 1990 to 2007.
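As a minimal illustration of the Tier-I style calculation referred to above (emission = activity data × emission factor, summed over categories), with invented activity figures and emission factors, not the paper's values:

```python
# Hypothetical activity data and emission factors, purely for illustration
activity = {                 # annual activity data, arbitrary units
    "exploration_production": 120.0,
    "pipeline_supply": 85.0,
    "refining": 60.0,
    "storage_transport": 40.0,
    "end_use": 30.0,
}
emission_factor = {          # t CH4 per unit activity, illustrative only
    "exploration_production": 0.9,
    "pipeline_supply": 0.4,
    "refining": 0.2,
    "storage_transport": 0.1,
    "end_use": 0.05,
}

total_ch4 = sum(activity[k] * emission_factor[k] for k in activity)
print(f"Estimated CH4 emission: {total_ch4:.1f} t/year")
```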

Keywords: Carbon credit, Climate change, Methane emission, Oil & Gas production

5857 Rainfall and Flood Forecast Models for Better Flood Relief Plan of the Mae Sot Municipality

Authors: S. Chuenchooklin, S. Taweepong, U. Pangnakorn

Abstract:

This research was conducted in the Mae Sot Watershed, located in the Moei River Basin within the Upper Salween River Basin in Tak Province, Thailand. The Mae Sot Municipality is the largest urban area in Tak Province and is situated in the midstream of the Mae Sot Watershed. It usually faces flash floods after heavy rain, and poor flood management has been reported since the rapid economic growth of recent years. Its catchment can be classified as an ungauged basin: rainfall data are scarce and no stream gauging station has been reported. The municipality was struck by the most severe flood events in 2013, taken here as the worst case studied for all the communities in the municipality. Other problems are also faced in this watershed, such as a shortage of water supply for domestic consumption and agriculture, deterioration of water quality, and landslides. The research aimed to build capacity and strengthen the participation of local community leaders and related agencies in better urban water management, starting with data collection and a demonstration of an appropriate short-period rainfall forecasting application for better flood relief planning and management through hydrologic modeling and river analysis programs. The authors applied global rainfall data via the Integrated Data Viewer (IDV) program from Unidata, with the aim of forecasting rainfall 7-10 days in advance during the rainy season instead of relying on real-time records. The IDV product, which provides rainfall forecasts at a time step of 3-6 hours, was introduced to the communities. The result can be used as input to the Hydrologic Modeling System (HEC-HMS) for synthesizing flood hydrographs and for flood forecasting. The authors applied the River Analysis System (HEC-RAS) to present flood flow behaviour in the reach of the Mae Sot stream through downtown Mae Sot, i.e., flood extents and water surface levels at every cross-sectional profile of the stream. Both the HMS and RAS models were tested against the 2013 event with observed rainfall and inflow-outflow data from the Mae Sot Dam. The HMS result fitted the observed data at the dam and was applied as the upstream boundary discharge to RAS in order to simulate flood extents; the simulation was tested in the field and the result was found satisfactory. The rainfall product from IDV was fair when compared with observed data; nevertheless, it is an appropriate tool for use in this ungauged catchment together with flood hydrograph and river analysis models for a future efficient flood relief plan and management.

Keywords: Global rainfall, flood forecasting, hydrologic modeling system, river analysis system.

5856 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators

Authors: M. A. Okezue, K. L. Clase, S. R. Byrn

Abstract:

The requirement for maintaining data integrity in laboratory operations is critical for regulatory compliance. Automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population, and as an adjunct therapy in COVID-19 regimens. Unfortunately, the zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of 0.1 M sodium edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, different formulae were input into two spreadsheets to automate the calculations. Further checks were created within the automated system to ensure the validity of replicate analyses in the titrimetric procedures. Validations were conducted using five data sets of manually computed assay results, and the acceptance criteria set in the protocol were met. Significant p-values (p < 0.05, α = 0.05, at 95% confidence interval) were obtained from Student's t-test evaluation of the mean values for manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and the principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets. Human errors in calculations were minimized when the procedures were automated in the quality control laboratory. The assay procedure for the formulation was completed in a time-efficient manner with a greater level of accuracy. This project is expected to promote cost savings for laboratory business models.
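A minimal sketch of the two automated calculations described above, assuming 1:1 Zn-EDTA complexometric stoichiometry; the molar mass of zinc is a standard value, but the worked numbers and the paired Student's t-test comparison against manually computed results are illustrative stand-ins, not the validated spreadsheets themselves:

```python
from scipy import stats

M_ZN = 65.38  # g/mol, zinc

def edta_molarity(zn_mass_g, titre_ml):
    """Standardization: EDTA molarity from a zinc primary-standard titration (1:1)."""
    return (zn_mass_g / M_ZN) / (titre_ml / 1000.0)

def zinc_per_tablet_mg(titre_ml, edta_m, tablets_in_sample):
    """Assay: elemental zinc per tablet from the complexometric titre (1:1)."""
    return titre_ml / 1000.0 * edta_m * M_ZN * 1000.0 / tablets_in_sample

m_edta = edta_molarity(zn_mass_g=0.6538, titre_ml=100.0)   # ~0.1 M
print(round(zinc_per_tablet_mg(6.1, m_edta, tablets_in_sample=2), 2), "mg Zn/tablet")

# Paired comparison of manual vs. spreadsheet results (illustrative data sets)
manual      = [19.8, 20.1, 19.9, 20.3, 20.0]
spreadsheet = [19.82, 20.08, 19.91, 20.29, 20.01]
t_stat, p_value = stats.ttest_rel(manual, spreadsheet)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```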

Keywords: Data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets.

5855 Conventional and Hybrid Network Energy Systems Optimization for Canadian Community

Authors: Mohamed Ghorab

Abstract:

Locally generated and distributed systems for thermal and electrical energy are envisaged in the near future to reduce the transmission losses of the centralized system. Distributed Energy Resources (DER) are designed at different sizes (small and medium) and are incorporated in the energy distribution between the hubs. The energy generated from each technology at each hub should meet the local energy demand. Economic and environmental enhancement can be achieved when there is interaction and energy exchange between the hubs. Network energy system and CO2 optimization between six hubs representing a Canadian community are investigated in this study. Three different technology scenarios are studied to meet both the thermal and electrical demand loads of the six hubs. The conventional system is used as the first scenario and the reference case. It includes a boiler to provide the thermal energy, while the electrical energy is imported from the utility grid. The second scenario includes a combined heat and power (CHP) system to meet the thermal demand loads and part of the electrical demand load. The third scenario integrates CHP with an Organic Rankine Cycle (ORC), where the thermal waste energy from the CHP system is used by the ORC to generate electricity. The General Algebraic Modeling System (GAMS) is used to model the DER system optimization based on energy economics and CO2 emission analyses, and the results are compared with the conventional energy system. The results show that scenarios 2 and 3 provide annual total cost savings of 21.3% and 32.3%, respectively, compared to the conventional system (scenario 1). Additionally, scenario 3 (CHP and ORC systems) provides a 32.5% saving in CO2 emissions compared to the conventional system, versus 9.3% for scenario 2 (CHP system).

Keywords: Distributed energy resources, network energy system, optimization, microgeneration system.

5854 Real-Time Identification of Media in a Laboratory-Scaled Penetrating Process

Authors: Sheng-Hong Pong, Herng-Yu Huang, Yi-Ju Lee, Shih-Hsuan Chiu

Abstract:

In this paper, a neural network technique is applied to classify media in real time while a projectile penetrates through them. A laboratory-scaled penetration setup was built for the experiment. The features used as the network inputs were extracted from the acceleration of the penetrator. 6000 sets of features from a single penetration with known media and status were used to train the neural network. The trained system was tested on 30 different penetration experiments. The system produced an accuracy of 100% on the training data set, and its precision was 99% on the test data from the 30 tests.
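A minimal sketch (scikit-learn, not necessarily the authors' network architecture) of training a back-propagation classifier on acceleration-derived feature vectors with known media labels; the feature values below are synthetic placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for features extracted from the penetrator acceleration:
# two media classes with slightly different feature statistics.
X = np.vstack([rng.normal(0.0, 1.0, (3000, 8)), rng.normal(1.5, 1.0, (3000, 8))])
y = np.array([0] * 3000 + [1] * 3000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```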

Keywords: back-propagation, identification, neural network, penetration.

5853 Arriving at an Optimum Value of Tolerance Factor for Compressing Medical Images

Authors: Sumathi Poobal, G. Ravindran

Abstract:

Medical imaging takes advantage of digital technology in imaging and teleradiology. In teleradiology systems a large amount of data is acquired, stored and transmitted. A major technology that may help to solve the problems associated with the massive data storage and data transfer requirements is data compression and decompression. Many image compression methods are available; they are classified as lossless and lossy compression methods. In lossy compression the decompressed image contains some distortion. Fractal image compression (FIC) is a lossy compression method in which an image is coded as a set of contractive transformations in a complete metric space; the set of contractive transformations is guaranteed to produce an approximation to the original image. In this paper FIC is achieved by PIFS using quadtree partitioning. PIFS is applied to different modalities such as ultrasound, CT scan, angiogram, X-ray and mammogram images. In each modality approximately twenty images are considered, and the average compression ratio and PSNR values are computed. In this method of fractal encoding, the tolerance factor Tmax is varied from 1 to 10, keeping the other standard parameters constant. For all image modalities the compression ratio and peak signal-to-noise ratio (PSNR) are computed and studied. The quality of the decompressed image is assessed by its PSNR value. From the results it is observed that the compression ratio increases with the tolerance factor, and mammograms have the highest compression ratio. Because of the properties of fractal compression, the image quality is not degraded up to an optimum tolerance factor of Tmax = 8.
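A minimal sketch of the two figures of merit reported above, compression ratio and PSNR for 8-bit images; the PIFS encoding and decoding steps themselves are not shown, and the values used are illustrative:

```python
import numpy as np

def psnr(original, decoded):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((original.astype(float) - decoded.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    return original_bytes / compressed_bytes

# Illustrative values only: a random "image" and a slightly perturbed copy
img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
decoded = np.clip(img + np.random.normal(0, 3, img.shape), 0, 255).astype(np.uint8)
print(f"PSNR = {psnr(img, decoded):.1f} dB, CR = {compression_ratio(img.size, 4096):.1f}:1")
```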

Keywords: Fractal image compression, IFS, PIFS, PSNR, Quadtree partitioning.

5852 Optimizing Logistics for Courier Organizations with Considerations of Congestions and Pickups: A Courier Delivery System in Amman as Case Study

Authors: Nader A. Al Theeb, Zaid Abu Manneh, Ibrahim Al-Qadi

Abstract:

The traveling salesman problem (TSP) is a combinatorial integer optimization problem that asks: "What is the optimal route for a vehicle to traverse in order to deliver requests to a given set of customers?" It is widely used by package carrier companies' distribution centers. The main goal of applying the TSP in courier organizations is to minimize the time it takes the courier on each trip to deliver or pick up the shipments during a day. In this article, an optimization model is constructed to create a new TSP variant that optimizes the routing in a courier organization with congestion taken into account in Amman, the capital of Jordan. Real data were collected by different methods and analyzed. Then, Concert Technology with CPLEX was used to solve the proposed model for some randomly generated data instances and for the real collected data. The results show a great improvement in trip time compared with the current trips, and an economic study was conducted afterwards to assess the impact of using such models.
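The authors solved their congestion-aware variant as an integer program in CPLEX; as a minimal illustration of the underlying TSP only, the sketch below brute-forces a tour over a handful of stops using a congestion-weighted travel-time matrix (all values invented):

```python
from itertools import permutations

# Hypothetical travel-time matrix (minutes), already inflated for congestion
times = [
    [0, 12, 18, 25],
    [12, 0, 9, 20],
    [18, 9, 0, 14],
    [25, 20, 14, 0],
]

def best_tour(times, depot=0):
    """Exhaustively search tours that start and end at the depot."""
    stops = [i for i in range(len(times)) if i != depot]
    best = None
    for order in permutations(stops):
        route = (depot, *order, depot)
        cost = sum(times[a][b] for a, b in zip(route, route[1:]))
        if best is None or cost < best[0]:
            best = (cost, route)
    return best

cost, route = best_tour(times)
print("route:", route, "total time:", cost, "min")
```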

Keywords: Traveling salesman problem, congestion, pick-up, integer programming, package carriers, service engineering.

5851 Study on Radio Link Availability in Millimeter Wave Range

Authors: Boncho G. Bonev, Kliment N. Angelov, Emil S. Altimirski

Abstract:

In this paper, the link quality in the SHF and EHF ranges is studied. In order to achieve high data rates, higher frequencies must be used – centimeter waves (SHF), millimeter waves (EHF) or the optical range. However, there are significant problems when a radio link operates in this range – rain attenuation and attenuation in the Earth's atmosphere. Based on statistical rain rate data for Bulgaria, the link availability can be determined, depending on the working frequency, the path length and the power budget of the link. The ITU recommendations are used for the calculations of rain attenuation and atmospheric attenuation.
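A minimal sketch of how rain attenuation enters such a power-budget calculation, using the ITU-R power-law form γ = k·R^α; the k, α and link-budget figures below are illustrative placeholders, not the coefficients for any particular frequency or polarization, nor the rain statistics for Bulgaria:

```python
def rain_specific_attenuation(rain_rate_mm_h, k=0.2, alpha=1.0):
    """Specific rain attenuation in dB/km (k and alpha are placeholders)."""
    return k * rain_rate_mm_h ** alpha

def link_margin(tx_power_dbm, ant_gains_db, rx_sensitivity_dbm,
                free_space_loss_db, path_km, rain_rate_mm_h,
                gas_atten_db_per_km=0.1):
    """Remaining fade margin after free-space, rain and gaseous losses."""
    rain_db = rain_specific_attenuation(rain_rate_mm_h) * path_km
    gas_db = gas_atten_db_per_km * path_km
    return (tx_power_dbm + ant_gains_db - free_space_loss_db
            - rain_db - gas_db - rx_sensitivity_dbm)

# Illustrative 3 km link during a 25 mm/h rain event (all figures assumed)
print(f"margin = {link_margin(20, 76, -80, 135, 3.0, 25.0):.1f} dB")
```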

Keywords: rain attenuation, atmospheric gaseous attenuation, link availability, link breaking probability

5850 Estimating the Runoff Using the Simple Tank Model and Comparing it with the SCS-CN Model - A Case Study of the Dez River Basin

Authors: H. Alaleh, N. Hedayat, A. Alaleh, H. Ayazi, A. Ruhani

Abstract:

Run-off is considered an important hydrological factor in feasibility studies of river engineering and irrigation-related projects under arid and semi-arid conditions. Flood control is one of the crucial factors; its management, while mitigating destructive consequences, abstracts a considerable volume of renewable water resources. The methodology applied here was based on Mizumura, who applied a mathematical simple tank model to simulate the rainfall-runoff process in a particular water basin using data from the observed hydrograph. The model was applied in the Dez River basin adjacent to the Greater Dezful region, Iran, in order to simulate and estimate floods. Results indicated that the hydrographs calculated using the simple tank method and the SCS-CN model were in close agreement with the observed hydrographs. It was also found that, on average, the flood peak time and peak discharge from the simple tank model were closer to the observed data than those of the CN method. On the other hand, the flood volume calculated with the CN model was significantly closer to the observed data than that of the simple tank model.
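A minimal sketch of a single linear tank driven by excess rainfall (outflow proportional to storage); the actual study follows Mizumura's formulation, which may differ in tank structure and parameters, so the values below are purely illustrative:

```python
def simple_tank(rainfall_mm, k=0.05, dt_h=1.0, storage0=0.0):
    """Route an excess-rainfall series (mm per step) through one linear tank."""
    storage, runoff = storage0, []
    for p in rainfall_mm:
        q = k * storage              # outflow proportional to current storage
        storage += (p - q) * dt_h    # water balance of the tank
        runoff.append(q)
    return runoff

# Illustrative storm: 6 hours of rain followed by recession
hyetograph = [0, 5, 12, 20, 8, 2] + [0] * 18
q = simple_tank(hyetograph)
print("peak runoff %.2f mm/h at hour %d" % (max(q), q.index(max(q))))
```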

Keywords: Simple tank, Dez River, run-off, lag time, excess rainfall.

5849 Increasing Value Added of Recycling Business Management: A Case of Thailand

Authors: Yananda Siraphatthada

Abstract:

This policy participation action research explores the roles of Thai government units during the 2010 fiscal year in creating value added for the recycling business in the central part of Thailand. The research aims to (a) study how the government supports the business, and the problems and obstacles in doing so, and (b) design a strategic action plan – short, medium, and long term – to create value added for the recycling business, particularly in local full-loop companies/organizations licensed by the Wongpanit Waste Separation Plant as well as those licensed by the Department of Provincial Administration. A mixed-method research design, i.e., a combination of quantitative and qualitative methods, is utilized in the present study in both the data collection and analysis procedures. Quantitative data were analyzed by frequency, percentage, mean scores, and standard deviation, with the aim of noting trends and generalizations. Qualitative data were collected via semi-structured and focus-group interviews to explore the in-depth views of the operators. The sample included 1,079 operators in eight provinces in the central part of Thailand.

Keywords: Management, Recycling Business, Value Added.

5848 Assessing the Theoretical Suitability of Sentinel-2 and WorldView-3 Data for Hydrocarbon Mapping of Spill Events, Using HYSS

Authors: K. Tunde Olagunju, C. Scott Allen, F.D. (Freek) van der Meer

Abstract:

Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection in order to plan an appropriate cleanup program. Identification on optical sensors allows not only detection but also characterization and quantification. Until recently, in optical remote sensing, quantification and characterization were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is at present mainly obtained via airborne surveys. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the Hydrocarbon Spectra Slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features at 1.73 µm and 2.30 µm for hydrocarbon mapping on multispectral data. Spectral measurements of seven different hydrocarbon oils (crude and refined) taken on 10 different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolution using their full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the statistical significance of the qualitative analysis on the WorldView-3 sensor for all studied hydrocarbon oils was confirmed at the 95% confidence level (p-value < 0.01), except for diesel. Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations at Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid-response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of spectral bands with the key hydrocarbon absorptions and the relatively coarse bandwidth (> 100 nm).
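A minimal sketch of the band-slope idea behind HYSS, not the published model itself: it assumes the slope is simply taken between reflectances averaged into bands near the 1.73 µm and 2.30 µm hydrocarbon absorptions, with a crude boxcar band response standing in for the sensors' FWHM:

```python
import numpy as np

def band_average(wavelength_um, reflectance, center_um, fwhm_um):
    """Crude band convolution: mean reflectance inside the FWHM window."""
    mask = np.abs(wavelength_um - center_um) <= fwhm_um / 2
    return reflectance[mask].mean()

def hydrocarbon_slope(wavelength_um, reflectance, b1=(1.73, 0.04), b2=(2.30, 0.04)):
    """Slope between the two hydrocarbon-diagnostic bands (assumed band centers/FWHM)."""
    r1 = band_average(wavelength_um, reflectance, *b1)
    r2 = band_average(wavelength_um, reflectance, *b2)
    return (r2 - r1) / (b2[0] - b1[0])

# Illustrative lab-style spectrum with an absorption dip near 1.73 um
wl = np.linspace(1.5, 2.5, 500)
refl = 0.4 - 0.05 * np.exp(-((wl - 1.73) / 0.02) ** 2)
print("slope:", round(hydrocarbon_slope(wl, refl), 3))
```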

Keywords: hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon – substrate combination, Sentinel-2, WorldView-3

5847 Elements of a Culture of Quality in the Implementation of Quality Assurance Systems of Countries in the European Higher Education Area

Authors: L. Mion

Abstract:

The implementation of quality management systems in higher education in different countries is determined by national regulatory choices and supranational indications (such as the European Standard Guidelines for Quality Assurance). The effective functioning and transformative capacity of these quality management systems largely depend on the organizational context in which they are applied and, more specifically, on the culture of quality developed in single universities or in single countries. The University's concept of quality culture integrates the structural dimension of Quality Assurance (QA) (quality management manuals, process definitions, tools) with the value dimension of an organization (principles, skills, and attitudes). Within the EHEA (European Higher Education Area), countries such as Portugal, the Netherlands, the UK, and Norway demonstrate a greater integration of QA principles in the various organizational levels and areas of competence of university institutions or have greater experience in implementation or scientific and political debate on the matter. Therefore, the study, through an integrative literature review, of the quality management systems of these countries is aimed at determining a framework of the culture of quality, helpful in defining the elements which, both in structural-organizational terms and in terms of values and skills and attitudes, have proved to be factors of success in the effective implementation of quality assurance systems in universities and in the countries considered in the research. In order for a QA system to effectively aim for continuous improvement in a complex and dynamic context such as the university one, it must embrace a holistic vision of quality from an integrative perspective, focusing on the objective of transforming the reality being evaluated.

Keywords: Higher education, quality assurance, quality culture, Portugal, Norway, Netherlands, United Kingdom.

5846 Elements of a Culture of Quality in the Implementation of Quality Assurance Systems of Countries in the European Higher Education Area

Authors: L. Mion

Abstract:

The implementation of quality management systems in higher education in different countries is determined by national regulatory choices and supranational indications (such as the European Standard Guidelines for Quality Assurance). The effective functioning and transformative capacity of these quality management systems largely depend on the organizational context in which they are applied and, more specifically, on the culture of quality developed in single universities or in single countries. The university's concept of quality culture integrates the structural dimension of Quality Assurance (QA) (quality management manuals, process definitions, tools) with the value dimension of an organization (principles, skills, and attitudes). Within the EHEA (European Higher Education Area), countries such as Portugal, the Netherlands, the UK, and Norway demonstrate a greater integration of QA principles in the various organizational levels and areas of competence of university institutions, or have greater experience in implementation or in the scientific and political debate on the matter. Therefore, the study of the quality management systems of these countries, conducted through an integrative literature review, is aimed at determining a framework of the culture of quality, helpful in defining the elements which, both in structural-organizational terms and in terms of values, skills and attitudes, have proved to be success factors in the effective implementation of quality assurance systems in universities and in the countries considered in the research. In order for a QA system to effectively aim for continuous improvement in a complex and dynamic context such as the university one, it must embrace a holistic vision of quality from an integrative perspective, focusing on the objective of transforming the reality being evaluated.

Keywords: Higher Education, quality assurance, quality culture, Portugal, Norway, Netherlands, United Kingdom.

5845 Interest of Pseudo-Noise Code Sequences of Different Lengths for Reducing Interference between Users of a CDMA Network

Authors: Nerguè Kassahan Kone, Souleymane Oumtanaga

Abstract:

The third generation (3G) of cellular systems adopted spread spectrum as the solution for data transmission in the physical layer. Contrary to IS-95 or CDMAOne systems (spread-spectrum systems of the preceding generation), the new standard, called the Universal Mobile Telecommunications System (UMTS), uses long codes in the downlink. The system is designed for voice communication and data transmission. The downlink is particularly important because of the asymmetrical demand for data, i.e., more downloading towards the mobiles than towards the base station. Moreover, UMTS uses orthogonal spreading with a variable spreading factor (OVSF, Orthogonal Variable Spreading Factor) in the downlink. This characteristic makes it possible to increase the data rate of one or more users by reducing their spreading factor without changing the spreading factors of the other users. In the current UMTS standard, two techniques to increase downlink performance have been proposed: transmit antenna diversity and space-time codes; these two techniques only combat fading. The receiver proposed for the mobile station is the RAKE, but one can imagine a more sophisticated receiver, able to reduce the interference between users and the impact of coloured noise and narrowband interference. In this context, where the users have synchronized long codes with variable spreading factors and the mobile is unaware of the other active codes/users, the use of pseudo-noise code sequences of different lengths is presented as one of the most appropriate solutions.
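A minimal sketch of how an OVSF code tree is generated (each code c of length N spawns children [c, c] and [c, -c] of length 2N); the long scrambling codes and the pseudo-noise sequences of different lengths discussed above are not modelled here:

```python
def ovsf_codes(spreading_factor):
    """Return the full set of OVSF codes for a power-of-two spreading factor."""
    codes = [[1]]
    while len(codes[0]) < spreading_factor:
        nxt = []
        for c in codes:
            nxt.append(c + c)                    # child 1: [c, c]
            nxt.append(c + [-x for x in c])      # child 2: [c, -c]
        codes = nxt
    return codes

for code in ovsf_codes(4):
    print(code)
# Codes on different branches of the tree remain mutually orthogonal, which is
# what lets one user's spreading factor be reduced without touching the others.
```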

Keywords: DS-CDMA, multiple access interference, signal-to-interference-plus-noise ratio.

5844 Topology Preservation in SOM

Authors: E. Arsuaga Uriarte, F. Díaz Martín

Abstract:

The SOM has several beneficial features which make it a useful method for data mining. One of the most important features is the ability to preserve the topology in the projection. There are several measures that can be used to quantify the goodness of the map in order to obtain the optimal projection, including the average quantization error and several topological errors. Many researchers have studied how topology preservation should be measured. One option consists of using the topographic error, which considers the ratio of data vectors for which the first and second best-matching units (BMUs) are not adjacent. In this work we present a study of the behaviour of the topographic error in different kinds of maps. We have found that this error penalizes rectangular maps, and we have studied the reasons why this happens. Finally, we suggest a new topological error to remedy the deficiency of the topographic error.
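A minimal sketch of the topographic error as described above - the fraction of data vectors whose best and second-best matching units are not neighbours on the map lattice - assuming a rectangular grid with adjacency defined by a Chebyshev distance of 1:

```python
import numpy as np

def topographic_error(data, weights, grid_positions):
    """data: (n, d); weights: (units, d); grid_positions: (units, 2) integer coords."""
    errors = 0
    for x in data:
        d = np.linalg.norm(weights - x, axis=1)
        bmu1, bmu2 = np.argsort(d)[:2]
        # units are adjacent if their Chebyshev distance on the lattice is 1
        if np.max(np.abs(grid_positions[bmu1] - grid_positions[bmu2])) > 1:
            errors += 1
    return errors / len(data)

# Tiny illustrative 2x2 map with 2-D data
grid = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
w = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
data = np.random.rand(100, 2)
print("TE =", topographic_error(data, w, grid))
```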

Keywords: Map lattice, Self-Organizing Map, topographic error, topology preservation.

5843 RFID Logistic Management with Cold Chain Monitoring – Cold Store Case Study

Authors: Mira Trebar

Abstract:

Logistics processes for perishable food in the supply chain include distribution activities and real-time temperature monitoring to fulfil cold chain requirements. The paper presents the use of RFID (Radio Frequency Identification) technology as an identification tool for receiving and shipping activities in the cold store. At the same time, the use of RFID data loggers with temperature sensors is presented, to observe and store temperatures for the purpose of analyzing the processes and keeping history data available for traceability purposes and efficient recall management.

Keywords: Logistics, warehouse, RFID device, cold chain.

5842 Exploring the Challenges to Usage of Building and Construction Cost Indices in Ghana

Authors: J. J. Gyimah, E. Kissi, S. Osei-Tutu, C. D. Adobor, T. Adjei-Kumi, E. Osei-Tutu

Abstract:

A price fluctuation contract is imperative and of paramount essence in the construction industry, as it provides adequate relief and cushioning against changes in the prices of input resources during construction. As a result, several methods have been devised to better arrive at fair recompense in the event of price changes. However, stakeholders often appear not to be satisfied with the existing methods of fluctuation evaluation, ostensibly because of the challenges associated with them. The aim of this study was to identify the challenges to the usage of building construction cost indices in Ghana. Data were gathered from contractors and quantity surveying firms. The study utilized a survey questionnaire approach to elicit responses from the contractors and the consultants. The data gathered were analyzed using the Relative Importance Index (RII) to rank the problems associated with the existing methods. The findings revealed, among others, late release of data, inadequate recovery of costs, and work items of interest not being included in the published indices as the main challenges of the existing methods. The findings provide useful lessons for policy makers and practitioners in decision making towards the usage and improvement of the available indices.
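A minimal sketch of the Relative Importance Index used for the ranking, RII = ΣW / (A × N), where W are the respondents' ratings, A is the highest rating on the scale and N is the number of respondents; the ratings below are invented for illustration:

```python
def rii(ratings, highest=5):
    """Relative Importance Index of one challenge from Likert-scale ratings."""
    return sum(ratings) / (highest * len(ratings))

# Hypothetical ratings for three of the challenges named in the abstract
challenges = {
    "late release of data": [5, 4, 5, 4, 5, 5],
    "inadequate recovery of costs": [4, 4, 3, 5, 4, 4],
    "work items not in published indices": [3, 4, 4, 3, 4, 3],
}
for name, scores in sorted(challenges.items(), key=lambda kv: -rii(kv[1])):
    print(f"{name}: RII = {rii(scores):.2f}")
```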

Keywords: Building construction cost indices, challenges, usage, Ghana.

5841 Bureau Management Technologies and Information Systems in Developing Countries

Authors: Mehmet Altınöz

Abstract:

This study focuses on bureau management technologies and information systems in developing countries. Developing countries use such systems to facilitate executive and organizational functions through the utilization of bureau management technologies and to provide the executive staff with the necessary information. The concepts of data and information differ from each other, and thus the concepts of data processing and information processing are different. Symbols represent ideas, objects, figures, letters and numbers. A data processing system is an integrated system which deals with the processing of data related to the internal and external environment of the organization in order to make decisions, create plans and develop strategies; it goes without saying that this system is composed of both human beings and machines. Information is obtained through the acquisition and processing of data, whereas data are raw communicative messages. Within this framework, data processing amounts to producing plausible information out of raw data. Organizations in developing countries need to obtain information relevant to them because rapid changes in the organizational arena require rapid access to accurate information. The most significant role of the directors and managers who work in the organizational arena is to make decisions, and making a correct decision is possible only when they are equipped with sound ideas and appropriate information. Therefore, the acquisition, organization and distribution of information gain significance. Today's organizations make use of computer-assisted "Management Information Systems" in order to obtain and distribute information. A Decision Support System, which is closely related to practice, is an information system that facilitates the director's task of making decisions. It integrates human intelligence, information technology and software in order to solve complex problems; with the support of computer technology and software systems, it produces information relevant to the decision to be made and provides the executive staff with supportive ideas about that decision. Artificial intelligence programs which transfer the studies and experiences of people to the computer are called expert systems. An expert system stores expert knowledge in a limited domain and can solve problems by deriving rational consequences. Bureau management technologies and information systems in developing countries create a kind of information society and information economy which give those countries a place in the global socio-economic structure and enable them to play a reasonable and fruitful role; therefore, it is of crucial importance to make use of information and management technologies in order to work together with innovative and enterprising individuals, and it is also significant to create "scientific policies" based on information and technology in the fields of economy, politics, law and culture.

Keywords: Bureau Management, Information Systems.

5840 Automation of Heat Exchanger using Neural Network

Authors: Sudhir Agashe, Ashok Ghatol, Sujata Agashe

Abstract:

In this paper the development of a heat exchanger as a pilot plant for educational purposes is discussed, and the use of a neural network for controlling the process is presented. The aim of the study is to highlight the need for a specific pseudo-random binary sequence (PRBS) to excite a process under control. As the neural network is a data-driven technique, the method of data generation plays an important role; in this light, a careful experimentation procedure for data generation was a crucial task. Heat exchange is a complex process, which has a capacity and a time lag as process elements. The proposed system is a typical pipe-in-pipe heat exchanger. The complexity of the system demands careful selection, proper installation and commissioning. The temperature, flow, and pressure sensors play a vital role in the control performance, and the final control element used is a pneumatically operated control valve. While carrying out the experimentation on the heat exchanger, a well-drafted procedure was followed, giving utmost attention to the safety of the system. The results obtained are encouraging and reveal that if the process details are known completely as far as the process parameters are concerned, and the utilities are well stabilized, then feedback systems are suitable, whereas the neural network control paradigm is useful for processes with nonlinearity and less knowledge about the process. The implementation of NN control reinforces the concepts of process control and the NN control paradigm. The results also underline the importance of an excitation signal tailored to the particular process. Data acquisition, processing, and presentation in a suitable format are the most important factors in validating the results.
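A minimal sketch of generating a PRBS excitation with a linear-feedback shift register; the register length and tap positions (an assumed 7-bit register with taps at bits 7 and 6) are illustrative, since the paper's point is precisely that the PRBS must be tailored to the process dynamics:

```python
def prbs(n_bits, length, taps=(7, 6)):
    """Maximal-length-style PRBS from an LFSR; returns a +/-1 excitation sequence."""
    state = [1] * n_bits                 # any non-zero seed
    seq = []
    for _ in range(length):
        out = state[-1]
        feedback = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [feedback] + state[:-1]  # shift register one step
        seq.append(1 if out else -1)
    return seq

excitation = prbs(7, 127)   # period 127 for a 7-bit register with these taps
print(excitation[:16])
```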

Keywords: Process identification, neural network, heat exchanger.

5839 Reflectance Imaging Spectroscopy Data (Hyperspectral) for Mineral Mapping in the Orientale Basin Region on the Moon Surface

Authors: V. Sivakumar, R. Neelakantan

Abstract:

Mineral mapping on the Moon's surface provides clues to understanding the origin, evolution, stratigraphy and geological history of the Moon. Recently, reflectance imaging spectroscopy has played a significant role in identifying minerals on planetary surfaces in the visible to NIR region of the electromagnetic spectrum. The Moon Mineralogy Mapper (M3) onboard Chandrayaan-1 provides unprecedented spectral data for studying the lunar surface. Here we used the M3 sensor data (hyperspectral imaging spectroscopy) for analysing the mineralogy of the Orientale basin region on the Moon's surface. Reflectance spectra were sampled from different locations in the basin and the continuum was removed using the ENvironment for Visualizing Images (ENVI) software. Reflectance spectra of unknown mineral composition were compared with known Reflectance Experiment Laboratory (RELAB) spectra to discriminate the mineralogy. Minerals like olivine, low-Ca pyroxene (LCP), high-Ca pyroxene (HCP) and plagioclase were identified. In addition to these minerals, an unusual type of spectral signature was identified, which indicates a probable Fe-Mg-spinel lithology in the basin region.

Keywords: Chandrayaan-1, moon mineralogy mapper, orientale basin, moon, spectroscopy, hyperspectral.

5838 Cryptanalysis of Yang-Li-Liao’s Simple Three-Party Key Exchange (S-3PAKE) Protocol

Authors: Hae-Soon Ahn, Eun-Jun Yoon

Abstract:

Three-party password authenticated key exchange (3PAKE) protocols are widely deployed in many remote user authentication systems due to their simplicity and the convenience of maintaining a human-memorable password at the client side to achieve secure communication within a hostile network. Recently, some researchers proposed an improvement of a 3PAKE protocol in which built-in data attached to the other party are processed for identity authentication of individual data. However, this paper points out that the improved 3PAKE protocol is still vulnerable to undetectable on-line dictionary attacks and off-line dictionary attacks.

Keywords: Three-party key exchange, 3PAKE, Password-authenticated key exchange, Network security, Dictionary attack

5837 Meteorological Data Study and Forecasting Using Particle Swarm Optimization Algorithm

Authors: S. Esfandeh, M. Sedighizadeh

Abstract:

Weather systems use enormously complex combinations of numerical tools for study and forecasting. Unfortunately, due to phenomena in the world climate, such as the greenhouse effect, classical models may become insufficient, mostly because they lack adaptation. Therefore, the weather forecasting problem is well suited to heuristic approaches, such as evolutionary algorithms. Experimentation with heuristic methods like the Particle Swarm Optimization (PSO) algorithm can lead to the development of new insights or promising models that can be fine-tuned with more focused techniques. This paper describes a PSO approach for the analysis and prediction of data and provides experimental results of the aforementioned method on real-world meteorological time series.
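A minimal sketch of a standard PSO loop (not the authors' configuration) minimizing a generic forecast-error objective; the toy objective fits a trend-plus-sinusoid model to a synthetic temperature series and merely stands in for whatever forecasting model the paper tunes:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(100)
series = 20 + 0.02 * t + 5 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 0.5, t.size)

def forecast_error(params):
    """Mean squared error of a trend-plus-sinusoid fit to the synthetic series."""
    offset, trend, amp, period = params
    model = offset + trend * t + amp * np.sin(2 * np.pi * t / max(period, 1e-6))
    return np.mean((series - model) ** 2)

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array(bounds).T
    pos = rng.uniform(lo, hi, (n_particles, len(bounds)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, objective(gbest)

best, err = pso(forecast_error, [(0, 40), (-0.1, 0.1), (0, 10), (5, 60)])
print("best parameters:", np.round(best, 3), "MSE:", round(err, 3))
```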

Keywords: Weather, Climate, PSO, Prediction, Meteorological

5836 Classification of Right and Left-Hand Movement Using Multi-Resolution Analysis Method

Authors: Nebi Gedik

Abstract:

The aim of brain-computer interface studies on electroencephalogram (EEG) signals containing motor imagery is to extract effective features that provide the highest possible classification accuracy for the detection of the desired motor movement. However, achieving this goal is difficult, as the most suitable frequency band and time frame vary from subject to subject. In this study, the classification success of two-feature data obtained from raw EEG signals and from the coefficients of a multi-resolution analysis method applied to the EEG signals was analyzed comparatively. The method was applied to the signals of several EEG channels (C3, Cz and C4) obtained from the EEG data set of the publicly available BCI Competition III.

Keywords: Motor imagery, EEG, wave atom transform, k-NN.

5835 Analysis of Statistical Data on Social Resources Dimension of Occupational Status Attainment: A Rational Choice Approach

Authors: Oleg Demchenko

Abstract:

The aim of the present study is to analyze empirical research on the social resources dimension of the occupational status attainment process and relate it to the rational choice approach. The analysis suggests that the existing data on the strength-of-ties aspect of social resources are insufficient and do not allow any implication concerning rational actors' behavior. However, the results concerning the work-relations aspect are more encouraging.

Keywords: Social resources, status attainment, rational choice, weak ties, work-related ties.

5834 A Diagnostic Fuzzy Rule-Based System for Congenital Heart Disease

Authors: Ersin Kaya, Bulent Oran, Ahmet Arslan

Abstract:

In this study, a fuzzy rule-based classifier is used for the diagnosis of congenital heart disease. Congenital heart diseases are defined as structural or functional heart diseases. Medical data sets were obtained from the Pediatric Cardiology Department at Selcuk University for the years 2000 to 2003. Firstly, fuzzy rules were generated using the medical data; then the weights of the fuzzy rules were calculated. Two different reasoning methods, the "weighted vote method" and the "single winner method", were used in this study, and the results of the fuzzy classifiers were compared.

Keywords: Congenital heart disease, Fuzzy rule-based classifiers, Classification

5833 Part of Speech Tagging Using Statistical Approach for Nepali Text

Authors: Archit Yajnik

Abstract:

Part-of-speech tagging has always been a challenging task in the era of natural language processing. This article presents POS tagging for Nepali text using the Hidden Markov Model (HMM) and the Viterbi algorithm. From the annotated Nepali corpus, training and testing data sets are randomly separated, and both methods are employed on the data sets. The Viterbi algorithm is found to be computationally faster and more accurate than HMM alone; an accuracy of 95.43% is achieved using the Viterbi algorithm. An error analysis of where the mismatches took place is discussed in detail.
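A minimal sketch of the Viterbi decoding step of an HMM tagger, with tiny hand-made probabilities and English placeholder words rather than Nepali, purely to show the dynamic-programming recursion:

```python
def viterbi(words, tags, start_p, trans_p, emit_p):
    """Return the most probable tag sequence for a sentence under an HMM."""
    V = [{t: (start_p[t] * emit_p[t].get(words[0], 1e-6), None) for t in tags}]
    for w in words[1:]:
        V.append({
            t: max(
                (V[-1][prev][0] * trans_p[prev][t] * emit_p[t].get(w, 1e-6), prev)
                for prev in tags
            )
            for t in tags
        })
    # backtrack from the best final tag through the stored predecessors
    best = max(tags, key=lambda t: V[-1][t][0])
    path = [best]
    for layer in reversed(V[1:]):
        best = layer[best][1]
        path.append(best)
    return list(reversed(path))

tags = ["N", "V"]
start = {"N": 0.7, "V": 0.3}
trans = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit = {"N": {"birds": 0.6, "fish": 0.4}, "V": {"fly": 0.7, "swim": 0.3}}
print(viterbi(["birds", "fly"], tags, start, trans, emit))   # -> ['N', 'V']
```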

Keywords: Hidden Markov model, Viterbi algorithm, POS tagging, natural language processing.

5832 High Capacity Data Hiding based on Predictor and Histogram Modification

Authors: Hui-Yu Huang, Shih-Hsu Chang

Abstract:

In this paper, we propose a high-capacity image hiding technique based on pixel prediction and the difference of the modified histogram. The approach uses pixel prediction and the difference of the modified histogram to calculate the best embedding point, which improves the prediction accuracy and increases the pixel difference, thereby raising the hiding capacity. We also use histogram modification to prevent overflow and underflow. Experimental results demonstrate that, at the same average hiding capacity, our proposed method still maintains high image quality and low distortion.
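A minimal sketch of plain histogram-shifting embedding on pixel values; the paper instead works on a prediction-error (modified) histogram, which is what raises the capacity, so this only illustrates the peak/zero-bin shifting and embedding mechanics with a toy payload:

```python
import numpy as np

def embed_histogram_shift(img, bits):
    """Embed a bit string at the histogram peak by shifting bins toward the zero bin."""
    hist = np.bincount(img.ravel(), minlength=256)
    peak, zero = int(hist.argmax()), int(hist.argmin())
    step = 1 if zero > peak else -1
    out = img.astype(np.int32).copy()
    lo, hi = sorted((peak, zero))
    # shift the bins strictly between peak and zero to empty the bin next to the peak
    between = (out > lo) & (out < hi)
    out[between] += step
    # embed: pixels at the peak stay (bit 0) or move one step (bit 1), in scan order
    idx = np.flatnonzero(img.ravel() == peak)[: len(bits)]
    payload = np.asarray(bits[: idx.size], dtype=np.int32)
    flat = out.ravel()
    flat[idx] += step * payload
    return flat.reshape(img.shape).astype(np.uint8), peak, zero

img = np.random.randint(60, 200, (64, 64), dtype=np.uint8)
stego, peak, zero = embed_histogram_shift(img, bits=[1, 0, 1, 1, 0, 0, 1])
print("peak:", peak, "zero:", zero,
      "max pixel change:", int(np.max(np.abs(stego.astype(int) - img.astype(int)))))
```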

Keywords: data hiding, predictor
