Search results for: overdispersed count data
5998 Estimation of Methane from Hydrocarbon Exploration and Production in India
Authors: A. K. Pathak, K. Ojha
Abstract:
Methane is the second most important greenhouse gas (GHG) after carbon dioxide, and methane emissions from the energy sector are increasing with various activities. In the present work, the sources of methane emission from the upstream, midstream and downstream segments of the oil & gas sector are identified and categorised per the IPCC-2006 guidelines. Data were collected from various oil & gas activities: (i) exploration & production of oil & gas, (ii) supply through pipelines, (iii) refinery throughput & production, (iv) storage & transportation, and (v) usage. Methane emission factors for the various categories were determined by applying the Tier-II and Tier-I approaches to the collected data. Total methane emission from the Indian oil & gas sector was thus estimated for the years 1990 to 2007.
Keywords: Carbon credit, climate change, methane emission, oil & gas production.
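The Tier-I estimate named in this abstract reduces to multiplying activity data by an emission factor. A minimal sketch follows; the segment names, activity figures and emission factors are illustrative placeholders, not values from the study or the IPCC-2006 tables.

```python
# Minimal sketch of a Tier-I estimate: emissions = activity data x emission factor.
# The activity data and emission factors below are hypothetical placeholders.

def tier1_methane_emission(activity_data: float, emission_factor: float) -> float:
    """Return methane emissions in Gg, given activity data (e.g., 10^3 m^3 of
    gas handled) and an emission factor (Gg CH4 per unit of activity)."""
    return activity_data * emission_factor

segments = {
    "exploration_and_production": (5.0e6, 1.2e-6),  # hypothetical (activity, EF)
    "pipeline_transmission": (3.0e6, 0.8e-6),
    "refining_and_storage": (2.0e6, 0.4e-6),
}
total = sum(tier1_methane_emission(ad, ef) for ad, ef in segments.values())
print(f"Total CH4 emission: {total:.3f} Gg")
```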
5997 Rainfall and Flood Forecast Models for Better Flood Relief Plan of the Mae Sot Municipality
Authors: S. Chuenchooklin, S. Taweepong, U. Pangnakorn
Abstract:
This research was conducted in the Mae Sot Watershed, located in the Moei River Basin within the Upper Salween River Basin in Tak Province, Thailand. The Mae Sot Municipality is the largest urban area in Tak Province and is situated in the midstream of the watershed. It regularly faces flash floods after heavy rain; poor flood management has been reported since the rapid economic growth of recent years. The catchment is classified as an ungauged basin, with no rainfall records and no stream gauging station. The 2013 flood, the most severe experienced by the communities of the municipality, was taken as the study case. The watershed also faces other problems, such as water supply shortages for domestic consumption and agriculture, deteriorating water quality, and landslides. The research aimed to build capacity and strengthen the participation of local community leaders and related agencies in urban water management, starting with data collection and a demonstration of short-period rainfall forecasting for better flood relief planning, using a hydrologic modeling system and a river analysis system. The authors applied global rainfall data via the Integrated Data Viewer (IDV) from Unidata to forecast rainfall 7-10 days in advance during the rainy season, instead of relying on real-time records; the IDV product provides rainfall forecasts with a time step of 3-6 hours and was introduced to the communities. The result can be used as input to the Hydrologic Modeling System (HEC-HMS) for synthesizing flood hydrographs and for flood forecasting. The authors applied the River Analysis System (HEC-RAS) to model flood flow behavior in the reach of the Mae Sot stream through downtown Mae Sot City, giving flood extents and water surface levels at every cross-sectional profile of the stream. Both HMS and RAS were tested against 2013 observed rainfall and inflow-outflow data from the Mae Sot Dam. The HMS results fit the observed data at the dam and were applied as the upstream boundary discharge to RAS to simulate flood extents; field checks found the results satisfactory. The IDV rainfall product was only fair when compared with observed data, but it is an appropriate tool for ungauged catchments when used with flood hydrograph and river analysis models for more efficient future flood relief planning and management.
Keywords: Global rainfall, flood forecasting, hydrologic modeling system, river analysis system.
5996 An Effort at Improving Reliability of Laboratory Data in Titrimetric Analysis for Zinc Sulphate Tablets Using Validated Spreadsheet Calculators
Authors: M. A. Okezue, K. L. Clase, S. R. Byrn
Abstract:
The requirement to maintain data integrity in laboratory operations is critical for regulatory compliance, and automation of procedures reduces the incidence of human error. Quality control laboratories located in low-income economies may face barriers in attempts to automate their processes. Since data from quality control tests on pharmaceutical products are used in making regulatory decisions, it is important that laboratory reports are accurate and reliable. Zinc Sulphate (ZnSO4) tablets are used in the treatment of diarrhea in the pediatric population and as an adjunct therapy in COVID-19 regimens. Zinc content in these formulations is determined titrimetrically, a manual analytical procedure. The assay for ZnSO4 tablets involves time-consuming steps that contain mathematical formulae prone to calculation errors. To achieve consistency, save costs, and improve data integrity, validated spreadsheets were developed to simplify the two critical steps in the analysis of ZnSO4 tablets: standardization of the 0.1 M Sodium Edetate (EDTA) solution, and the complexometric titration assay procedure. The assay method in the United States Pharmacopoeia was used to create a process flow for ZnSO4 tablets. For each step in the process, formulae were entered into two spreadsheets to automate the calculations, and further checks were built into the automated system to ensure the validity of replicate analyses in the titrimetric procedures. Validation was conducted using five data sets of manually computed assay results, and the acceptance criteria set for the protocol were met. Significant p-values (p < 0.05, α = 0.05, at 95% confidence level) were obtained from Student's t-test evaluation of the mean values for manually calculated and spreadsheet results at all levels of the analysis flow. Right-first-time analysis and principles of data integrity were enhanced by the use of the validated spreadsheet calculators in titrimetric evaluations of ZnSO4 tablets, and human calculation errors were minimized when procedures were automated in the quality control laboratory. The assay procedure for the formulation was completed in a time-efficient manner with a greater level of accuracy, and the project is expected to promote cost savings for laboratory business models.
Keywords: Data integrity, spreadsheets, titrimetry, validation, zinc sulphate tablets.
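The validation step described above compares manually computed assay results against spreadsheet outputs. A minimal sketch with a paired Student's t-test follows; the five value pairs are hypothetical stand-ins, not the study's data sets.

```python
# Paired t-test comparing manual and spreadsheet assay calculations.
# The values below are hypothetical, not the study's data.
from scipy import stats

manual = [99.2, 100.1, 98.7, 99.8, 100.4]       # % label claim, manual calculation
spreadsheet = [99.3, 100.0, 98.8, 99.7, 100.5]  # same samples, spreadsheet result

t_stat, p_value = stats.ttest_rel(manual, spreadsheet)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# At alpha = 0.05, p < 0.05 indicates a statistically detectable difference
# between the two calculation routes for these (hypothetical) values.
```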
5995 Real-Time Identification of Media in a Laboratory-Scaled Penetrating Process
Authors: Sheng-Hong Pong, Herng-Yu Huang, Yi-Ju Lee, Shih-Hsuan Chiu
Abstract:
In this paper, a neural network technique is applied to classify, in real time, the media a projectile is penetrating. A laboratory-scaled penetration setup was built for the experiment. The features used as network inputs were extracted from the acceleration of the penetrator. 6000 sets of features from a single penetration with known media and status were used to train the neural network, and the trained system was tested on 30 further penetration experiments. The system produced an accuracy of 100% on the training data set, and its precision reached 99% on the test data from the 30 tests.
Keywords: Back-propagation, identification, neural network, penetration.
5994 Arriving at an Optimum Value of Tolerance Factor for Compressing Medical Images
Authors: Sumathi Poobal, G. Ravindran
Abstract:
Medical imaging exploits digital technology in imaging and teleradiology. Teleradiology systems acquire, store and transmit large amounts of data, and a major technology that may help solve the problems associated with massive data storage and transfer capacity is image compression and decompression. Many image compression methods are available, classified as lossless or lossy; in a lossy method, the decompressed image contains some distortion. Fractal image compression (FIC) is a lossy method in which an image is coded as a set of contractive transformations in a complete metric space; this set of transformations is guaranteed to produce an approximation to the original image. In this paper, FIC is achieved by PIFS using quadtree partitioning. PIFS is applied to different modalities: ultrasound, CT scan, angiogram, X-ray and mammogram. In each modality, approximately twenty images are considered and the average compression ratio and PSNR values are computed. In this fractal encoding method, the tolerance factor Tmax is varied from 1 to 10, keeping the other standard parameters constant. For all image modalities, the compression ratio and Peak Signal to Noise Ratio (PSNR) are computed and studied; the quality of the decompressed image is assessed by its PSNR value. The results show that the compression ratio increases with the tolerance factor and that mammograms have the highest compression ratio. Owing to the properties of fractal compression, image quality is not degraded up to an optimum tolerance factor of Tmax = 8.
Keywords: Fractal image compression, IFS, PIFS, PSNR, quadtree partitioning.
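The PSNR figure used above to score decompressed medical images can be computed as follows. This is a generic formulation assuming 8-bit images (MAX = 255), not code from the paper.

```python
# PSNR between an original and a decompressed 8-bit image.
import numpy as np

def psnr(original: np.ndarray, decompressed: np.ndarray) -> float:
    """Peak Signal-to-Noise Ratio in dB between two same-sized images."""
    mse = np.mean((original.astype(np.float64) - decompressed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(255.0 ** 2 / mse)

# Example with a random 8-bit "image" and a lightly perturbed copy:
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
noisy = np.clip(img.astype(int) + rng.integers(-5, 6, (64, 64)), 0, 255)
print(f"PSNR: {psnr(img, noisy):.2f} dB")
```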
5993 Optimizing Logistics for Courier Organizations with Considerations of Congestions and Pickups: A Courier Delivery System in Amman as Case Study
Authors: Nader A. Al Theeb, Zaid Abu Manneh, Ibrahim Al-Qadi
Abstract:
The traveling salesman problem (TSP) is a combinatorial integer optimization problem that asks: "What is the optimal route for a vehicle to traverse in order to deliver requests to a given set of customers?" It is widely used by package carriers' distribution centers. The main goal of applying the TSP in courier organizations is to minimize the time it takes the courier on each trip to deliver or pick up the shipments during a day. In this article, an optimization model is constructed as a new TSP variant to optimize routing in a courier organization while accounting for congestion in Amman, the capital of Jordan. Real data were collected by different methods and analyzed. Concert Technology (CPLEX) was then used to solve the proposed model for randomly generated data instances and for the real collected data. The results showed a great improvement in time compared with the current trip times, and a subsequent economic study quantified the impact of using such models.
Keywords: Traveling salesman problem, congestion, pick-up, integer programming, package carriers, service engineering.
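The model above is an integer program solved with CPLEX; since that formulation is not given here, the sketch below instead illustrates the underlying routing problem with a simple nearest-neighbour heuristic on a travel-time matrix, where congestion could be approximated by inflating the affected matrix entries. The matrix values are hypothetical.

```python
# Not the authors' CPLEX model: a nearest-neighbour heuristic for the TSP
# on a travel-time matrix, for illustration only.
import numpy as np

def nearest_neighbour_tour(times: np.ndarray, start: int = 0) -> list[int]:
    n = len(times)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: times[last, j])  # greedy next stop
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(start)  # return to the distribution center
    return tour

# Hypothetical symmetric travel-time matrix (minutes) for five stops:
t = np.array([[0, 12, 9, 20, 7],
              [12, 0, 6, 14, 15],
              [9, 6, 0, 11, 10],
              [20, 14, 11, 0, 18],
              [7, 15, 10, 18, 0]], dtype=float)
tour = nearest_neighbour_tour(t)
print(tour, sum(t[a, b] for a, b in zip(tour, tour[1:])))
```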
5992 Study on Radio Link Availability in Millimeter Wave Range
Authors: Boncho G. Bonev, Kliment N. Angelov, Emil S. Altimirski
Abstract:
In this paper, the link quality in the SHF and EHF ranges is studied. In order to achieve high data rates, higher frequencies must be used: centimeter waves (SHF), millimeter waves (EHF) or the optical range. However, there are significant problems when a radio link operates in these bands: rain attenuation and attenuation in the Earth's atmosphere. Based on statistical rain rate data for Bulgaria, the link availability can be determined as a function of the working frequency, the path length and the power budget of the link. The ITU recommendations are used for the calculations of rain attenuation and atmospheric attenuation.
Keywords: Rain attenuation, atmospheric gaseous attenuation, link availability, link breaking probability.
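The rain attenuation calculation referenced above typically follows the ITU-R P.838 form, in which the specific attenuation is gamma = k * R^alpha dB/km for a rain rate R in mm/h. A minimal sketch follows; the k and alpha values are illustrative placeholders, not the tabulated ITU coefficients for any particular frequency or polarization.

```python
# ITU-R P.838-style specific rain attenuation: gamma = k * R**alpha (dB/km).
# k and alpha below are illustrative placeholders, not ITU table values.

def rain_attenuation_db(rain_rate_mm_h: float, path_km: float,
                        k: float = 0.19, alpha: float = 1.02) -> float:
    """Total rain attenuation over the path in dB (effective-path-length
    reduction factors omitted for brevity)."""
    gamma = k * rain_rate_mm_h ** alpha  # specific attenuation, dB/km
    return gamma * path_km

# Example: 2 km link at a hypothetical 40 mm/h rain rate:
print(f"{rain_attenuation_db(40.0, 2.0):.1f} dB")
```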
5991 Estimating the Runoff Using the Simple Tank Model and Comparing it with the SCS-CN Model - A Case Study of the Dez River Basin
Authors: H. Alaleh, N. Hedayat, A. Alaleh, H. Ayazi, A. Ruhani
Abstract:
Runoff is an important hydrological factor in feasibility studies for river engineering and irrigation projects under arid and semi-arid conditions. Flood control is one of the crucial factors: its management, while mitigating destructive consequences, also abstracts a considerable volume of renewable water resources. The methodology applied here was based on Mizumura, who applied a mathematical simple-tank model to simulate the rainfall-runoff process in a water basin using data from the observed hydrograph. The model was applied in the Dez River basin adjacent to the Greater Dezful region, Iran, in order to simulate and estimate floods. Results indicated that the hydrographs calculated using the simple tank method, the SCS-CN model, and the observed hydrographs were in close agreement. It was also found that, on average, the flood peak time and peak discharge from the simple tank were closer to the observed data than those from the CN method; on the other hand, the flood volume calculated by the CN model was significantly closer to the observed data than that of the simple tank model.
Keywords: Simple tank, Dez River, run-off, lag time, excess rainfall.
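A single linear tank captures the core of the rainfall-runoff idea above: storage fills with excess rainfall and drains in proportion to its content. The sketch below is a simplified one-tank illustration, not Mizumura's full model, and its parameters are arbitrary.

```python
# One linear tank: storage filled by excess rainfall, outflow Q = k * S.
# A simplified illustration; real tank models use several tanks and outlets.

def simple_tank(rainfall_mm: list[float], k: float = 0.3, s0: float = 0.0) -> list[float]:
    """Return simulated runoff (mm per time step) for a rainfall series."""
    storage, runoff = s0, []
    for p in rainfall_mm:
        storage += p       # add excess rainfall to the tank
        q = k * storage    # linear outflow
        storage -= q
        runoff.append(q)
    return runoff

print(simple_tank([0, 5, 20, 10, 2, 0, 0]))  # hypothetical storm hyetograph
```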
5990 Increasing Value Added of Recycling Business Management: A Case of Thailand
Authors: Yananda Siraphatthada
Abstract:
This policy participation action research explores the roles of Thai government units during the 2010 fiscal year in creating value added for the recycling business in the central part of Thailand. The research aims to a) study how the government supports the business, together with the problems and obstacles to that support, and b) design a strategic action plan (short, medium, and long term) to create value added for the recycling business, particularly for local full-loop companies/organizations licensed by the Wongpanit Waste Separation Plant as well as those licensed by the Department of Provincial Administration. A mixed-method research design, i.e., a combination of quantitative and qualitative methods, is utilized in both the data collection and the analysis procedures. Quantitative data were analyzed by frequency, percentage, mean scores, and standard deviation to identify trends and generalizations; qualitative data were collected via semi-structured and focus group interviews to explore the in-depth views of the operators. The sample included 1,079 operators in eight provinces in the central part of Thailand.
Keywords: Management, recycling business, value added.
5989 Assessing the Theoretical Suitability of Sentinel-2 and WorldView-3 Data for Hydrocarbon Mapping of Spill Events, Using HYSS
Authors: K. Tunde Olagunju, C. Scott Allen, F.D. (Freek) van der Meer
Abstract:
Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection, in order to plan an appropriate cleanup program. Identification on optical sensors allows not only detection but also characterization and quantification. Until recently, quantification and characterization in optical remote sensing were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is at present mainly obtained via airborne survey. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the Hydrocarbon Spectra Slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. Spectral measurements of seven different hydrocarbon oils (crude and refined) taken on 10 different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolution using the sensors' full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the statistical significance of the qualitative analysis on the WorldView-3 sensor for all studied hydrocarbon oils was returned at a 95% confidence level (p-value < 0.01), except for diesel. Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations on Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of its spectral bands with the key hydrocarbon absorptions and its relatively coarse bandwidths (> 100 nm).
Keywords: Hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon-substrate combination, Sentinel-2, WorldView-3.
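The convolution to sensor resolution described above can be approximated by weighting the laboratory spectrum with a Gaussian response derived from each band's center wavelength and FWHM. The sketch below uses a hypothetical SWIR band near the 1.73 µm feature, not the published Sentinel-2 or WorldView-3 band parameters, and a synthetic stand-in spectrum.

```python
# Convolve a lab spectrum to one multispectral band via a Gaussian response
# defined by band center and FWHM. Band parameters here are illustrative.
import numpy as np

def band_value(wavelengths_nm, reflectance, center_nm, fwhm_nm):
    """Weighted-average reflectance of one sensor band."""
    sigma = fwhm_nm / 2.355  # FWHM -> Gaussian sigma
    w = np.exp(-0.5 * ((wavelengths_nm - center_nm) / sigma) ** 2)
    return np.sum(w * reflectance) / np.sum(w)

wl = np.arange(350, 2501, 1.0)  # ASD FieldSpec-like wavelength grid, nm
spectrum = np.interp(wl, [350, 1730, 2300, 2500], [0.5, 0.35, 0.30, 0.45])
print(band_value(wl, spectrum, center_nm=1730, fwhm_nm=40))  # hypothetical SWIR band
```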
5988 Interest of the Sequences Pseudo Noises Codes of Different Lengths for the Reduction from the Interference between Users of CDMA Network
Authors: Nerguè Kassahan Kone, Souleymane Oumtanaga
Abstract:
The third generation (3G) of cellular systems adopted spread spectrum as the solution for data transmission in the physical layer. Unlike IS-95 or CDMAOne (spread-spectrum systems of the preceding generation), the new standard, called the Universal Mobile Telecommunications System (UMTS), uses long codes in the downlink. The system is designed for voice communication and data transmission. The downlink is particularly important because of the asymmetry of data demand, i.e., more traffic toward the mobiles than toward the base station. Moreover, the UMTS downlink uses orthogonal spreading with a variable spreading factor (OVSF, Orthogonal Variable Spreading Factor). This characteristic makes it possible to increase the data rate of one or more users by reducing their spreading factor without changing the spreading factors of the other users. In the current UMTS standard, two techniques to increase downlink performance have been proposed, transmit antenna diversity and space-time codes, but these two techniques combat only fading. The receiver proposed for the mobile station is the RAKE, but one can imagine a more sophisticated receiver, able to reduce the interference between users and the impact of colored noise and narrowband interference. In this context, where users have synchronized long codes with variable spreading factors and the mobile has no knowledge of the other active codes/users, the use of pseudo-noise code sequences of different lengths is presented as one of the most appropriate solutions.
Keywords: DS-CDMA, multiple access interference, signal-to-interference-plus-noise ratio.
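The OVSF construction mentioned above generates, from each code c of length N, two orthogonal children (c, c) and (c, -c) of length 2N, which is what lets users run at different spreading factors simultaneously. A minimal sketch, assuming a power-of-two spreading factor:

```python
# Generate the full set of OVSF codes for a power-of-two spreading factor:
# each code c spawns the two children c+c and c+(-c).

def ovsf_codes(spreading_factor: int) -> list[list[int]]:
    codes = [[1]]
    while len(codes[0]) < spreading_factor:
        codes = [child for c in codes for child in (c + c, c + [-x for x in c])]
    return codes

for code in ovsf_codes(4):
    print(code)
# [1,1,1,1], [1,1,-1,-1], [1,-1,1,-1], [1,-1,-1,1] -- mutually orthogonal.
```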
5987 Topology Preservation in SOM
Authors: E. Arsuaga Uriarte, F. Díaz Martín
Abstract:
The SOM has several beneficial features that make it a useful method for data mining. One of the most important is its ability to preserve topology in the projection. Several measures can be used to quantify the goodness of the map in order to obtain the optimal projection, including the average quantization error and various topological errors, and many researchers have studied how topology preservation should be measured. One option is the topographic error, which considers the ratio of data vectors for which the first and second best matching units (BMUs) are not adjacent. In this work we present a study of the behaviour of the topographic error in different kinds of maps. We have found that this error penalizes rectangular maps, and we have studied the reasons why this happens. Finally, we suggest a new topological error that corrects this deficiency of the topographic error.
Keywords: Map lattice, Self-Organizing Map, topographic error, topology preservation.
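The topographic error discussed above is the fraction of data vectors whose first and second BMUs are not adjacent on the map lattice. A minimal sketch, assuming a rectangular grid with 8-neighbour adjacency:

```python
# Topographic error: fraction of samples whose two best matching units
# are not lattice neighbours (8-neighbour adjacency on a rectangular grid).
import numpy as np

def topographic_error(data: np.ndarray, weights: np.ndarray,
                      grid: np.ndarray) -> float:
    """data: (n, d) samples; weights: (m, d) codebook; grid: (m, 2) unit coords."""
    errors = 0
    for x in data:
        d = np.linalg.norm(weights - x, axis=1)
        bmu1, bmu2 = np.argsort(d)[:2]
        if np.max(np.abs(grid[bmu1] - grid[bmu2])) > 1:  # not adjacent on the map
            errors += 1
    return errors / len(data)
```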
5986 RFID Logistic Management with Cold Chain Monitoring – Cold Store Case Study
Authors: Mira Trebar
Abstract:
Logistics processes for perishable food in the supply chain include distribution activities and real-time temperature monitoring to fulfil cold chain requirements. The paper presents the use of RFID (Radio Frequency Identification) technology as an identification tool for receiving and shipping activities in a cold store. At the same time, RFID data loggers with temperature sensors are used to observe and store temperatures for the purposes of analyzing the processes and keeping history data available for traceability and efficient recall management.
Keywords: Logistics, warehouse, RFID device, cold chain.
5985 Exploring the Challenges to Usage of Building and Construction Cost Indices in Ghana
Authors: J. J. Gyimah, E. Kissi, S. Osei-Tutu, C. D. Adobor, T. Adjei-Kumi, E. Osei-Tutu
Abstract:
Price fluctuation clauses are imperative in the construction industry, as they provide adequate relief and cushioning against changes in the prices of input resources during construction. As a result, several methods have been devised to help arrive at fair recompense in the event of price changes. However, stakeholders often appear unsatisfied with the existing methods of fluctuation evaluation, ostensibly because of the challenges associated with them. The aim of this study was to identify the challenges to the usage of building construction cost indices in Ghana. Data were gathered from contractors and quantity surveying firms using a survey questionnaire, and were analyzed using the Relative Importance Index (RII) to rank the problems associated with the existing methods. The findings revealed, among others, late release of data, inadequate recovery of costs, and work items of interest not being included in the published indices as the main challenges of the existing methods. The findings provide useful lessons for policy makers and practitioners in decision making toward the usage and improvement of the available indices.
Keywords: Building construction cost indices, challenges, usage, Ghana.
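The RII used above is conventionally computed as RII = sum(W) / (A * N), where W are the respondents' Likert ratings, A is the highest possible rating and N the number of respondents. A minimal sketch with hypothetical ratings:

```python
# Relative Importance Index: RII = sum(W) / (A * N).

def rii(ratings: list[int], highest: int = 5) -> float:
    return sum(ratings) / (highest * len(ratings))

# Hypothetical ratings for one challenge, e.g. "late release of data":
print(round(rii([5, 4, 5, 3, 4, 5, 4]), 3))
```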
5984 Bureau Management Technologies and Information Systems in Developing Countries
Authors: Mehmet Altınöz
Abstract:
This study focuses on bureau management technologies and information systems in developing countries. Such systems facilitate executive and organizational functions through the utilization of bureau management technologies and provide the executive staff with necessary information. The concepts of data and information differ from each other, and thus so do the concepts of data processing and information processing. Symbols represent ideas, objects, figures, letters and numbers. A data processing system is an integrated system that processes data related to the internal and external environment of the organization in order to make decisions, create plans and develop strategies; it goes without saying that this system is composed of both human beings and machines. Information is obtained through the acquisition and processing of data, whereas data are raw communicative messages; within this framework, data processing amounts to producing plausible information out of raw data. Organizations in developing countries need to obtain information relevant to them, because rapid changes in the organizational arena require rapid access to accurate information. The most significant role of the directors and managers who work in the organizational arena is to make decisions, and making a correct decision is possible only when they are equipped with sound ideas and appropriate information; therefore, the acquisition, organization and distribution of information gain significance. Today's organizations use computer-assisted "Management Information Systems" to obtain and distribute information. The Decision Support System, which is closely related to practice, is an information system that facilitates the director's task of making decisions; it integrates human intelligence, information technology and software in order to solve complex problems, and with the support of computer technology and software systems it produces information relevant to the decision to be made and provides the executive staff with supporting ideas. Artificial intelligence programs that transfer the studies and experiences of people to the computer are called expert systems; an expert system stores expert information in a limited domain and can solve problems by deriving rational consequences. Bureau management technologies and information systems in developing countries create a kind of information society and information economy that give those countries their place in the global socio-economic structure and enable them to play a reasonable and fruitful role; it is therefore of crucial importance to make use of information and management technologies in order to work with innovative and enterprising individuals, and to create "scientific policies" based on information and technology in the fields of economy, politics, law and culture.
Keywords: Bureau management, information systems.
5983 Opportunities for Precision Feed in Apiculture for Managing the Efficacy of Feed and Medicine
Authors: John Michael Russo
Abstract:
Honeybees are important to our food system and continue to suffer high rates of colony loss. Precision feed has brought many benefits to livestock cultivation, and these should transfer to apiculture; however, apiculture has unique challenges. The objective of this research is to understand how principles of precision agriculture, applied to apiculture and feed specifically, might effectively improve the state of the art in cultivation. The methodology surveys apicultural practice to build a model for assessment: first, a review of apicultural motivators is made; feed method is then evaluated; finally, precision feed methods are examined as accelerants with the potential to advance the effectiveness of feed practice. Six important motivators emerge: colony loss, disease, climate change, site variance, operational costs, and competition. Feed practice itself is used to compensate for environmental variables. The research finds that the current state of the art in apiculture feed focuses on critical challenges in the management of feed schedules that satisfy the requirements of the bees, preserve potency, optimize environmental variables, and manage costs. Many of the challenges are most acute when feed is used to dispense medication, and technologies such as RNA treatments have even more rigorous demands. Precision feed solutions focus on strategies that accommodate the specific needs of individual livestock; a major component is data, integrating precise measurements with methods that respond to individual needs. There is enormous opportunity for precision feed to improve apiculture through the integration of precision data with policies that translate data into optimized action in the apiary, particularly through automation.
Keywords: Apiculture, precision apiculture, RNA varroa treatment, honeybee feed applications.
5982 Automation of Heat Exchanger using Neural Network
Authors: Sudhir Agashe, Ashok Ghatol, Sujata Agashe
Abstract:
In this paper, the development of a heat exchanger as a pilot plant for educational purposes is discussed and the use of a neural network for controlling the process is presented. The aim of the study is to highlight the need for a specific Pseudo Random Binary Sequence (PRBS) to excite a process under control. As the neural network is a data-driven technique, the method of data generation plays an important role, and in light of this a careful experimental procedure for data generation was a crucial task. Heat exchange is a complex process, with capacity and time lag as process elements. The proposed system is a typical pipe-in-pipe heat exchanger, whose complexity demands careful selection, proper installation and commissioning. The temperature, flow, and pressure sensors play a vital role in control performance, and the final control element is a pneumatically operated control valve. While carrying out the experimentation on the heat exchanger, a well-drafted procedure was followed with utmost attention to the safety of the system. The results obtained are encouraging and reveal that, if the process parameters are completely known and the utilities are well stabilized, then feedback systems are suitable, whereas the neural network control paradigm is useful for processes with nonlinearity and less process knowledge. The implementation of NN control reinforces the concepts of process control and the NN control paradigm, and the results also underline the importance of the excitation signal chosen for the process. Data acquisition, processing, and presentation in a typical format are the most important considerations when validating the results.
Keywords: Process identification, neural network, heat exchanger.
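A PRBS of the kind called for above is commonly generated with a linear feedback shift register; the sketch below uses a 7-bit register with taps at bits 7 and 6 (polynomial x^7 + x^6 + 1), which yields a maximal-length sequence of 127 bits. The mapping of bits to actuator levels is illustrative.

```python
# PRBS7 generator: 7-bit LFSR, taps at bits 7 and 6 (x^7 + x^6 + 1),
# producing a maximal-length sequence of 127 bits.

def prbs7(length: int, seed: int = 0x7F) -> list[int]:
    state, out = seed & 0x7F, []
    for _ in range(length):
        bit = ((state >> 6) ^ (state >> 5)) & 1   # XOR of bits 7 and 6
        out.append(state & 1)                      # emit current LSB
        state = ((state << 1) | bit) & 0x7F
    return out

# Map bits to two excitation levels around the valve's operating point:
signal = [1.0 if b else -1.0 for b in prbs7(127)]
```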
5981 Reflectance Imaging Spectroscopy Data (Hyperspectral) for Mineral Mapping in the Orientale Basin Region on the Moon Surface
Authors: V. Sivakumar, R. Neelakantan
Abstract:
Mineral mapping of the Moon's surface provides clues to understanding the origin, evolution, stratigraphy and geological history of the Moon. Reflectance imaging spectroscopy plays a significant role in identifying minerals on planetary surfaces in the visible to NIR region of the electromagnetic spectrum. The Moon Mineralogy Mapper (M3) onboard Chandrayaan-1 provides unprecedented spectral data of the lunar surface. Here we used the M3 sensor data (hyperspectral imaging spectroscopy) to analyse the mineralogy of the Orientale basin region. Reflectance spectra were sampled from different locations in the basin, and the continuum was removed using the ENvironment for Visualizing Images (ENVI) software. Reflectance spectra of unknown mineral composition were compared with known Reflectance Experiment Laboratory (RELAB) spectra to discriminate mineralogy. Minerals such as olivine, low-Ca pyroxene (LCP), high-Ca pyroxene (HCP) and plagioclase were identified. In addition to these minerals, an unusual spectral signature was identified, indicating a probable Fe-Mg-spinel lithology in the basin region.
Keywords: Chandrayaan-1, Moon Mineralogy Mapper, Orientale basin, Moon, spectroscopy, hyperspectral.
5980 Cryptanalysis of Yang-Li-Liao’s Simple Three-Party Key Exchange (S-3PAKE) Protocol
Authors: Hae-Soon Ahn, Eun-Jun Yoon
Abstract:
Three-party password-authenticated key exchange (3PAKE) protocols are widely deployed in remote user authentication systems due to their simplicity and the convenience of maintaining a human-memorable password at the client side to achieve secure communication over a hostile network. Recently, some researchers proposed an improved 3PAKE protocol that attaches built-in data to the other party for identity authentication. However, this paper points out that the improved 3PAKE protocol is still vulnerable to undetectable on-line dictionary attacks and off-line dictionary attacks.
Keywords: Three-party key exchange, 3PAKE, password-authenticated key exchange, network security, dictionary attack.
5979 Meteorological Data Study and Forecasting Using Particle Swarm Optimization Algorithm
Authors: S. Esfandeh, M. Sedighizadeh
Abstract:
Weather systems use enormously complex combinations of numerical tools for study and forecasting. Unfortunately, due to phenomena in the world climate, such as the greenhouse effect, classical models may become insufficient, mostly because they lack adaptation. The weather forecast problem is therefore well suited to heuristic approaches, such as evolutionary algorithms. Experimentation with heuristic methods like the Particle Swarm Optimization (PSO) algorithm can lead to the development of new insights or promising models that can be fine-tuned with more focused techniques. This paper describes a PSO approach for the analysis and prediction of data and provides experimental results of the aforementioned method on real-world meteorological time series.
Keywords: Weather, climate, PSO, prediction, meteorological.
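A minimal PSO sketch on a toy objective follows, showing the velocity and position updates such models rely on; the hyperparameters are common textbook values, not those of the paper.

```python
# Minimal particle swarm optimization on a toy objective (sphere function).
import numpy as np

def pso(objective, dim=2, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (swarm, dim))   # particle positions
    v = np.zeros((swarm, dim))             # particle velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random((swarm, dim)), rng.random((swarm, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, min(pbest_val)

print(pso(lambda p: np.sum(p ** 2)))       # converges toward the origin
```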
5978 Classification of Right and Left-Hand Movement Using Multi-Resolution Analysis Method
Authors: Nebi Gedik
Abstract:
The aim of brain-computer interface studies on electroencephalogram (EEG) signals containing motor imagery is to extract effective features that provide the highest possible classification accuracy for the detection of the desired motor movement. Achieving this goal is difficult, however, as the most suitable frequency band and time frame vary from subject to subject. In this study, the classification success of two-feature data obtained from raw EEG signals was compared with that of the coefficients of the multi-resolution analysis method applied to the EEG signals. The method was applied to signals from several EEG channels (C3, Cz and C4) obtained from the EEG data set of the publicly available BCI competition III.
Keywords: Motor imagery, EEG, wave atom transform, k-NN.
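The k-NN classifier named in the keywords assigns each trial's feature vector to the majority class among its k nearest training samples. A minimal sketch with random stand-in data, not the BCI competition set:

```python
# k-NN by majority vote among the k nearest training samples.
import numpy as np
from collections import Counter

def knn_predict(train_x, train_y, x, k=5):
    dists = np.linalg.norm(train_x - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(train_y[nearest]).most_common(1)[0][0]

rng = np.random.default_rng(1)
train_x = rng.normal(size=(100, 2))            # two features per trial
train_y = (train_x[:, 0] > 0).astype(int)      # 0 = left, 1 = right (toy labels)
print(knn_predict(train_x, train_y, np.array([0.8, -0.1])))
```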
5977 Analysis of Statistical Data on Social Resources Dimension of Occupational Status Attainment: A Rational Choice Approach
Authors: Oleg Demchenko
Abstract:
The aim of the present study is to analyze empirical research on the social resources dimension of the occupational status attainment process and relate it to the rational choice approach. The analysis suggests that the existing data on the strength-of-ties aspect of social resources are insufficient and do not allow any implications concerning a rational actor's behavior; however, the results concerning the work-relation aspect are more encouraging.
Keywords: Social resources, status attainment, rational choice, weak ties, work-related ties.
5976 A Diagnostic Fuzzy Rule-Based System for Congenital Heart Disease
Authors: Ersin Kaya, Bulent Oran, Ahmet Arslan
Abstract:
In this study, a fuzzy rule-based classifier is used for the diagnosis of congenital heart disease, defined as structural or functional heart disease. Medical data sets were obtained from the Pediatric Cardiology Department at Selcuk University for the years 2000 to 2003. First, fuzzy rules were generated from the medical data; then the weights of the fuzzy rules were calculated. Two different reasoning methods, the "weighted vote method" and the "single winner method", were used in this study, and the results of the fuzzy classifiers were compared.
Keywords: Congenital heart disease, fuzzy rule-based classifiers, classification.
5975 Part of Speech Tagging Using Statistical Approach for Nepali Text
Authors: Archit Yajnik
Abstract:
Part-of-speech tagging has always been a challenging task in the era of natural language processing. This article presents POS tagging for Nepali text using a Hidden Markov Model (HMM) and the Viterbi algorithm. The annotated Nepali corpus is randomly separated into training and testing data sets, and both methods are employed on the data sets. The Viterbi algorithm is found to be computationally faster and more accurate compared to HMM, with an accuracy of 95.43% achieved using the Viterbi algorithm. An error analysis of where the mismatches took place is discussed in detail.
Keywords: Hidden Markov model, Viterbi algorithm, POS tagging, natural language processing.
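The Viterbi decoding described above recovers the most probable tag sequence from log-probability tables estimated on the annotated corpus. A minimal sketch, assuming the tables are given:

```python
# Viterbi decoding for HMM-based POS tagging.
import numpy as np

def viterbi(obs, n_tags, log_start, log_trans, log_emit):
    """obs: word indices; log_start: (n_tags,); log_trans: (n_tags, n_tags)
    previous->current; log_emit: (n_tags, vocab). Returns best tag sequence."""
    T = len(obs)
    delta = np.full((T, n_tags), -np.inf)       # best log-score per (time, tag)
    back = np.zeros((T, n_tags), dtype=int)     # backpointers
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans        # (prev_tag, tag)
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], np.arange(n_tags)] + log_emit[:, obs[t]]
    tags = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):               # follow backpointers
        tags.append(int(back[t, tags[-1]]))
    return tags[::-1]
```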
5974 High Capacity Data Hiding based on Predictor and Histogram Modification
Authors: Hui-Yu Huang, Shih-Hsu Chang
Abstract:
In this paper, we propose a high-capacity image hiding technique based on pixel prediction and the difference of the modified histogram. The approach uses pixel prediction and the modified histogram difference to find the best embedding point; this improves the predictive accuracy and increases the pixel difference, advancing the hiding capacity. We also use histogram modification to prevent overflow and underflow. Experimental results demonstrate that, at the same average hiding capacity, our proposed method still keeps high image quality and low distortion.
Keywords: Data hiding, predictor.
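Histogram-modification hiding generally works by locating a peak bin and a near-empty bin, shifting the bins between them, and embedding one bit per peak-valued pixel. The sketch below is a generic illustration of that shift-and-embed step, not the authors' predictor-based scheme, and assumes the chosen valley bin is empty (required for exact reversibility).

```python
# Generic histogram-shifting embedding: free the bin next to the peak,
# then encode one bit per peak-valued pixel (stay at P for 0, move to P+1 for 1).
import numpy as np

def embed(img: np.ndarray, bits: list[int]) -> tuple[np.ndarray, int]:
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(np.argmax(hist))
    zero = peak + 1 + int(np.argmin(hist[peak + 1:]))  # emptiest bin right of peak
    out = img.copy().astype(int)
    out[(out > peak) & (out < zero)] += 1              # shift to free peak+1
    it = iter(bits)
    for idx in zip(*np.where(out == peak)):            # embed at peak pixels
        b = next(it, None)
        if b is None:
            break
        out[idx] += b
    return out.astype(np.uint8), peak                  # peak acts as the key

rng = np.random.default_rng(0)
stego, key = embed(rng.integers(0, 200, (32, 32), dtype=np.uint8), [1, 0, 1, 1])
```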
5973 Optimizing Data Evaluation Metrics for Fraud Detection Using Machine Learning
Authors: Jennifer Leach, Umashanger Thayasivam
Abstract:
The use of technology has benefited society in more ways than one ever thought possible. Unfortunately, as society's knowledge of technology has advanced, so has its knowledge of ways to use technology to manipulate others, leading to a simultaneous advancement in the world of fraud. Machine learning techniques offer a possible solution to help counter these advancements. This research explores how various machine learning techniques can aid in detecting fraudulent activity across two different types of fraudulent datasets; the accuracy, precision, recall, and F1 score were recorded for each method. Each machine learning model was also tested across five different training and testing splits in order to discover which split and technique would lead to the optimal results.
Keywords: Data science, fraud detection, machine learning, supervised learning.
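A minimal sketch of the evaluation loop described above: one classifier scored on accuracy, precision, recall and F1 across five train/test splits. The imbalanced synthetic dataset and the choice of model are stand-ins, not the paper's fraud data or methods.

```python
# Score one model on four metrics across five different train/test splits.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split

# Imbalanced synthetic data as a fraud-like stand-in (95% legitimate class):
X, y = make_classification(n_samples=2000, weights=[0.95], random_state=0)

for seed in range(5):  # five different splits
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=seed)
    pred = RandomForestClassifier(random_state=0).fit(X_tr, y_tr).predict(X_te)
    print(seed, accuracy_score(y_te, pred), precision_score(y_te, pred),
          recall_score(y_te, pred), f1_score(y_te, pred))
```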
5972 Sustainable Perspectives and Local Development Potential through Tourism
Authors: Pedro H. S. Messetti, Mary L. G. S. Senna, Afonso R. Aquino
Abstract:
Sustainability is an important and heavily discussed subject, and it extends to tourism as well. The proposition of the study was to collect data and present it to the competent bodies so they can shape their public policies to improve the conditions of the site. It was hypothesized that the lack of data is currently affecting the quality of life, the sustainable development of the site, and the tourism. The research was conducted in Mateiros, a city in the state of Tocantins (TO), Brazil, near Palmas, the state capital; Mateiros was chosen because of the concentration of tourists during the high season and the several tourist attractions nearby. The methodological procedure followed a script of theoretical construction and investigation under the parameters of the deductive scientific method, through a case study in the Jalapão/TO/Brazil region, using a questionnaire given to the competent bodies in an interview format with the UN sustainability indexes as a base. Across the three sustainable development scopes, environmental, social and economic, the results indicated that the data presented by the interviewees were scarce or nonexistent. This shows that more research is necessary to provide the tools for those responsible to propose action plans to improve the site, strengthening tourism and making it even more sustainable.
Keywords: Jalapão/Brazil State Park, sustainable tourism, UN sustainability indexes.
5971 Cluster Algorithm for Genetic Diversity
Authors: Manpreet Singh, Keerat Kaur, Bhavdeep Singh
Abstract:
With hardware technology advancing, the cost of storage is decreasing, and there is an urgent need for new techniques and tools that can intelligently and automatically assist in transforming this data into useful knowledge. Different data mining techniques have been developed to handle these large databases [7], and data mining is also finding a role in the field of biotechnology. A pedigree is the associated ancestry of a crop variety, and genetic diversity is the variation in the genetic composition of individuals within or among species; genetic diversity depends on the pedigree information of the varieties. Parents at lower hierarchic levels carry more weight for predicting genetic diversity than those at upper hierarchic levels, with the weight decreasing as the level increases. For crossbreeding, the two parent varieties should be as genetically diverse as possible, so as to incorporate the useful characters of both in the newly developed variety. This paper discusses searching and analyzing the possible pairs of varieties, selected on the basis of morphological characters, climatic conditions and nutrients, to obtain the optimal pair that can produce the required crossbred variety. An algorithm was developed to determine the genetic diversity between the selected wheat varieties, and the cluster analysis technique was used to retrieve the results.
Keywords: Genetic diversity, pedigree, nutrients.
5970 Multimodal Biometric Authentication Using Choquet Integral and Genetic Algorithm
Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara
Abstract:
The Choquet integral is a tool for information fusion that is very effective when the fuzzy measures associated with it are well chosen. In this paper, we propose a new approach for calculating the fuzzy measures associated with the Choquet integral in the context of data fusion in multimodal biometrics. The proposed approach is based on genetic algorithms. It has been validated on two databases: the first containing synthetic scores and the second containing biometric scores for face, fingerprint and palmprint. The results achieved attest to the robustness of the proposed approach.
Keywords: Multimodal biometrics, data fusion, Choquet integral, fuzzy measures, genetic algorithm.
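For reference, the discrete Choquet integral fuses scores as C = sum_i (x_(i) - x_(i-1)) * mu(A_i), with scores sorted ascending and A_i the coalition of sources scoring at least x_(i). The sketch below uses a hypothetical fuzzy measure on three matchers; in the paper's approach, such measure values are what the genetic algorithm searches for.

```python
# Discrete Choquet integral over matcher scores with a set-function measure.

def choquet(scores: dict[str, float], mu) -> float:
    items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending by score
    total, prev = 0.0, 0.0
    for i, (_, x) in enumerate(items):
        coalition = frozenset(name for name, _ in items[i:])  # sources >= x
        total += (x - prev) * mu(coalition)
        prev = x
    return total

measure = {  # hypothetical fuzzy measure on {face, finger, palm}
    frozenset({"face"}): 0.3, frozenset({"finger"}): 0.4, frozenset({"palm"}): 0.3,
    frozenset({"face", "finger"}): 0.8, frozenset({"face", "palm"}): 0.6,
    frozenset({"finger", "palm"}): 0.7,
    frozenset({"face", "finger", "palm"}): 1.0,
}
print(choquet({"face": 0.7, "finger": 0.9, "palm": 0.5}, measure.get))  # 0.74
```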
5969 Development of Sleep Quality Index Using Heart Rate
Authors: Dongjoo Kim, Chang-Sik Son, Won-Seok Kang
Abstract:
Adequate sleep affects various parts of one's overall physical and mental life. As one method for determining the appropriate amount of sleep, this research presents a heart rate based sleep quality index. To evaluate sleep quality using the heart rate, sleep data from 280 subjects collected over one month are used. The sleep data are categorized into three heart rate ranges; after categorizing, several features are extracted and their statistical significance is verified. The results show that some features of this sleep quality index model are statistically significant, and thus the heart rate based sleep quality index may be a useful discriminator of sleep.
Keywords: Sleep, sleep quality, heart rate, statistical analysis.
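A minimal sketch of the categorization step above: split a night's heart rate series into three ranges and compute the fraction of time spent in each as candidate features. The cut-points are hypothetical, not those derived from the 280-subject data.

```python
# Fraction of a night's readings in each of three heart-rate ranges.
# Cut-points are hypothetical placeholders.
import numpy as np

def hr_range_features(hr_bpm: np.ndarray, low: float = 55, high: float = 70):
    below = np.mean(hr_bpm < low)                     # deep-rest fraction
    mid = np.mean((hr_bpm >= low) & (hr_bpm <= high))
    above = np.mean(hr_bpm > high)                    # elevated fraction
    return {"low": below, "mid": mid, "high": above}

rng = np.random.default_rng(0)
night = rng.normal(62, 8, size=8 * 60)                # one reading/min, 8 hours
print(hr_range_features(night))
```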