Search results for: web data regions
7089 Color Image Enhancement Using Multiscale Retinex and Image Fusion Techniques
Authors: Chang-Hsing Lee, Cheng-Chang Lien, Chin-Chuan Han
Abstract:
In this paper, an edge-strength guided multiscale retinex (EGMSR) approach is proposed for color image contrast enhancement. In EGMSR, the pixel-dependent weight associated with each pixel in the single-scale retinex output image is computed according to the edge strength around that pixel, in order to prevent over-enhancement of the noise contained in smooth dark or bright regions. Further, by fusing the enhanced results of EGMSR and adaptive multiscale retinex (AMSR), we obtain a natural fused image with high contrast and proper tonal rendition. Experimental results on several low-contrast images show that the proposed approach produces natural and appealing enhanced images.
Keywords: Image Enhancement, Multiscale Retinex, Image Fusion.
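The idea of weighting single-scale retinex output by local edge strength can be illustrated with a minimal 1D sketch. This is not the paper's algorithm: the box-blur surround, gradient-based edge measure, and function names below are all illustrative assumptions standing in for the Gaussian surround and edge detector an actual EGMSR implementation would use.

```python
import math

def box_blur(signal, radius):
    """Moving-average surround (crude stand-in for the Gaussian surround in retinex)."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def edge_strength(signal):
    """Absolute central-difference gradient as a crude edge-strength measure."""
    n = len(signal)
    return [abs(signal[min(i + 1, n - 1)] - signal[max(i - 1, 0)]) / 2.0
            for i in range(n)]

def edge_guided_ssr(signal, radius=2):
    """Single-scale retinex output, attenuated where edge strength is low,
    so noise in smooth dark/bright regions is not over-enhanced."""
    surround = box_blur(signal, radius)
    edges = edge_strength(signal)
    peak = max(edges) or 1.0
    result = []
    for x, s, e in zip(signal, surround, edges):
        ssr = math.log(x + 1.0) - math.log(s + 1.0)  # classic retinex log-ratio
        weight = e / peak                            # pixel-dependent weight in [0, 1]
        result.append(weight * ssr)
    return result

# Smooth dark region (where noise would live) followed by a strong edge:
signal = [10, 10, 10, 10, 200, 200, 200, 200]
out = edge_guided_ssr(signal)
```

In the flat region the weight collapses to zero, so nothing is amplified there, while samples at the step keep their full retinex response.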
7088 Beam and Diffuse Solar Energy in Zarqa City
Authors: Ali M. Jawarneh
Abstract:
Beam and diffuse radiation data are extracted analytically from previously measured data on a horizontal surface in Zarqa city. Moreover, radiation data on tilted surfaces with different slopes have been derived and analyzed. These data consist of the beam, diffuse, and ground-reflected radiation contributions. Hourly radiation on a horizontal surface reaches its highest values in June; the values decay as the slope increases, with the sharpest decrease for a vertical surface. Beam radiation on a horizontal surface has the highest values compared to diffuse radiation for all days of June. The total daily radiation on a tilted surface decreases with slope. Beam radiation also decays with slope, especially for a vertical surface, while diffuse radiation decreases only slightly with slope, again with a sharp decrease for a vertical surface. Ground-reflected radiation grows with slope, especially for a vertical surface. It is clear that in June the highest harvesting of solar energy occurs for a horizontal surface, and the harvest decreases as the slope increases.
Keywords: Beam and Diffuse Radiation, Zarqa City
7087 An Application of the Data Mining Methods with Decision Rule
Authors: Xun Ge, Jianhua Gong
Abstract:
Rankings of the output of China's main agricultural commodities in the world for 1978, 1980, 1990, 2000, 2006, 2007 and 2008 have been released in the United Nations FAO Database. Unfortunately, the world ranking of Chinese cotton lint output for 2008 is missing. This paper uses sequential data mining methods with decision rules to fill this gap. This new data mining method can help further improve the United Nations FAO Database.
Keywords: Ranking, output of the main agricultural commodity, gross domestic product, decision table, information system, data mining, decision rule
7086 LiDAR Based Real Time Multiple Vehicle Detection and Tracking
Authors: Zhongzhen Luo, Saeid Habibi, Martin v. Mohrenschildt
Abstract:
Self-driving vehicles require a high level of situational awareness in order to maneuver safely in real-world driving conditions. This paper presents a LiDAR-based real-time perception system that processes raw sensor data for multiple-target detection and tracking in dynamic environments. The proposed algorithm is nonparametric and deterministic: no assumptions or a priori knowledge about the input data are needed, and no initialization is required. Additionally, the method works directly on the three-dimensional data generated by the LiDAR, without sacrificing the rich information contained in the 3D domain. A fast and efficient real-time clustering algorithm based on a radially bounded nearest neighbor (RBNN) search is applied, while the Hungarian algorithm and adaptive Kalman filtering are used for data association and tracking. The proposed algorithm runs in real time with an average run time of 70 ms per frame.
Keywords: LiDAR, real-time system, clustering, tracking, data association.
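The radially bounded nearest neighbor idea mentioned above can be sketched in a few lines: any two points closer than a radius belong to the same cluster, with no cluster count or initialization required. This brute-force toy (2D points, O(n^2) distance checks) is an assumption-laden illustration, not the paper's optimized 3D implementation.

```python
def rbnn_cluster(points, radius):
    """RBNN-style clustering sketch: points within `radius` of each other
    are merged into one cluster; clusters emerge without initialization."""
    labels = [None] * len(points)
    next_label = 0
    for i, p in enumerate(points):
        if labels[i] is None:          # start a new cluster at an unlabeled point
            labels[i] = next_label
            next_label += 1
        for j in range(i + 1, len(points)):
            q = points[j]
            dist = sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
            if dist <= radius:
                if labels[j] is None:
                    labels[j] = labels[i]
                elif labels[j] != labels[i]:
                    old = labels[j]    # merge the two touching clusters
                    labels = [labels[i] if l == old else l for l in labels]
    return labels

# Two well-separated groups of 2D points (e.g., projected LiDAR returns):
pts = [(0, 0), (0.5, 0), (0.4, 0.3), (10, 10), (10.4, 10.2)]
labels = rbnn_cluster(pts, radius=1.0)
```

The first three points collapse into one cluster and the distant pair into another, with the cluster count discovered from the data.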
7085 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network
Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang
Abstract:
‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics applying a range of mathematical, statistical and rule-based approaches can reveal important information on demand from data provided at second, minute or hourly intervals. This study aims to combine these two concepts to improve the current water end-use disaggregation problem by applying a wide range of advanced pattern recognition techniques to analyze concurrent high-resolution water-energy consumption data. The obtained results show that the recognition accuracies of all end uses increased significantly, especially for mechanised categories, including the clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
Keywords: Deep learning network, smart metering, water end use, water-energy data.
7084 Watermark Bit Rate in Diverse Signal Domains
Authors: Nedeljko Cvejic, Tapio Sepp
Abstract:
A study of the obtainable watermark data rate for information hiding algorithms is presented in this paper. As the perceptual entropy of wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results showed that transform-domain watermark embedding considerably outperforms embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high-data-rate information hiding.
Keywords: Digital watermarking, information hiding, audio watermarking, watermark data rate.
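To put the quoted per-sample figures in perspective, they can be converted into gross bit rates. The 44.1 kHz sample rate below is an assumption (the abstract does not state one); the calculation only illustrates how a bits-per-sample figure scales to kbit/s.

```python
# Convert the quoted 4-5 bits-per-sample perceptual-entropy range into
# gross per-second rates, assuming CD-quality 44.1 kHz sampling (assumption).
sample_rate_hz = 44_100
bits_per_sample_low, bits_per_sample_high = 4, 5

capacity_low_kbps = sample_rate_hz * bits_per_sample_low / 1000
capacity_high_kbps = sample_rate_hz * bits_per_sample_high / 1000
```

At that sample rate the range works out to roughly 176-221 kbit/s of perceptually relevant signal content per channel.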
7083 Concurrent Access to Complex Entities
Authors: Cosmin Rablou
Abstract:
In this paper we present a way of controlling concurrent access to data in a distributed application using the Pessimistic Offline Lock design pattern. In our case, the application processes a complex entity, which contains various other entities (objects) in a hierarchical structure. We show how the complex entity and the contained entities must be locked in order to control concurrent access to the data.
Keywords: Object-oriented programming, Pessimistic Lock, Design pattern, Concurrent access to data, Processing complex entities
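A minimal sketch of the Pessimistic Offline Lock pattern applied to a hierarchical entity might look as follows. The lock-manager API, entity ids, and the all-or-nothing acquisition over the whole hierarchy are illustrative assumptions, not the paper's design; locking in a fixed global order is one common way to avoid deadlock.

```python
import threading

class PessimisticLockManager:
    """Pessimistic Offline Lock sketch: a session must hold an exclusive
    lock on an entity id before editing it. For a complex entity we lock
    the root and every contained entity together, in a fixed sorted order."""
    def __init__(self):
        self._owners = {}              # entity id -> owning session id
        self._mutex = threading.Lock()

    def acquire(self, session, entity_ids):
        with self._mutex:
            ids = sorted(entity_ids)   # fixed global order avoids deadlock
            if any(self._owners.get(e, session) != session for e in ids):
                return False           # some part is held by another session
            for e in ids:
                self._owners[e] = session
            return True

    def release(self, session, entity_ids):
        with self._mutex:
            for e in entity_ids:
                if self._owners.get(e) == session:
                    del self._owners[e]

# A complex entity "order-1" with two contained entities (hypothetical ids):
mgr = PessimisticLockManager()
tree = ["order-1", "order-1/line-1", "order-1/line-2"]
ok_a = mgr.acquire("session-A", tree)                 # whole hierarchy locked
ok_b = mgr.acquire("session-B", ["order-1/line-1"])   # refused: child is held
```

Locking every contained entity along with the root is what prevents a second session from editing a child while the parent aggregate is in use.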
7082 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning
Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul
Abstract:
In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes, or atoms, to avoid the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input, rather than extracting features, to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which are then used to improve the classification. The classification performance was evaluated on ECG data under two different preprocessing environments. In the first category, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power-line noise. In the second category, the data are further filtered using a fast moving-average smoother. The experimental results on the QT database confirm that the proposed algorithm achieves a classification accuracy of 92%.
Keywords: Electrocardiogram, dictionary learning, sparse coding, classification.
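The core of sparse-coding classification, choosing the class whose dictionary best explains the signal, can be shown in a heavily simplified 1-sparse form: score each class by the best-correlating atom in its dictionary. Real systems learn the atoms (e.g., via K-SVD) and use multi-atom sparse codes; the toy dictionaries and class names here are pure illustration.

```python
def normalize(v):
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v] if n else v

def best_match(signal, atoms):
    """Largest absolute inner product between the signal and any atom."""
    s = normalize(signal)
    return max(abs(sum(a * b for a, b in zip(s, normalize(atom))))
               for atom in atoms)

def classify(signal, dictionaries):
    """1-sparse special case of sparse-coding classification: pick the
    class whose dictionary contains the atom that explains the signal best."""
    return max(dictionaries, key=lambda c: best_match(signal, dictionaries[c]))

# Toy class dictionaries: oscillating atoms vs. a constant atom.
dictionaries = {
    "spiky": [[0, 1, 0, -1], [1, 0, -1, 0]],
    "flat":  [[1, 1, 1, 1]],
}
label = classify([2, 2.1, 1.9, 2], dictionaries)  # near-constant input
```

A near-constant signal correlates almost perfectly with the flat atom, while an oscillating one is captured by the spiky dictionary.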
7081 A Remote Sensing Approach to Calculate Population Using Roads Network Data in Lebanon
Authors: Kamel Allaw, Jocelyne Adjizian Gerard, Makram Chehayeb, Nada Badaro Saliba
Abstract:
In developing countries such as Lebanon, demographic data are hardly available due to the absence of a mechanized population registration system. The aim of this study is to evaluate, using only remote sensing data, the correlations between population and the characteristics of the road network (length of primary roads, length of secondary roads, total length of roads, density and percentage of roads, and the number of intersections). In order to find the influence of the different factors on the demographic data, we studied the degree of correlation between each factor and the population. The results of this study show a strong correlation between population and both the density of roads and the number of intersections.
Keywords: Population, road network, statistical correlations, remote sensing.
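The "degree of correlation" between each road-network factor and population is typically measured with the Pearson coefficient, sketched here from first principles. The district figures are invented for illustration; only the formula is standard.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-district data: road density vs. population.
population   = [12_000, 45_000, 30_000, 90_000]
road_density = [1.1, 3.9, 2.7, 8.2]   # km of road per km^2 (made up)
r = pearson(road_density, population)
```

A value of r near 1 for a factor such as road density or intersection count is exactly the kind of "strong correlation" the abstract reports.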
7080 Risk-Management by Numerical Pattern Analysis in Data-Mining
Authors: M. Kargar, R. Mirmiran, F. Fartash, T. Saderi
Abstract:
In this paper a new method is suggested for risk management using numerical patterns in data mining. These patterns are designed using probability rules in decision trees and are constructed to be valid, novel, useful and understandable. Considering a set of functions, the system reaches a good pattern or better objectives. The patterns are analyzed through the produced matrices and some results are pointed out. Using the suggested method, the direction of the functionality route in a system can be controlled and the best planning for specific objectives can be carried out.
Keywords: Analysis, Data-mining, Pattern, Risk Management.
7079 Wind Speed Data Analysis using Wavelet Transform
Authors: S. Avdakovic, A. Lukac, A. Nuhanovic, M. Music
Abstract:
Renewable energy systems are becoming a topic of great interest and investment in the world. In recent years, wind power generation has experienced very fast development worldwide. Planning and successfully implementing good wind power plant projects requires wind potential measurements. In these projects, of great importance are the effective choice of the micro-location for wind potential measurements, the installation of the measurement station with appropriate measuring equipment, its maintenance, and the analysis of the acquired wind potential data. In this paper, the wavelet transform is applied to analyze wind speed data, in order to gain insight into the characteristics of the wind and to select suitable locations that could be the subject of wind farm construction. This approach shows that the wavelet transform can be a useful tool in the investigation of wind potential.
Keywords: Wind potential, Wind speed data, Wavelet transform.
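One level of the simplest wavelet transform, the Haar transform, already separates a wind-speed series into a slow trend (approximation) and fast fluctuations (detail), which is the kind of decomposition such an analysis rests on. The abstract does not name the wavelet used; Haar and the sample values below are illustrative assumptions.

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: pairwise
    approximation (trend) and detail (fluctuation) coefficients."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# Hypothetical hourly wind-speed samples (m/s): slow trend plus gustiness.
wind = [4.0, 4.2, 5.1, 4.9, 6.0, 6.2, 5.0, 4.8]
approx, detail = haar_dwt(wind)
```

Because the Haar basis is orthonormal, the signal energy is exactly preserved across the approximation and detail bands, which makes the split easy to interpret physically.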
7078 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations
Authors: Ramon Santana
Abstract:
The use of biometric identifiers in information security, access control to resources, and authentication in ATMs and banking, among others, raises great concern about the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system, six of which allow obtaining the minutiae template in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed, but vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of the data using two levels of security. The first is the data transformation level, which generates data invariant to rotation and translation; this transformation is irreversible. The second is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model aims to mitigate known vulnerabilities of the previously proposed models, basing its security on the infeasibility of polynomial reconstruction.
Keywords: Fingerprint, template protection, bio-cryptography, minutiae protection.
7077 SIMGraph: Simplifying Contig Graph to Improve de Novo Genome Assembly Using Next-generation Sequencing Data
Authors: Chien-Ju Li, Chun-Hui Yu, Chi-Chuan Hwang, Tsunglin Liu, Darby Tien-Hao Chang
Abstract:
De novo genome assembly is always fragmented. Fragmentation is more serious with the popular next-generation sequencing (NGS) data because NGS sequences are shorter than traditional Sanger sequences. As the data throughput of NGS is high, the fragmentation in assemblies is usually not the result of missing data. On the contrary, the assembled sequences, called contigs, are often connected to more than one other contig in a complicated manner, leading to the fragmentation. False connections in such complicated connections between contigs, collectively called a contig graph, are inevitable because of repeats and sequencing/assembly errors. Simplifying a contig graph by removing false connections directly improves genome assembly. In this work, we have developed a tool, SIMGraph, to resolve ambiguous connections between contigs using NGS data. Applying SIMGraph to the assembly of a fungus and a fish genome, we resolved 27.6% and 60.3% of ambiguous contig connections, respectively. These results can reduce the experimental effort in resolving contig connections.
Keywords: Contig graph, NGS, de novo assembly, scaffold.
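An "ambiguous connection" in a contig graph is simply a contig that connects to more than one other contig. The sketch below only identifies such contigs in a toy graph (SIMGraph goes further and resolves them using read data); contig names are invented.

```python
def ambiguous_connections(edges):
    """Return contigs whose outgoing connections are ambiguous, i.e. those
    connected to more than one other contig, mapped to their neighbors."""
    out = {}
    for a, b in edges:
        out.setdefault(a, set()).add(b)
    return {a: sorted(bs) for a, bs in out.items() if len(bs) > 1}

# Tiny contig graph: c1 -> {c2, c3} is ambiguous, c2 -> c4 is not.
edges = [("c1", "c2"), ("c1", "c3"), ("c2", "c4")]
amb = ambiguous_connections(edges)
```

Each entry of `amb` is a branch point where the assembly fragments; resolving which neighbor is the true continuation is what removes a false connection and joins contigs.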
7076 Acute Coronary Syndrome Prediction Using Data Mining Techniques: An Application
Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil
Abstract:
In this paper we use data mining techniques to investigate factors that contribute significantly to enhancing the risk of acute coronary syndrome. We assume that the dependent variable is the diagnosis, with dichotomous values indicating the presence or absence of disease, and apply binary logistic regression to the factors affecting this variable. The data set has been taken from two different cardiac hospitals in Karachi, Pakistan. We have sixteen variables in total, of which one is the assumed dependent variable and the other fifteen are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques such as principal component analysis are applied. Based on the results of the data reduction, we have considered only fourteen of the sixteen factors.
Keywords: Acute coronary syndrome (ACS), binary logistic regression analysis, myocardial ischemia (MI), principal component analysis, unstable angina (UA).
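Binary logistic regression, the method named above, fits a sigmoid probability of the dichotomous diagnosis. A from-scratch gradient-descent sketch on a single hypothetical risk factor (the data, learning rate, and step count are all illustrative, and real studies use many factors and a proper solver):

```python
import math

def fit_logistic(xs, ys, lr=0.5, steps=2000):
    """Binary logistic regression by gradient descent: models the
    probability of disease presence (y = 1) from one risk factor x."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid prediction
            gw += (p - y) * x / n                     # gradient of log-loss
            gb += (p - y) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# Hypothetical scaled risk scores: higher values co-occur with diagnosis 1.
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0,   0,   0,   1,   1,   1]
w, b = fit_logistic(xs, ys)
```

After fitting, the model assigns a low disease probability to low-risk values and a high probability to high-risk ones, which is exactly the presence/absence prediction the study performs with its fourteen retained factors.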
7075 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs produced by a DMU in one period to be used as inputs in a future period; such outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of these input, output or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
Keywords: Data envelopment analysis, Dynamic DEA, Piecewise linear inputs, Piecewise linear outputs.
7074 Data Mining Determination of Sunlight Average Input for Solar Power Plant
Authors: Fl. Loury, P. Sablonière, C. Lamoureux, G. Magnier, Th. Gutierrez
Abstract:
A method is proposed to extract faithful representative patterns from a data set of observations suffering from non-negligible fluctuations. Supposing the time interval between measurements to be extremely small compared to the observation time, it consists in first defining a subset of intermediate time intervals characterizing coherent behavior. Projecting the data onto these intervals gives a set of curves, out of which an ideally "perfect" one is constructed by taking their supremum. Comparison with the average real curve in the corresponding interval then gives an efficiency parameter expressing the degradation caused by the fluctuations. The method is applied to sunlight data collected at a specific location, where the ideal sunlight is that resulting from direct exposure at the location's latitude over the year, and the efficiency reflects the action of meteorological parameters, mainly cloudiness, at different periods of the year. The extracted information already provides interesting elements of decision, before being used for the analysis of plant control.
Keywords: Base Input Reconstruction, Data Mining, Efficiency Factor, Information Pattern Operator.
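The sup-limit construction above reduces to a few lines: the pointwise maximum over a family of projected curves defines the "perfect" curve, and comparing each real curve's mean against it yields an efficiency. The hourly readings are invented, and using mean ratios as the efficiency is a simplifying assumption about the comparison step.

```python
def efficiency(curves):
    """Pointwise sup over a set of curves gives the 'ideal' curve; each
    curve's efficiency is its mean divided by the ideal curve's mean,
    expressing the degradation due to fluctuations (e.g., cloudiness)."""
    ideal = [max(vals) for vals in zip(*curves)]   # sup limit, pointwise
    ideal_mean = sum(ideal) / len(ideal)
    return [(sum(c) / len(c)) / ideal_mean for c in curves]

# Three days of hypothetical hourly sunlight readings (same hours):
days = [
    [0, 200, 600, 800, 600, 200],   # clear day
    [0, 150, 400, 500, 450, 100],   # partly cloudy
    [0,  50, 100, 150, 120,  40],   # overcast
]
eff = efficiency(days)
```

The clear day, coinciding with the sup everywhere, scores 1.0, and cloudier days score progressively lower, giving the decision-relevant degradation figure the abstract describes.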
7073 Social Organization of Kazakhstani Business under Conditions of Customs Union and Common Free Market Zone: Empirical Study Practice
Authors: Zh. Nurbekova, Z. Zhanazarova, A. Beissenova
Abstract:
This article analyzes the results of sociological research carried out by the authors to study the opinions of representatives of small, medium and big business on the formation of the Customs Union and the Common Free Market Zone with the participation of Kazakhstan, Russia and Belarus. It is forecast that companies and their branches will interpenetrate, registering and moving their businesses to regions with more beneficial conditions. Respondents say that Kazakhstan offers a more profitable geo-strategic operating environment for business and lower taxes, and that Russia, using this opportunity, will create new conditions for expansion into other countries of Central Asia and China. The opinions of the questionnaire participants and the expert poll differ in their estimation of the value of these two integration mechanisms: on the one hand, market segments expand, but on the other hand, there is a loss of exclusive influence in certain fields of activity.
Keywords: Customs Union, Kazakhstan, sociological research.
7072 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems
Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas
Abstract:
This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly controlling driver work have become a problem. The number of vehicles passing a given point per unit of time can be evaluated in some situations. The collected data are mainly used to establish new trips. The flow of data is more complex in urban areas, where the movement of freight is reported in detail, including information at street level. When traffic density is extremely high, as in congestion cases, and the traffic speed is very low, data transmission reaches its peak. Different data sets are generated depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last includes different modes, in particular railways. When freight delivery is switched from one of these network types to another, more data may be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers by including an assessment of the multi-component infrastructure needed for freight delivery according to the network type. Such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting.
The results obtained show that the proposed methodological approach can support management and decision-making processes with functionality incorporating networking specifics, helping to minimize overloads in data reporting.
Keywords: Transportation networks, freight delivery, data flow, monitoring, e-services.
7071 Inefficiency of Data Storing in Physical Memory
Authors: Kamaruddin Malik Mohamad, Sapiee Haji Jamel, Mustafa Mat Deris
Abstract:
Memory forensics is important in digital investigation. The forensics is based on the data stored in physical memory, which involves memory management and processing time. However, current forensic tools do not consider efficiency in terms of storage management or processing time. This paper shows the high redundancy of data found in physical memory, which causes inefficiency in processing time and memory management. The experiment is done using the Borland C compiler on Windows XP with 512 MB of physical memory.
Keywords: Digital Evidence, Memory Forensics.
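The redundancy the paper observes can be quantified crudely as the fraction of bytes in a buffer that repeat an already-seen value. The zero-padded buffer below is a synthetic stand-in for a memory dump, not the paper's data.

```python
from collections import Counter

def redundancy_ratio(data):
    """Fraction of bytes that duplicate a byte value seen elsewhere in the
    buffer: a crude indicator of redundancy in a physical-memory dump."""
    counts = Counter(data)
    duplicates = sum(c - 1 for c in counts.values())
    return duplicates / len(data)

# A dump-like buffer dominated by zero padding (common in memory images):
buf = bytes([0] * 90 + list(range(10)))
ratio = redundancy_ratio(buf)
```

A ratio near 1 means most of the buffer carries no new byte values, which is why naive storage and scanning of raw dumps wastes both space and processing time.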
7070 Development of an Avionics System for Flight Data Collection of an UAV Helicopter
Authors: Nikhil Ramaswamy, S. N. Omkar, Kashyap H. Nathwani, Anil M. Vanjare
Abstract:
In the present work, the development of an avionics system for flight data collection of a Raptor 30 V2 is carried out. For the data acquisition, both on-ground and onboard avionics systems are developed for testing a small-scale Unmanned Aerial Vehicle (UAV) helicopter. The onboard avionics record the helicopter state outputs, namely accelerations, angular rates and Euler angles, in real time, while the on-ground avionics system records the inputs given to the radio-controlled helicopter through a transmitter, in real time. The avionics systems are designed and developed with low weight, small size, anti-vibration, low power consumption and easy interfacing in mind. To mitigate the medium-frequency vibrations present on the UAV helicopter during flight, a damper is designed and its performance is evaluated. A number of flight tests are carried out, and the data obtained are then analyzed for accuracy and repeatability, from which conclusions are inferred.
Keywords: Data collection, Flight Testing, Onground and Onboard Avionics, UAV helicopter
7069 Using Low Permeability Sand-FADR Mixture Membrane for Isolated Swelling Soil
Authors: Mohie Eldin Mohamed Afifiy Elmashad
Abstract:
Desert regions around the Nile valley in Upper Egypt contain a great extent of swelling soil. There are many common procedures for treating swelling soils before construction, such as pre-swelling, load balancing or soil replacement. One of the major factors affecting the aggressiveness of a swelling soil is the direction of water infiltration within it. In this paper, a physical model was set up to measure the effect of water on swelling soil with a replacement layer of fatty acid distillation residuals (FADR) mixed with sand, forming a thick sand-FADR mixture that prevents water pathways from reaching the swelling soil. A testing program was conducted on artificial samples with different sand-to-FADR content ratios (4%, 6% and 9%) to find the optimum value fulfilling an impermeable replacement. The tests show that a FADR content of 9% is sufficient to produce an impermeable replacement.
Keywords: Swelling soil, FADR, soil improvement, permeability
7068 The Research of Fuzzy Classification Rules Applied to CRM
Authors: Chien-Hua Wang, Meng-Ying Chou, Chin-Tzong Pang
Abstract:
In an era of great competition, understanding and satisfying customers' requirements are critical tasks for a company to make a profit. Customer relationship management (CRM) has thus become an important business issue. With the help of data mining techniques, managers can explore and analyze large quantities of data to discover meaningful patterns and rules. Among all methods, the well-known association rule is the most commonly used. This paper is based on the Apriori algorithm and uses genetic algorithms combined with a data mining method to discover fuzzy classification rules. The mined results can be applied in CRM to help decision makers make correct business decisions for marketing strategies.
Keywords: Customer relationship management (CRM), Data mining, Apriori algorithm, Genetic algorithm, Fuzzy classification rules.
7067 Equilibrium Modeling of Carbon Dioxide Adsorption on Zeolites
Authors: Alireza Behvandi, Somayeh Tourani
Abstract:
High-pressure adsorption of carbon dioxide on zeolite 13X was investigated in the pressure range (0 to 4) MPa at temperatures of 298, 308 and 323 K. The data fitting is accomplished with the Toth, UNILAN, Dubinin-Astakhov and virial adsorption models, which are generally used for microporous adsorbents such as zeolites. Comparison with experimental data from the literature indicated that the virial model gives the best results. This may be partly attributed to the flexibility of the virial model, which can accommodate as many constants as the data warrant.
Keywords: adsorption models, zeolite, carbon dioxide
7066 Application of Java-based Pointcuts in Aspect Oriented Programming (AOP) for Data Race Detection
Authors: Sadaf Khalid, Fahim Arif
Abstract:
The wide applicability of concurrent programming practices in developing various software applications leads to different concurrency errors, among which data races are the most important. Java provides strong support for concurrent programming through its concurrency packages. Aspect-oriented programming (AOP) is a modern programming paradigm facilitating the runtime interception of events of interest, and it can be effectively used to handle concurrency problems. AspectJ, an aspect-oriented extension to Java, facilitates the application of AOP concepts to data race detection. Volatile variables are usually considered thread-safe, but they can become candidates for data races if non-atomic operations are performed on them concurrently. Various data race detection algorithms have been proposed in the past, but this issue of volatility and atomicity is still unaddressed. The aim of this research is to propose conditions for data race detection in Java programs at volatile fields, by taking into account the support for atomicity in the Java concurrency packages and making use of pointcuts. Two simple test programs demonstrate the results of the research. The results are verified on two different Java Development Kits (JDKs) for the purpose of comparison.
Keywords: Aspect Bench Compiler (abc), Aspect Oriented Programming (AOP), AspectJ, Aspects, Concurrency packages, Concurrent programming, Cross-cutting Concerns, Data race, Eclipse, Java, Java Development Kits (JDKs), Pointcuts
7065 Laser Keratoplasty in Human Eye Considering the Fluid Aqueous Humor and Vitreous Humor Fluid Flow
Authors: Dara Singh, Keikhosrow Firouzbakhsh, Mohammad Taghi Ahmadian
Abstract:
In this paper, conventional laser keratoplasty surgeries on the human eye are studied. For this purpose, a validated 3D finite-volume model of the human eye, which also accounts for fluid flow, is introduced. The discretized domain of the human eye incorporates a bio-heat transfer equation coupled with a Boussinesq equation. Both continuous and pulsed lasers are modeled and the results are compared. Moreover, two conventional surgical positions, upright and recumbent, are compared for these laser therapies. The simulation results show that in these conventional surgeries the temperature rises above critical values at the laser insertion areas. However, due to the short duration and localized nature of the exposure, the potential damage is restricted to very small regions and can be ignored. The conclusion is that present-day lasers are acceptably safe for the human eye.
Keywords: Eye, heat-transfer, keratoplasty laser, surgery.
7064 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency
Authors: Rania Alshikhe, Vinita Jindal
Abstract:
Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis, through installed global positioning system (GPS)-enabled devices can be utilized. It offers an unprecedented opportunity to trace the movements of vehicles in fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency using the proposed least square approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve the traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The obtained results show that the proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.
Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE.
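The abstract does not give the STEM formula, so the sketch below is not the authors' measure; it illustrates a common baseline for per-segment travel efficiency from GPS trip records: mean observed speed divided by free-flow speed. The record layout and the `free_flow_kmh` table are assumptions for illustration.

```python
# Illustrative travel-efficiency ratio for road segments, computed from
# GPS trajectory records. NOT the paper's STEM (not specified in the
# abstract); a common baseline: observed speed / free-flow speed.
from collections import defaultdict

def segment_efficiency(trips, free_flow_kmh):
    """trips: iterable of (segment_id, distance_km, travel_time_h) records."""
    dist = defaultdict(float)
    time = defaultdict(float)
    for seg, d, t in trips:
        dist[seg] += d
        time[seg] += t
    # Efficiency = mean observed speed over free-flow speed, capped at 1.0
    return {seg: min(1.0, (dist[seg] / time[seg]) / free_flow_kmh[seg])
            for seg in dist if time[seg] > 0}

trips = [("A-B", 2.0, 0.10), ("A-B", 2.0, 0.08), ("B-C", 1.0, 0.05)]
free_flow = {"A-B": 50.0, "B-C": 40.0}
print(segment_efficiency(trips, free_flow))
```

Segments with a ratio well below 1.0 are candidates for the kind of optimization the paper attributes to congested links.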
7063 Flame Stability and Structure of Liquefied Petroleum Gas-Fired Inverse Diffusion Flame with Hydrogen Enrichment
Authors: J. Miao, C. W. Leung, C. S. Cheung, R. C. K. Leung
Abstract:
The present project was conducted with a circumferential-fuel-jets inverse diffusion flame (CIDF) burner burning liquefied petroleum gas (LPG) enriched with 50% hydrogen fuel (H2). The range of stable operation of the CIDF burner in terms of Reynolds number (from laminar to turbulent flow regimes), equivalence ratio, and fuel jet velocity of the 50% H2-LPG fuel mixture was identified. Experiments were also carried out to investigate the flame structures of the LPG flame and the H2-enriched LPG flame. Experimental results obtained from these two flames were compared to fully explore the influence of hydrogen addition on flame stability. Flame heights obtained by burning these two kinds of fuels at various equivalence ratios were compared and correlated with the Global Momentum Ratio (GMR).
Keywords: Flame stability, hydrogen enriched LPG, inverse diffusion flame.
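The stability range above is parameterized by Reynolds number and equivalence ratio; their standard definitions (generic definitions, not the paper's burner-specific correlations) are:

```latex
\mathrm{Re} = \frac{\rho\, u\, D}{\mu}, \qquad
\phi = \frac{\left(\dot{m}_{fuel}/\dot{m}_{air}\right)_{actual}}
            {\left(\dot{m}_{fuel}/\dot{m}_{air}\right)_{stoich}}
```

where $u$ is the jet velocity, $D$ the jet diameter, $\mu$ the dynamic viscosity, and $\dot{m}$ the mass flow rates; $\phi = 1$ corresponds to stoichiometric combustion.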
7062 Adaptive Gaussian Mixture Model for Skin Color Segmentation
Authors: Reza Hassanpour, Asadollah Shahbahrami, Stephan Wong
Abstract:
Skin color based tracking techniques often assume a static skin color model obtained either from an offline set of library images or from the first few frames of a video stream. These models can perform poorly in the presence of changing lighting or imaging conditions. We propose an adaptive skin color model based on the Gaussian mixture model to handle changing conditions. Initial estimates of the number and weights of skin color clusters are obtained using a modified form of the general expectation-maximization (EM) algorithm. The model adapts to changes in imaging conditions and refines its parameters dynamically using spatial and temporal constraints. Experimental results show that the method can be used to effectively track hand and face regions.
Keywords: Face detection, Segmentation, Tracking, Gaussian Mixture Model, Adaptation.
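The EM fitting step at the core of such a model can be sketched as follows. This is a minimal, generic two-component 1D Gaussian mixture fit (e.g., over hue values), not the authors' modified algorithm or their adaptation constraints; the synthetic data stand in for skin vs. background samples.

```python
# Minimal EM sketch for a two-component 1D Gaussian mixture, as might
# model skin vs. background color values. Not the paper's implementation.
import math, random

def gaussian(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_gmm(data, iters=50):
    mu = [min(data), max(data)]   # crude initialization at the extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        resp = []
        for x in data:
            p = [w[k] * gaussian(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(1e-3, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, data)) / nk)
    return w, mu, var

random.seed(0)
data = [random.gauss(10, 1) for _ in range(200)] + \
       [random.gauss(30, 2) for _ in range(200)]
w, mu, var = fit_gmm(data)
print(sorted(round(m, 1) for m in mu))
```

An adaptive variant would rerun the M-step on new frames, letting the means and variances drift with the lighting, which is the behavior the abstract describes.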
7061 Actionable Rules: Issues and New Directions
Authors: Harleen Kaur
Abstract:
Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown, hidden, and interesting patterns from a huge amount of data stored in databases. Data mining is the stage of the KDD process that selects and applies a particular data mining algorithm to extract interesting and useful knowledge. Data mining methods are expected to find patterns in databases that are interesting according to some measures, so it is of vital importance to define good measures of interestingness that allow the system to discover only the useful patterns. Measures of interestingness are divided into objective and subjective measures. Objective measures are those that depend only on the structure of a pattern and can be quantified using statistical methods. Subjective measures, in contrast, depend on the subjectivity and understanding of the user who examines the patterns; they are further divided into actionable, unexpected, and novel. A key issue facing the data mining community is how to take action on the basis of discovered knowledge. For a pattern to be actionable, the user's subjectivity is captured by providing his or her background knowledge about the domain. Here, we consider the actionability of the discovered knowledge as a measure of interestingness and raise important issues which need to be addressed to discover actionable knowledge.
Keywords: Data Mining Community, Knowledge Discovery in Databases (KDD), Interestingness, Subjective Measures, Actionability.
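The objective measures mentioned above are statistical; for an association rule A → B, the standard examples are support, confidence, and lift. The sketch below is a generic illustration over a toy transaction database, not a computation from the paper.

```python
# Objective interestingness measures for an association rule A -> B,
# computed over a toy transaction database.
transactions = [
    {"bread", "milk"}, {"bread", "butter"}, {"bread", "milk", "butter"},
    {"milk"}, {"bread", "milk"},
]

def measures(a, b, db):
    """a, b: itemsets (as Python sets); db: list of transaction sets."""
    n = len(db)
    n_a = sum(1 for t in db if a <= t)        # transactions containing A
    n_b = sum(1 for t in db if b <= t)        # transactions containing B
    n_ab = sum(1 for t in db if (a | b) <= t) # transactions containing both
    support = n_ab / n
    confidence = n_ab / n_a
    lift = confidence / (n_b / n)             # lift > 1: positive correlation
    return support, confidence, lift

s, c, l = measures({"bread"}, {"milk"}, transactions)
print(round(s, 2), round(c, 2), round(l, 2))
```

Such measures are purely structural; the actionable, unexpected, and novel categories discussed in the abstract require user-supplied domain knowledge on top of them.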
7060 The Alliance for Grassland Renewal: A Model for Teaching Endophyte Technology
Authors: C. A. Roberts, J. G. Andrae, S. R. Smith, M. H. Poore, C. A. Young, D. W. Hancock, G. J. Pent
Abstract:
To the authors’ best knowledge, there are no published reports of effective methods for teaching fescue toxicosis and grass endophyte technology in the USA. To address this need, a group of university scientists, industry representatives, government agents, and livestock producers formed an organization called the Alliance for Grassland Renewal. One goal of the Alliance was to develop a teaching method that could be employed across all regions of the USA and all sectors of the agricultural community. The first step in developing this method was the identification of experts familiar with the science and management of fescue toxicosis. The second step was curriculum development. Experts wrote a curriculum that addressed all aspects of toxicosis and management, including toxicology, animal nutrition, pasture management, economics, and mycology. The curriculum was created for presentation in lectures, in laboratories, and in the field. The curriculum was unique in that it could be delivered across state lines, regardless of state-specific recommendations. It was also unique in that it was unanimously supported by private companies otherwise in competition with each other. The final step in developing this teaching method was formulating a delivery plan. All experts, from universities, industry, government, and production, volunteered to travel from any state in the USA, converge in one location, teach a 1-day workshop, and then travel to the next location. The results of this teaching method indicate widespread success. Since 2012, experts from across the USA have converged to teach Alliance workshops in Kansas, Oklahoma, Missouri, Kentucky, Georgia, South Carolina, North Carolina, and Virginia, with ongoing workshops in Arkansas and Tennessee. Data from post-workshop surveys indicate that the instruction has been effective, as at least 50% of the participants stated their intention to adopt the endophyte technology presented in these workshops.
The teaching method developed by the Alliance for Grassland Renewal has proved to be effective, and the Alliance continues to expand across the USA.
Keywords: Endophyte, Epichloë coenophiala, ergot alkaloids, fescue toxicosis, tall fescue.