Search results for: Perceptually Important Point identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20134

19924 Smart Unmanned Parking System Based on Radio Frequency Identification Technology

Authors: Yu Qin

Abstract:

In order to tackle the ever-growing shortage of parking space, this paper presents the design and implementation of a smart unmanned parking system based on RFID (radio frequency identification) technology and wireless communication technology. The system uses RFID technology to achieve the identification function (transmitted by a 2.4 GHz wireless module) and is equipped with an STM32L053 microcontroller as the main control chip. The system accomplishes automatic parking (entry/exit), charging, and other functions, and it also lets users easily query the information stored in the database over the Internet. Experimental tests show that the system features low power consumption and stable operation, among other benefits. It can effectively improve the automation level of parking lot management and has broad application prospects.
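
As a rough sketch of the charging step described above, the following Python fragment computes a parking fee from two RFID gate reads; the tag ID, tariff, and timestamps are hypothetical, and the real system runs this logic on its own controller and database.

```python
from datetime import datetime

RATE_PER_HOUR = 2.0   # hypothetical tariff

# Entry times recorded when a tag is read at the gate, keyed by RFID tag ID.
entries = {}

def on_tag_read(tag_id, now):
    """Register an entry on the first read; compute the fee on the second read."""
    if tag_id not in entries:
        entries[tag_id] = now                       # vehicle enters the lot
        return None
    hours = (now - entries.pop(tag_id)).total_seconds() / 3600
    return round(hours * RATE_PER_HOUR, 2)          # vehicle leaves; bill the owner

on_tag_read("04A2B9", datetime(2023, 1, 1, 9, 0))
print(on_tag_read("04A2B9", datetime(2023, 1, 1, 11, 30)))  # 2.5 h -> 5.0
```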

Keywords: RFID, embedded system, unmanned, parking management

Procedia PDF Downloads 298
19923 Addressing Supply Chain Data Risk with Data Security Assurance

Authors: Anna Fowler

Abstract:

When considering assets that may need protection, the mind turns to homes, cars, and investment funds; in most cases, those assets can be covered through security systems and insurance. Data is not the first thought that comes to mind as needing protection, even though data is at the core of most supply chain operations. It includes trade secrets, personally identifiable information (PII), and consumer data that can be used to enhance the overall experience. Data is a critical element of success for supply chains and should be one of the most critical areas to protect. In the supply chain industry, there are two major misconceptions about protecting data: (i) "We do not manage or store confidential/personally identifiable information (PII)." (ii) "Third-party vendor security can be relied on." These misconceptions can significantly derail organizational efforts to adequately protect data across environments. The first misconception is dangerous because it implies that the organization lacks proper data literacy: enterprise employees zero in on PII while neglecting trade secret theft and the complete breakdown of information sharing. The second, invoked to sidestep the first, forges an ideology that reliance on third-party vendor security will absolve the company of security risk; instead, third-party risk has grown over the last two years and is one of the major causes of data security breaches. It is important to understand that protecting data calls for a holistic approach, which does not mean simply purchasing a Data Loss Prevention (DLP) tool: a tool is not a solution. To protect supply chain data, start by providing data literacy training to all employees, and negotiate the security component of vendor contracts to require data literacy training for the individuals and teams that may access company data. It is also important to understand the origin of the data and its movement, including risk identification, to ensure that processes effectively incorporate data security principles, and to evaluate and select DLP solutions that address specific concerns and use cases in conjunction with data visibility. These approaches are part of a broader solutions framework called Data Security Assurance (DSA). The DSA framework looks at all of the processes across the supply chain, including their corresponding architecture and workflows, employee data literacy, governance and controls, integration between third- and fourth-party vendors, DLP as a solution concept, and policies related to data residency. Within cloud environments, this framework is crucial for the supply chain industry to avoid regulatory implications and third/fourth-party risk.

Keywords: security by design, data security architecture, cybersecurity framework, data security assurance

Procedia PDF Downloads 56
19922 Non-Homogeneous Layered Fiber Reinforced Concrete

Authors: Vitalijs Lusis, Andrejs Krasnikovs

Abstract:

Fiber reinforced concrete is an important material for load-bearing structural elements. Usually, fibers are homogeneously distributed in a concrete body with arbitrary spatial orientations; at the same time, in many situations, fiber concrete with oriented fibers is more optimal, and it is obviously possible to create structures with oriented short fibers in different ways. The present research is devoted to one such approach: fiber reinforced concrete prisms with dimensions of 100 mm × 100 mm × 400 mm, containing layers of non-homogeneously distributed fibers, were fabricated. Prisms with homogeneously dispersed fibers were produced as a reference. The prisms were tested under four-point bending conditions. During the tests, the vertical deflection at the center of every prism and the crack opening were measured (using linear displacement transducers in real time). Prediction results are discussed.

Keywords: fiber reinforced concrete, 4-point bending, steel fiber, construction engineering

Procedia PDF Downloads 339
19921 Contribution to the Decision-Making Process for Selecting the Suitable Maintenance Policy

Authors: Nasser Y. Mahamoud, Pierre Dehombreux, Hassan E. Robleh

Abstract:

Industrial companies may be confronted with questions about their choice of maintenance policy. This choice must be guided by a number of decision criteria or objectives related not only to their production or service activities but also to their level of development and their investment prospects. A decision-support methodology for choosing a maintenance policy (corrective, systematic or conditional preventive, predictive, opportunistic or not) is proposed to facilitate this choice, using the main categories of the most important decision criteria. The different steps of the methodology are illustrated with a theoretical case: identification of the different maintenance alternatives, determination of the structure of the most important categories of decision criteria, assessment of the different maintenance policies against the criteria using an ordinal preference relation, and finally ranking of the different maintenance policies.
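
To make the last two steps concrete, here is a minimal Python sketch that ranks candidate policies from ordinal grades using a Copeland-style pairwise count; the policies, criteria, and grades are illustrative assumptions, not the paper's data.

```python
# Ordinal grades (higher = preferred) of each policy on each criterion -- illustrative only.
grades = {
    "corrective":             {"cost": 3, "availability": 1, "safety": 1},
    "systematic preventive":  {"cost": 2, "availability": 2, "safety": 2},
    "conditional preventive": {"cost": 2, "availability": 3, "safety": 3},
    "predictive":             {"cost": 1, "availability": 3, "safety": 3},
}

def copeland_score(policy):
    """Count pairwise wins minus losses against every other policy, criterion by criterion."""
    score = 0
    for other, other_grades in grades.items():
        if other == policy:
            continue
        for criterion, grade in grades[policy].items():
            score += (grade > other_grades[criterion]) - (grade < other_grades[criterion])
    return score

ranking = sorted(grades, key=copeland_score, reverse=True)
print(ranking)   # best-ranked maintenance policy first
```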

Keywords: maintenance policy, decision criteria, decision-making process, AHP

Procedia PDF Downloads 303
19920 Comparison of Deep Convolutional Neural Networks Models for Plant Disease Identification

Authors: Megha Gupta, Nupur Prakash

Abstract:

Identification of plant diseases has been performed using machine learning and deep learning models on datasets containing images of healthy and diseased plant leaves. The current study evaluates several deep learning models based on convolutional neural network (CNN) architectures for the identification of plant diseases. For this purpose, the publicly available New Plant Diseases Dataset, an augmented version of the PlantVillage dataset available on the Kaggle platform and containing 87,900 images, has been used. The dataset contains images of 26 diseases of 14 different plants and images of 12 healthy plants. The CNN models selected for the study presented in this paper are AlexNet, ZFNet, VGGNet (four models), GoogLeNet, and ResNet (three models). The selected models are trained using PyTorch, an open-source machine learning library, on Google Colaboratory. A comparative study has been carried out to analyze the accuracy achieved by these models. The highest test accuracy and F1-score, 99.59% and 0.996 respectively, were achieved by GoogLeNet trained with mini-batch gradient descent with momentum.
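
A minimal PyTorch sketch of the kind of setup described (GoogLeNet fine-tuned with mini-batch SGD and momentum); the data path, batch size, and learning rate are assumptions, and the 38 output classes correspond to the 26 disease plus 12 healthy categories.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("new-plant-diseases/train", transform=tfm)  # assumed layout
loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

model = models.googlenet(weights="DEFAULT")      # ImageNet-pretrained GoogLeNet (torchvision >= 0.13)
model.fc = nn.Linear(model.fc.in_features, 38)   # 26 disease + 12 healthy classes

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)  # mini-batch SGD with momentum

model.train()
for images, labels in loader:                    # one epoch shown
    optimizer.zero_grad()
    out = model(images)
    logits = out.logits if hasattr(out, "logits") else out  # GoogLeNet returns aux outputs in train mode
    criterion(logits, labels).backward()
    optimizer.step()
```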

Keywords: comparative analysis, convolutional neural networks, deep learning, plant disease identification

Procedia PDF Downloads 157
19919 Case Studies of Eyewitness Misidentification during Criminal Investigations in Taiwan

Authors: Chih Hung Shih

Abstract:

Eyewitness identification is one of the most efficient sources of information for identifying suspects during a criminal investigation. However, eyewitness identification frequently proves inaccurate and plays a vital role in wrongful convictions. Most eyewitness misidentifications are made during the police investigation stage and are then accepted by juries. Four failed-investigation case studies in Taiwan are analyzed to demonstrate how misidentifications arise in the police investigation context. The results show several common threads among these cases: (1) investigators lacked knowledge about eyewitness memory, so they could not evaluate the validity of the eyewitnesses' accounts and identifications; (2) eyewitnesses were repeatedly asked to filter out suspects during the investigation and received investigation information that contaminated their memory; (3) one-to-one live identifications were used in most of the cases; (4) eyewitness identifications were used to support the investigators' hypotheses, and their power was exaggerated when they conformed with the investigation lines; (5) the eyewitnesses' confidence did not reflect the validity of their identifications but always influenced the investigators' belief in those identifications; (6) the investigators overestimated the power of the eyewitness identifications and ignored inconsistencies with other evidence. Recommendations are proposed for future academic research and police practice of eyewitness identification in Taiwan.

Keywords: criminal investigation, eyewitness identification, investigative bias, investigative failures

Procedia PDF Downloads 211
19918 Automatic Product Identification Based on Deep-Learning Theory in an Assembly Line

Authors: Fidel Lòpez Saca, Carlos Avilés-Cruz, Miguel Magos-Rivera, José Antonio Lara-Chávez

Abstract:

Automated object recognition and identification systems are widely used throughout the world, particularly in assembly lines, where they perform quality control and automatic part selection tasks. This article presents the design and implementation of an object recognition system in an assembly line. The proposed shape-color recognition system is based on deep learning theory in a specially designed convolutional network architecture. The methodology involves stages such as image capture, color filtering, location of object mass centers, determination of horizontal and vertical object boundaries, and object clipping. Once the objects are cut out, they are sent to a convolutional neural network, which automatically identifies the type of figure. The identification system works in real time. The implementation was done on a Raspberry Pi 3 system and on a Jetson Nano device. The proposal is used in an assembly course of the bachelor's degree in industrial engineering. The results include a study of the recognition efficiency and the processing time.
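
A condensed OpenCV sketch of the pre-CNN stages listed above (color filtering, mass-center location, boundary detection, clipping); the HSV range and the input file are placeholders standing in for the camera capture.

```python
import cv2

frame = cv2.imread("part.jpg")                           # stand-in for a camera capture
assert frame is not None, "placeholder image not found"
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (100, 80, 80), (130, 255, 255))  # placeholder HSV range for one part color

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    m = cv2.moments(c)
    if m["m00"] == 0:
        continue
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])  # object mass center
    x, y, w, h = cv2.boundingRect(c)                     # horizontal/vertical boundaries
    clip = frame[y:y + h, x:x + w]                       # clipped object sent to the CNN
    print("center:", (cx, cy), "clip shape:", clip.shape)
```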

Keywords: deep-learning, image classification, image identification, industrial engineering

Procedia PDF Downloads 127
19917 Polymorphism of HMW-GS in Collection of Wheat Genotypes

Authors: M. Chňapek, M. Tomka, R. Peroutková, Z. Gálová

Abstract:

Processes of plant breeding, testing and licensing of new varieties, patent protection in seed production, trade relations, and protection of copyright depend on the identification, differentiation, and characterization of plant genotypes. Therefore, we focused our research on the utilization of wheat storage proteins as genetic markers suitable not only for the differentiation of individual genotypes but also for the identification and characterization of their notable properties. In this study, we analyzed a collection of 102 genotypes of bread wheat (Triticum aestivum L.), 41 genotypes of spelt wheat (Triticum spelta L.), and 35 genotypes of durum wheat (Triticum durum Desf.). Our results show that the genotypes of bread wheat and durum wheat were homogeneous and single-line, whereas the spelt wheat genotypes were heterogeneous. We observed variability of HMW-GS composition according to environmental factors and level of breeding, and we predict technological quality on the basis of the Glu-score calculation.

Keywords: genotype identification, HMW-GS, wheat quality, polymorphism

Procedia PDF Downloads 432
19916 Kinematic Hardening Parameters Identification with Respect to Objective Function

Authors: Marina Franulovic, Robert Basan, Bozidar Krizan

Abstract:

Constitutive modelling of material behaviour is becoming increasingly important for the prediction of possible failures in highly loaded engineering components and, consequently, for the optimization of their design. In order to account for the large number of phenomena that occur in the material during operation, such as the kinematic hardening effect in the low cycle fatigue behaviour of steels, complex nonlinear material models are used ever more frequently, despite the complexity of determining their parameters. For the determination of these parameters, a genetic algorithm is a good choice because of its capability to provide a very good approximation of the solution in systems with a large number of unknown variables. To apply a genetic algorithm to parameter identification, an inverse analysis must first be defined; it is used as a tool to fit calculated stress-strain values to experimental ones. In order to choose a proper objective function for the inverse analysis from among existing and newly developed functions, this research investigates the influence of the objective function on material behaviour modelling.
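
As a schematic of such an inverse analysis, the sketch below drives a bare-bones evolutionary loop (elitist selection plus mutation) to minimize a sum-of-squared-errors objective; the power-law `simulate` and the "experimental" data are toy placeholders for the actual cyclic plasticity model and measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
strain = np.linspace(0.001, 0.02, 20)
true_params = np.array([600.0, 0.15])              # toy (K, n)
simulate = lambda p, e: p[0] * e ** p[1]           # placeholder for the constitutive model
stress_exp = simulate(true_params, strain)         # stand-in for experimental stress data

def objective(p):
    """Sum of squared errors between simulated and 'experimental' stress."""
    return np.sum((simulate(p, strain) - stress_exp) ** 2)

pop = rng.uniform([100, 0.01], [1000, 0.5], size=(40, 2))          # initial population
for _ in range(100):
    fitness = np.array([objective(p) for p in pop])
    parents = pop[np.argsort(fitness)[:10]]                        # elitist selection
    children = parents[rng.integers(10, size=30)]                  # clone the survivors
    children += rng.normal(0, [10, 0.01], children.shape)          # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([objective(p) for p in pop])]
print(best)   # approaches the 'true' parameters
```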

Keywords: genetic algorithm, kinematic hardening, material model, objective function

Procedia PDF Downloads 293
19915 Parameters Estimation of Multidimensional Possibility Distributions

Authors: Sergey Sorokin, Irina Sorokina, Alexander Yazenin

Abstract:

We present a solution to the Maxmin u/E parameter estimation problem for possibility distributions in the m-dimensional case. Our method is based on a geometrical approach in which a minimal enclosing ellipsoid is constructed around the sample. We also demonstrate that one can improve the results of well-known algorithms in the fuzzy model identification task by using Maxmin u/E parameter estimation.
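
The geometric core, the minimal-volume enclosing ellipsoid of a sample, can be computed with Khachiyan's algorithm; below is a generic NumPy sketch of that textbook procedure, not the authors' implementation.

```python
import numpy as np

def mvee(points, tol=1e-4):
    """Minimal-volume enclosing ellipsoid {x : (x-c)^T A (x-c) <= 1} via Khachiyan's algorithm."""
    n, d = points.shape
    Q = np.vstack([points.T, np.ones(n)])                     # lift points to d+1 dimensions
    u = np.full(n, 1.0 / n)                                   # weights over the sample
    err = tol + 1.0
    while err > tol:
        X = Q @ np.diag(u) @ Q.T
        M = np.einsum("ij,ji->i", Q.T @ np.linalg.inv(X), Q)  # per-point Mahalanobis-type score
        j = int(np.argmax(M))
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        new_u = (1.0 - step) * u
        new_u[j] += step
        err = np.linalg.norm(new_u - u)
        u = new_u
    c = points.T @ u                                          # ellipsoid center
    A = np.linalg.inv(points.T @ np.diag(u) @ points - np.outer(c, c)) / d
    return A, c

A, c = mvee(np.random.default_rng(1).normal(size=(200, 2)))
print("center:", c.round(3))
```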

Keywords: possibility distribution, parameters estimation, Maxmin u/E estimator, fuzzy model identification

Procedia PDF Downloads 433
19914 Identification of CLV for Online Shoppers Using RFM Matrix: A Case Based on Features of B2C Architecture

Authors: Riktesh Srivastava

Abstract:

Online shopping has undergone an astonishing evolution in the last few years, and it is now apparent that the B2C architecture is becoming a progressively imperative channel even for traditional brick-and-mortar traders. In this competition, knowing customers and predicting their behavior are extremely important. More importantly, whenever a customer logs onto the B2C architecture, the traces of their buying patterns can be stored and used for future predictions. Such a prediction is called Customer Lifetime Value (CLV). Earlier, Net Present Value was used for this purpose; however, it ignores two important aspects of the B2C architecture: market risk and the large amount of customer data. Here, RFM (Recency, Frequency, and Monetary value) is used to estimate the CLV and, as the term exemplifies, market risk is well covered. Big data analysis is also covered by RFM, which gives a real exploration of the data and leads to a better estimation of the future cash flow from customers. In the present paper, six factors (collected from varied sources) are used to determine what attracts customers to the B2C architecture. For these six factors, RFM is computed for three years (2013, 2014, and 2015). CLV and revenue are the two parameters defined using the RFM analysis, which gives a clear picture of the future predictions.
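
A minimal pandas sketch of the RFM computation for a single year; the transaction log is a made-up placeholder, and three scoring bands are used only because the toy sample is tiny (real studies typically use quintiles).

```python
import pandas as pd

# Hypothetical transaction log for one year of the B2C store.
tx = pd.DataFrame({
    "customer": ["a", "a", "b", "c", "c", "c"],
    "date": pd.to_datetime(["2015-01-10", "2015-06-01", "2015-03-15",
                            "2015-02-20", "2015-07-04", "2015-11-30"]),
    "amount": [120.0, 80.0, 40.0, 200.0, 150.0, 90.0],
})
snapshot = pd.Timestamp("2016-01-01")                       # reference date for recency

rfm = tx.groupby("customer").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),  # days since last purchase
    frequency=("date", "count"),                            # number of purchases
    monetary=("amount", "sum"),                             # total spend
)
rfm["R"] = pd.qcut(rfm["recency"], 3, labels=[3, 2, 1])     # recent buyers score high
rfm["F"] = pd.qcut(rfm["frequency"], 3, labels=[1, 2, 3])
rfm["M"] = pd.qcut(rfm["monetary"], 3, labels=[1, 2, 3])
print(rfm)
```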

Keywords: CLV, RFM, revenue, recency, frequency, monetary value

Procedia PDF Downloads 189
19913 Evaluation of DNA Microarray System in the Identification of Microorganisms Isolated from Blood

Authors: Merih Şimşek, Recep Keşli, Özgül Çetinkaya, Cengiz Demir, Adem Aslan

Abstract:

Bacteremia is a clinical entity with high morbidity and mortality rates when immediate diagnosis or treatment cannot be achieved. Microorganisms that can cause sepsis or bacteremia are easily isolated from blood cultures, and fifty-five positive blood cultures were included in this study. The microorganisms were isolated by conventional microbiological methods and then identified phenotypically with the Vitek-2 system; the same microorganisms were also identified genotypically with the Multiplex-PCR DNA Low-Density Microarray system. At the end of the identification process, the DNA microarray system's success was evaluated against the Vitek-2 system. The two systems identified the same microorganisms in 53 samples, while the DNA microarray system identified different microorganisms in the remaining 2 blood cultures; the microorganisms identified by the Vitek-2 system were thus identical to 96.4% of those identified by the DNA microarray system. In addition to the bacteria identified by Vitek-2, the DNA microarray system detected the presence of a second bacterium in 5 blood cultures. Both systems identified 18 of the 55 positive blood cultures as E. coli. The corresponding counts (Vitek-2 vs. DNA microarray) were 6 vs. 8 for Acinetobacter baumannii, 10 vs. 10 for K. pneumoniae, 5 vs. 5 for S. aureus, 7 vs. 11 for Enterococcus spp., 5 vs. 5 for P. aeruginosa, and 2 vs. 2 for C. albicans. According to these results, the DNA microarray system requires technical equipment, experienced staff support, and more expensive kits than Vitek-2; however, used in conjunction with conventional microbiological methods, it lets large microbiology laboratories produce faster, more sensitive, and more successful identifications of cultured microorganisms.

Keywords: microarray, Vitek-2, blood culture, bacteremia

Procedia PDF Downloads 308
19912 Foggy Image Restoration Using Neural Network

Authors: Khader S. Al-Aidmat, Venus W. Samawi

Abstract:

Blurred vision in a misty atmosphere is an essential problem that needs to be resolved. To solve this problem, we developed a technique to restore the original scene from its foggy, degraded version using a back-propagation neural network (BP-NN). The suggested technique is based on a mapping between the foggy scene and its corresponding original scene. Seven different approaches are suggested, based on the type of features used in the image restoration. Features are extracted from the spatial and spatial-frequency domains (using the DCT). Each of these approaches comes with its own BP-NN architecture, depending on the type and number of features used. The weight matrix resulting from training each BP-NN represents a fog filter. The performance of these filters is evaluated empirically (using PSNR) and perceptually. By comparing the performance of these filters, the effective features that suit the BP-NN technique for restoring foggy images are recognized. The system proved its effectiveness and success in restoring moderately foggy images.
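
For reference, the empirical criterion used here, PSNR, is a one-line formula; a minimal sketch, assuming 8-bit images stored as NumPy arrays.

```python
import numpy as np

def psnr(original, restored):
    """Peak signal-to-noise ratio between two 8-bit images (higher = closer restoration)."""
    mse = np.mean((original.astype(float) - restored.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

a = np.random.default_rng(0).integers(0, 256, (64, 64))
print(psnr(a, np.clip(a + 5, 0, 255)))   # small distortion -> high PSNR
```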

Keywords: artificial neural network, discrete cosine transform, feed forward neural network, foggy image restoration

Procedia PDF Downloads 354
19911 Improvement of Egyptian Vacuum Distillates by Solvent Dewaxing

Authors: Ehssan M. R. Nassef

Abstract:

Dewaxing of vacuum distillates by using a solvent was investigated in the present study. The work studies a solvent dewaxing system developed to give better dewaxing performance with respect to the important factors in the choice of solvents: good solubility of the oil in the solvent and low solubility of the wax in the solvent. A solvent dewaxing process using methyl ethyl ketone (MEK) and toluene was applied to two types of Egyptian vacuum distillates. The effect of varying the MEK-to-toluene composition on the oil yield, the wax percentage, the pour point, the refractive index at 20 and 70°C, the viscosity at 40 and 100°C, the viscosity index, and the specific gravity of the produced oil was evaluated for the two distillates (I and II). The operating conditions achieved the best pour point, -15°C, for distillate I at a 1:1 solvent composition. At the same MEK-to-toluene ratio, the specific gravity of the produced oil changed from 0.871 to 0.8802, with a refractive index of 1.84, and an oil yield of 65% was obtained. The results for distillate II, which has a higher specific gravity, are comparatively higher than those for distillate I. The effect of temperature was also investigated, and the best temperature was -20°C.

Keywords: dewaxing, solvent dewaxing, pour point, lubricating oil production, wax

Procedia PDF Downloads 505
19910 Customers’ Priority to Implement SSTs Using AHP Analysis

Authors: Mohammad Jafariahangari, Marjan Habibi, Miresmaeil Mirnabibaboli, Mirza Hassan Hosseini

Abstract:

Self-service technologies (SSTs) make an important contribution to people's daily lives nowadays; however, the introduction of an SST does not by itself lead to its usage. This paper is therefore an attempt to discover the most preferred SST from the customers' point of view. To fulfill this aim, the Analytic Hierarchy Process (AHP) was applied, based on Saaty's questionnaire, which was administered to customers of e-banking services located in Golestan province, in the north of Iran. The study used qualitative factors associated with consumers' intention to use SSTs to rank three SSTs: ATM, mobile banking, and internet banking. The results showed that mobile banking received the highest weight from the consumers' point of view. This research can be useful both for managers and service providers and for customers who intend to use e-banking.
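
A minimal sketch of the AHP calculation implied here: the principal eigenvector of a pairwise comparison matrix yields the priority weights, and the consistency ratio checks the judgments. The comparison values below are hypothetical, not the survey results.

```python
import numpy as np

# Hypothetical Saaty pairwise comparisons: ATM vs. mobile banking vs. internet banking.
A = np.array([[1.0, 1/3, 1/2],
              [3.0, 1.0, 2.0],
              [2.0, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority vector (weights sum to 1)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 => acceptable judgments
```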

Keywords: analytical hierarchy process, decision-making, e-banking, self-service technologies, Iran

Procedia PDF Downloads 278
19909 Modern Trends in Foreign Direct Investments in Georgia

Authors: Rusudan Kinkladze, Guguli Kurashvili, Ketevan Chitaladze

Abstract:

Foreign direct investment is a driving force in the development of interdependent national economies, and the study and analysis of investments is an urgent problem, particularly for transitional economies such as Georgia. Consequently, the goal of the research is the study and analysis of foreign direct investments in Georgia and the identification and forecasting of modern trends; it covers the period 2006-2015. The study uses the methods of statistical observation, grouping, and analysis, as well as the methods of analytical indicators of time series; trends are identified, predicted values are calculated, and various literary and Internet sources relevant to the research are used. The findings showed that the modern investment policy in Georgia is favorable for domestic as well as foreign investors, although Georgia is still a net importer of investments. In 2015, the top 10 investing countries were led by Azerbaijan, the United Kingdom, and the Netherlands, and the largest share of FDI was allocated to the transport and communication sector; the financial sector was second, followed by the health and social work sector. The same trend is expected to continue in the future.
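
As an illustration of the trend-extrapolation step, the sketch below fits a least-squares linear trend to an annual FDI series and projects it one year ahead; the figures are invented placeholders, not Georgian statistics.

```python
import numpy as np

years = np.arange(2006, 2016)
fdi = np.array([1.2, 1.7, 1.5, 0.7, 0.8, 1.1, 0.9, 0.9, 1.8, 1.6])  # hypothetical, USD bn

slope, intercept = np.polyfit(years, fdi, 1)   # least-squares linear trend
forecast_2016 = slope * 2016 + intercept
print(round(forecast_2016, 2))                 # one-year-ahead trend value
```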

Keywords: foreign direct investments, methods, statistics, analysis

Procedia PDF Downloads 289
19908 Analysing Maximum Power Point Tracking in a Stand Alone Photovoltaic System

Authors: Osamede Asowata

Abstract:

Optimizing the gain in output power of stand-alone photovoltaic (PV) systems is one of the major focuses of PV research in recent times, given PV's low carbon emissions and efficiency. Power failures or outages from commercial providers in general do not promote development in the public and private sectors; they basically limit the development of industries. A well-structured PV system is therefore important for efficient and cost-effective monitoring. The purpose of this paper is to validate the maximum power point of an off-grid PV system, taking into consideration the most effective tilt and orientation angles for PVs in the southern hemisphere. The paper is based on analyzing the system using a solar charger with maximum power point tracking (MPPT) from a pulse width modulation (PWM) perspective; the power conditioning device chosen is a solar charger with MPPT. The practical setup consists of a PV panel set to an orientation angle of 0°N, with corresponding tilt angles of 36°, 26°, and 16°. Preliminary results include a regression analysis (normal probability plot) showing the maximum power point of the system as well as the best tilt angle for maximum power point tracking.
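
The behavior of an MPPT charger can be pictured with the classic perturb-and-observe loop sketched below; this is a generic algorithm illustration with a made-up panel model, not the firmware of the charger used in the study.

```python
def perturb_and_observe(read_voltage, read_current, duty=0.5, step=0.01, iters=100):
    """Generic P&O MPPT: nudge the converter duty cycle, keep the direction that raises power."""
    last_power, direction = 0.0, 1
    for _ in range(iters):
        power = read_voltage(duty) * read_current(duty)
        if power < last_power:
            direction = -direction            # power dropped: reverse the perturbation
        duty = min(max(duty + direction * step, 0.0), 1.0)
        last_power = power
    return duty

# Toy panel model with a power maximum near duty ~ 0.58 (placeholder numbers).
v = lambda d: 36.0 * (1 - d) + 6.0
i = lambda d: 8.0 * d
print(perturb_and_observe(v, i))              # settles (oscillating) near the maximum power point
```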

Keywords: poly-crystalline PV panels, solar chargers, tilt and orientation angles, maximum power point tracking (MPPT), pulse width modulation (PWM)

Procedia PDF Downloads 128
19906 Comparison of Different Methods of Microorganism Identification from a Copper Mining Area in Pará, Brazil

Authors: Louise H. Gracioso, Marcela P.G. Baltazar, Ingrid R. Avanzi, Bruno Karolski, Luciana J. Gimenes, Claudio O. Nascimento, Elen A. Perpetuo

Abstract:

Introduction: Higher copper concentrations exert a selection pressure on organisms such as plants, fungi, and bacteria, so that only organisms resistant to the contaminated site survive. This selective pressure keeps only the organisms most resistant to a specific condition and subsequently increases their bioremediation potential. Despite the importance of bacteria for biosphere maintenance, it is estimated that only a small fraction of living microbial species has been described and characterized. Owing to developments in molecular biology, tools based on the analysis of 16S ribosomal RNA or other specific genes are creating a new scenario for studies on the characterization and identification of microorganisms in the environment. New identification methods have also emerged, such as Biotyper (MALDI/TOF); this mass spectrometry method relies on the recognition of spectroscopic patterns of conserved and characteristic proteins of different microbial species. In view of this, this study aimed to isolate copper-resistant bacteria present in a copper processing area (Sossego Mine, Canaã, PA) and to identify them by two different methods, a recent one (mass spectrometry) and a conventional one (16S gene sequencing), with a view to the future bioremediation of this mine. Material and Methods: Samples were collected at fifteen different sites over five periods. Microorganisms were isolated from mining wastes by the culture enrichment technique; this procedure was repeated 4 times. The isolates were inoculated into MJS medium containing different concentrations of copper chloride (1 mM, 2.5 mM, 5 mM, 7.5 mM, and 10 mM) and incubated on plates for 72 h at 28°C. The isolates were then subjected to identification by mass spectrometry (Biotyper, MALDI/TOF) and by 16S gene sequencing. Results: A total of 105 strains were isolated in this area. Bacterial identification by the mass spectrometry method (MALDI/TOF) achieved 74% agreement with the conventional identification method (16S); 31% were unsuccessful in MALDI-TOF, and 2% did not yield an identifiable 16S sequence. These results show that Biotyper can be a very useful tool for the identification of bacteria isolated from environmental samples, since it offers better value for money (cheap and simple sample preparation, and reusable MALDI plates). Furthermore, this technique saves time and has a high throughput: the mass spectra are compared to the database in less than 2 minutes per sample.

Keywords: copper mining area, bioremediation, microorganisms, identification, MALDI/TOF, RNA 16S

Procedia PDF Downloads 347
19906 Bridge Members Segmentation Algorithm of Terrestrial Laser Scanner Point Clouds Using Fuzzy Clustering Method

Authors: Donghwan Lee, Gichun Cha, Jooyoung Park, Junkyeong Kim, Seunghee Park

Abstract:

3D shape models of existing structures are required for many purposes, such as safety and operations management. Traditional 3D modeling methods are based on manual or semi-automatic reconstruction from close-range images, which entails great expense and is time-consuming. Terrestrial laser scanning (TLS) is a common survey technique for measuring a 3D shape model quickly and accurately, and it is used on construction sites and in cultural heritage management. However, there are many limits to processing a TLS point cloud, because the raw point cloud is a massive volume of data, and the capability of carrying out useful analyses on unstructured 3D points is correspondingly limited. Thus, segmentation becomes an essential step whenever the grouping of points with common attributes is required. In this paper, a member-segmentation algorithm is presented to separate a raw point cloud that includes only 3D coordinates. The paper presents a clustering approach based on a fuzzy method: the Fuzzy C-Means (FCM) algorithm is reviewed and used in combination with a similarity-driven cluster-merging method. It is applied to a point cloud acquired with a Leica ScanStation C10/C5 at a test bed: a pedestrian bridge connecting the 1st and 2nd engineering buildings of Sungkyunkwan University in Korea, about 32 m long and 2 m wide. The 3D point cloud of the test bed was constructed from the TLS measurements and divided by the segmentation algorithm into individual members. Experimental analyses of the results from the proposed unsupervised segmentation process are promising: thanks to the segmentation, the point cloud can be processed to manage the configuration of each member.
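
A compact NumPy version of the standard FCM iteration at the core of the approach (without the similarity-driven cluster merging); the cluster count and the fuzzifier m = 2 are common defaults assumed here, and the random cloud stands in for the TLS scan.

```python
import numpy as np

def fuzzy_c_means(points, c=4, m=2.0, iters=50, seed=0):
    """Standard FCM: alternate membership and center updates on an (N, 3) point cloud."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)                  # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ points) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return u.argmax(axis=1), centers                   # hard labels for member separation

cloud = np.random.default_rng(1).normal(size=(1000, 3))   # stand-in for the TLS scan
labels, centers = fuzzy_c_means(cloud)
print(np.bincount(labels))                             # points assigned to each cluster
```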

Keywords: fuzzy c-means (FCM), point cloud, segmentation, terrestrial laser scanner (TLS)

Procedia PDF Downloads 202
19905 Determination of Various Properties of Biodiesel Produced from Different Feedstocks

Authors: Faisal Anwar, Dawar Zaidi, Shubham Dixit, Nafees Ahmedii

Abstract:

This paper analyzes various properties of biodiesel, such as the pour point, cloud point, viscosity, and calorific value, produced from different feedstocks. The aim of the work is to analyze the change in these properties after converting the feedstocks to biodiesel and then to compare them with the ASTM D6751-02 standard to check whether they are suitable for diesel engines. The conversion of the feedstocks is carried out by a process called transesterification, which is performed to reduce the viscosity, pour point, etc. It has been observed that there is a remarkable change in the properties of the oil after conversion.

Keywords: biodiesel, ethyl ester, free fatty acid, production

Procedia PDF Downloads 329
19904 Classification of Sturm-Liouville Problems at Infinity

Authors: Kishor J. Shinde

Abstract:

We determine the values of k and p for which the Sturm-Liouville differential operator τu = -u'' + kx^p u is in the limit point case or the limit circle case at infinity. In particular, it is shown that τ is in the limit point case (i) for p = 2 and all k, (ii) for all p and k = 0, (iii) for all p and k > 0, (iv) for 0 ≤ p ≤ 2 and k < 0, and (v) for p < 0 and k < 0. τ is in the limit circle case for p > 2 and k < 0.

Keywords: limit point case, limit circle case, Sturm-Liouville, infinity

Procedia PDF Downloads 332
19903 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning

Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park

Abstract:

The goal of this research is to estimate structural shape change using terrestrial laser scanning. The study develops a data reduction and shape-change estimation algorithm for large-capacity scan data: the point cloud of the scan data is converted to voxels and sampled. A shape estimation technique is studied to detect changes in structural patterns, such as skyscrapers, bridges, and tunnels, based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The point cloud data serve as relative representative values of the shape information and are used as a model for detecting point cloud changes in a data structure. The aim of the shape estimation model is to develop a technology that can detect not only normal (gradual) changes but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
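
A minimal sketch of the voxelization-and-sampling step described above: snap points to a voxel grid and keep one representative (the centroid) per occupied voxel. The voxel size is an assumption.

```python
import numpy as np

def voxel_downsample(points, voxel=0.05):
    """Keep the centroid of every occupied voxel (points: (N, 3); voxel edge in metres)."""
    keys = np.floor(points / voxel).astype(np.int64)            # integer voxel indices
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    counts = np.bincount(inverse)
    sums = np.zeros((len(counts), 3))
    np.add.at(sums, inverse, points)                            # accumulate per-voxel sums
    return sums / counts[:, None]

cloud = np.random.default_rng(0).uniform(0, 1, (100000, 3))     # stand-in for a scan
print(voxel_downsample(cloud).shape)                            # far fewer points remain
```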

Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement

Procedia PDF Downloads 196
19902 Damage Identification Using Experimental Modal Analysis

Authors: Niladri Sekhar Barma, Satish Dhandole

Abstract:

Damage identification, in the context of safety, has nowadays become a fundamental research area in the fields of mechanical, civil, and aerospace engineering structures. The present research aims to identify damage in a mechanical beam structure, quantify the severity or extent of the damage in terms of loss of stiffness, and obtain an updated analytical finite element (FE) model. An FE model is used for analysis, and the damage location for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data were acquired with an accelerometer. A Fast Fourier Transform (FFT) algorithm is applied to the measured signal, and post-processing is subsequently done in MEscopeVES software. The two sets of data, from the numerical FE model and the experiments, are compared to locate the damage accurately. The extent of the damage is identified via the modal frequencies using a mixed numerical-experimental technique. Mode shapes are compared using the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to real-life structures such as plate and GARTEUR structures.
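
The MAC used for the mode-shape comparison is a short formula; a NumPy sketch follows, with two made-up mode-shape vectors standing in for the FE and measured shapes.

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal Assurance Criterion: 1 = perfectly correlated mode shapes, 0 = orthogonal."""
    return np.abs(phi_a @ phi_e) ** 2 / ((phi_a @ phi_a) * (phi_e @ phi_e))

analytical = np.array([0.0, 0.31, 0.59, 0.81, 0.95, 1.0])  # hypothetical FE mode shape
measured = analytical + np.random.default_rng(0).normal(0, 0.02, 6)
print(round(mac(analytical, measured), 4))                 # close to 1 for a good match
```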

Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification

Procedia PDF Downloads 78
19901 Evaluation of Sensor Pattern Noise Estimators for Source Camera Identification

Authors: Benjamin Anderson-Sackaney, Amr Abdel-Dayem

Abstract:

This paper presents a comprehensive survey of recent source camera identification (SCI) systems. The performance of various sensor pattern noise (SPN) estimators was then experimentally assessed under common photo response non-uniformity (PRNU) frameworks. The experiments used 1350 natural and 900 flat-field images, captured by 18 individual cameras; 12 different experiments, grouped into three sets, were conducted, and the results were analyzed using receiver operating characteristic (ROC) curves. The experimental results demonstrate that combining the basic SPN estimator with a wavelet-based filtering scheme provides promising results, whereas the phase SPN estimator fits better with both patch-based (BM3D) and anisotropic diffusion (AD) filtering schemes.
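
The basic SPN estimator referred to above averages denoising residuals over flat-field images; a minimal sketch follows, using a Gaussian filter as a simple stand-in for the wavelet denoiser evaluated in the paper, and a synthetic PRNU pattern in place of real camera data.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def basic_spn(flat_fields, sigma=1.0):
    """Average noise residuals W = I - F(I) over flat-field images to estimate the SPN."""
    residuals = [img - gaussian_filter(img, sigma) for img in flat_fields]
    return np.mean(residuals, axis=0)

rng = np.random.default_rng(0)
prnu = rng.normal(0, 0.01, (128, 128))                       # hypothetical sensor pattern
flats = [100 * (1 + prnu) + rng.normal(0, 2, (128, 128)) for _ in range(50)]
estimate = basic_spn(flats)
print(np.corrcoef(estimate.ravel(), prnu.ravel())[0, 1])     # positive: recovers the PRNU
```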

Keywords: sensor pattern noise, source camera identification, photo response non-uniformity, anisotropic diffusion, peak to correlation energy ratio

Procedia PDF Downloads 410
19900 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE

Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao

Abstract:

For impact monitoring of distributed structures, the traditional positioning methods are based on time differences and include the four-point arc positioning method and the triangulation positioning method; in actual operation, both methods have errors. In this paper, the multi-agent blackboard coordination principle is used to combine the two methods. The fusion steps are: (1) the four-point arc locating agent calculates the initial point and records it in the blackboard module; (2) the triangulation agent obtains its initial parameters by accessing this initial point; (3) the triangulation agent repeatedly accesses the blackboard module to update its parameters, and it also logs the points it calculates onto the blackboard; (4) when a subsequently calculated point and the initial calculated point are within the allowable error, the whole coordination fusion process finishes. The agent framework used for this multi-agent collaboration method is JADE. The JADE platform consists of several agent containers, with agents running in each container, and its mature management and debugging tools make it very convenient to deal with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion can reduce the error of the two methods.
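
Framework aside, the coordination logic of steps (1)-(4) can be sketched in a few lines of plain Python (JADE itself is a Java platform; this is a language-neutral illustration with placeholder locator functions, not the JADE API).

```python
blackboard = {}                                       # shared store both agents read and write

def arc_agent():
    blackboard["point"] = (10.0, 5.0)                 # (1) four-point arc initial estimate (placeholder)

def refine(p):
    """Placeholder for the triangulation update; pulls the estimate toward a fixed point."""
    target = (9.6, 5.3)
    return tuple(g + 0.5 * (t - g) for g, t in zip(p, target))

def triangulation_agent(tolerance=0.05):
    guess = blackboard["point"]                       # (2) initialise from the blackboard
    while True:
        refined = refine(guess)                       # (3) recompute and post the new point
        blackboard["point"] = refined
        if max(abs(a - b) for a, b in zip(refined, guess)) < tolerance:
            return refined                            # (4) converged within allowable error
        guess = refined

arc_agent()
print(triangulation_agent())                          # fused impact location estimate
```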

Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE

Procedia PDF Downloads 143
19899 Heart Failure Identification and Progression by Classifying Cardiac Patients

Authors: Muhammad Saqlain, Nazar Abbas Saqib, Muazzam A. Khan

Abstract:

Heart failure (HF) has become a major health problem in our society. The prevalence of HF increases with patient age, and HF is a major cause of the high mortality rate in adults. Successful identification and progression tracking of HF can help reduce the individual and social burden of this syndrome. In this study, we use a real data set of cardiac patients to propose a classification model for the identification and progression of HF. The data set is divided into three age groups, namely young, adult, and old, and each age group is further divided into four classes according to the patient's current physical condition. Contemporary data mining classification algorithms were applied to each individual class of every age group to identify HF. The decision tree (DT) gives the highest accuracy, 90%, and outperforms all other algorithms. Our model accurately diagnoses different stages of HF for each age group and can be very useful for the early prediction of HF.
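
A minimal scikit-learn sketch of the winning classifier type; the feature names and toy records are hypothetical stand-ins for the real cardiac data set.

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Hypothetical records: [age, systolic BP, ejection fraction %, BNP level]
X = [[72, 150, 30, 900], [65, 130, 55, 90], [80, 160, 25, 1200], [50, 120, 60, 40],
     [77, 145, 35, 700], [58, 125, 58, 60], [69, 155, 28, 1000], [45, 118, 62, 35]]
y = [1, 0, 1, 0, 1, 0, 1, 0]                   # 1 = heart failure, 0 = healthy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```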

Keywords: decision tree, heart failure, data mining, classification model

Procedia PDF Downloads 377
19898 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. Expressing the Earth's surface as a mathematical model would require an infinite number of point measurements; since this is impossible, points at regular intervals are measured to characterize the surface, and a DTM of the Earth is generated. Hitherto, classical measurement techniques and photogrammetry have been in widespread use for the construction of DTMs; at present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications, creating 3D point clouds from numerous point measurements. More recently, developments in image mapping methods and the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition have increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors, such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation method. In this study, the random data reduction method is assessed for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets using a random algorithm, representing 75, 50, 25, and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original data set is compared with DTMs interpolated from the reduced data sets by the kriging interpolation method. The results show that the random data reduction method can reduce image-based point cloud data sets to the 50% density level while still maintaining DTM quality.
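
The random reduction itself is a one-line sampling operation; a sketch over a synthetic point cloud, using the density levels from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
cloud = rng.uniform(0, 500, (200000, 3))               # stand-in for the image-based point cloud

subsets = {}
for pct in (75, 50, 25, 5):                            # density levels used in the study
    k = int(len(cloud) * pct / 100)
    subsets[pct] = cloud[rng.choice(len(cloud), size=k, replace=False)]
    print(pct, "% ->", len(subsets[pct]), "points")
```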

Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging

Procedia PDF Downloads 118
19897 Promoting Organizational Learning Facing the Complexity of Public Healthcare: How to Design a Voluntary, Learning-Oriented Benchmarking

Authors: Rachel M. Lørum, Henrik Eriksson, Frida Smith

Abstract:

Purpose: In recent years, the use of benchmarks for the improvement of healthcare has become increasingly common, and there has been increasing interest in why improvement initiatives so often fail to eliminate the problems they aspire to solve. Benchmarking comes with its fair share of challenges and problems, such as capturing the dynamics and complexities of care environments. In this study, we demonstrate how learning-oriented, voluntary benchmarks can be designed in the complex environment of public healthcare. Findings: Our four most important findings were the following. First, important organizational learning (OL) regarding the complexity of the service, and its implications for how to design a benchmark for learning and improvement, occurred during the process. Second, participation by a wide range of professionals and stakeholders was crucial for capturing the complexity of people and organizations and for increasing the quality of the template. Third, the continuous dialogue between all organizations involved was an important tool for ongoing organizational learning throughout the process. Fourth, the facilitator's role had an important impact through supporting progress, coordination, and dialogue. Design: We chose participatory design as the research design. Data were derived from written materials such as e-mails, protocols, observational notes, and reflection notes collected over a period of 1.5 years. Originality: Our main contributions are the identification of important strategies, initiatives, and actors to involve when designing voluntary benchmarks for learning and improvement.

Keywords: organizational learning, quality improvement, learning-oriented benchmark, healthcare, patient safety

Procedia PDF Downloads 75
19896 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually based on geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system; it reflects the system response arising from the combination of the hydrogeological structure with groundwater injection and extraction. This study applies analytical tools to the observation database to develop a methodology for the identification of confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis uses the Fourier transform to process the time series of groundwater-level observations and analyzes the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, i.e., the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviors. The decision tree uses the information obtained from the abovementioned analytical tools and optimizes the best estimation of the hydrogeological structure. The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This extraordinary accuracy indicates that the developed methodology is a great tool for identifying hydrogeological structures.
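
A sketch of the frequency-analysis step: given hourly groundwater levels, read off the spectral amplitude at one cycle per day, whose size flags pumping-driven daily fluctuations. The series below is synthetic, not observation-network data.

```python
import numpy as np

hours = np.arange(24 * 365)                            # one year of hourly observations
# Synthetic level: a seasonal swing plus a daily pumping signal plus noise.
level = (2.0 * np.sin(2 * np.pi * hours / (24 * 365))
         + 0.3 * np.sin(2 * np.pi * hours / 24)
         + np.random.default_rng(0).normal(0, 0.05, hours.size))

spectrum = 2 * np.abs(np.fft.rfft(level - level.mean())) / len(level)
freqs = np.fft.rfftfreq(len(level), d=1.0)             # cycles per hour
daily_idx = np.argmin(np.abs(freqs - 1 / 24))          # bin at one cycle per day
print("daily amplitude ~", round(spectrum[daily_idx], 3))  # ~0.3 flags artificial extraction
```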

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 124
19895 Spatial Point Process Analysis of Dengue Fever in Tainan, Taiwan

Authors: Ya-Mei Chang

Abstract:

This research applies spatio-temporal point process methods to dengue fever data from Tainan. The spatio-temporal intensity function of the data set is assumed to be separable. Kernel estimation, a widely used approach to estimating intensity functions, is employed; the intensity function is very helpful for studying the relation between the spatio-temporal point process and covariates. Since the covariate effects might be nonlinear, a nonparametric smoothing estimator is used to detect this nonlinearity, and a fitted parametric model then describes the influence of the covariates on dengue fever. The correlation between the data points is detected by the K-function. The results of this research could provide useful information to help the government or other stakeholders make decisions.
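
A minimal sketch of the spatial half of the separable intensity estimate: a Gaussian kernel smoother over case locations. The coordinates and bandwidth are hypothetical, not the Tainan data.

```python
import numpy as np

def kernel_intensity(cases, grid_x, grid_y, h=0.5):
    """Gaussian kernel estimate of the spatial intensity lambda(x, y) from case locations."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    lam = np.zeros_like(gx)
    for x0, y0 in cases:
        lam += np.exp(-((gx - x0) ** 2 + (gy - y0) ** 2) / (2 * h ** 2))
    return lam / (2 * np.pi * h ** 2)              # each kernel integrates to one case

cases = np.random.default_rng(3).uniform(0, 10, (200, 2))   # hypothetical case coordinates
grid = np.linspace(0, 10, 50)
lam = kernel_intensity(cases, grid, grid)
cell = (grid[1] - grid[0]) ** 2
print(lam.max(), lam.sum() * cell)                 # integral ~ number of cases (edge loss aside)
```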

Keywords: dengue fever, spatial point process, kernel estimation, covariate effect

Procedia PDF Downloads 321