Search results for: fingerprint impression
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 70


40 Design Systems and the Need for a Usability Method: Assessing the Fitness of Components and Interaction Patterns in Design Systems Using Atmosphere Methodology

Authors: P. Johansson, S. Mardh

Abstract:

The present study proposes a usability test method, Atmosphere, to assess the fitness of components and interaction patterns of design systems. The method covers the user’s perception of the components of the system, the efficiency of the logic of the interaction patterns, perceived ease of use, as well as the user’s understanding of the intended outcome of interactions. These aspects are assessed by combining measures of first impression, visual affordance and expectancy. The method was applied to a design system developed for the design of an electronic health record system. The study was conducted involving 15 healthcare personnel. It could be concluded that the Atmosphere method provides tangible data that enable human-computer interaction practitioners to analyze and categorize components and patterns based on perceived usability, the success rate of identifying interactive components, and the success rate of understanding the intended outcome of components and interaction patterns.

Keywords: atomic design, atmosphere methodology, design system, expectancy testing, first impression testing, usability testing, visual affordance testing

PDF Downloads: 781
39 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition

Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu

Abstract:

In this paper three different approaches for person verification and identification, i.e. by means of fingerprint, face and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach. The assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Component Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. As to voice/speaker recognition, mel cepstral and delta-delta mel cepstral analysis were used as the main methods, in order to construct a supervised speaker-dependent voice recognition system. The final decision (e.g. “accept/reject” for a verification task) is taken by using a majority voting technique applied to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces and voice recordings, indicate the feasibility of our study and an overall recognition precision (about 92%), permitting the utilization of our system for a future complex biometric card.
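
For illustration, a minimal sketch (Python) of the decision-level fusion step mentioned above: a simple majority vote over three hypothetical per-modality verdicts. This is not the authors' implementation, and the verdict values are invented.

    from collections import Counter

    def majority_vote(decisions):
        """Fuse per-modality accept/reject verdicts by simple majority."""
        counts = Counter(decisions.values())
        # With three modalities a tie cannot occur, so the most common verdict wins.
        return counts.most_common(1)[0][0]

    # Hypothetical verdicts from the fingerprint, face and voice matchers.
    verdicts = {"fingerprint": "accept", "face": "accept", "voice": "reject"}
    print(majority_vote(verdicts))  # -> accept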

Keywords: Biometry, image processing, pattern recognition, speech analysis.

PDF Downloads: 1918
38 Cheiloscopy and Dactylography in Relation to ABO Blood Groups: Egyptian vs. Malay Populations

Authors: Manal Hassan Abdel Aziz, Fatma Mohamed Magdy Badr El Dine, Nourhan Mohamed Mohamed Saeed

Abstract:

Establishing an association between lip print patterns and those of fingerprints, as well as blood groups, is of fundamental importance in the forensic identification domain. The first aim of the current study was to determine the prevalent types of ABO blood groups, lip prints and fingerprint patterns in both studied populations. The second was to analyze any relation found between the different print patterns and the blood groups, which would be valuable for identification purposes. The present study was conducted on 60 healthy volunteers (30 males and 30 females) from each of the studied populations. Lip prints and fingerprints were obtained and classified according to Tsuchihashi's classification and Michael Kuchen's classification, respectively. The results show that the ulnar loop was the most frequent pattern in both populations. Blood group A was the most frequent among Egyptians, while blood groups O and B were predominant among Malaysians. Significant relations were observed between lip print patterns and fingerprints (in the second quadrant for Egyptian males and the first quadrant for Malaysian males). For Malaysian females, a statistically significant association was proved in the fourth quadrant. Regarding the blood groups, 89.5% of ulnar loops were significantly related to blood group A among Egyptian males. The results proved an association between the fingerprint patterns and the lip prints, as well as between the ABO blood group and the fingerprint pattern. However, further research with larger sample sizes is needed to confirm the current results.
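
The abstract does not specify the statistical test used, but a chi-square test of independence is one common way to examine such print-pattern/blood-group associations. The sketch below uses invented counts purely for illustration.

    from scipy.stats import chi2_contingency

    # Hypothetical contingency table: rows = fingerprint pattern, columns = blood group (A, B, AB, O).
    table = [
        [17, 5, 3, 9],   # ulnar loop
        [4, 6, 2, 5],    # whorl
        [2, 3, 1, 3],    # arch
    ]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
    # A small p-value would suggest an association between pattern and blood group.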

Keywords: ABO, cheiloscopy, dactylography, Egyptians, Malaysians.

PDF Downloads: 842
37 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, the security of identification systems can be easily attacked by various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss function and optimizer. The types of CNN used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. We validate our approach and compare generalization power using a subset of the LivDet 2017 database. It is important to note that the same subset of LivDet is used for training and testing across all models, so that performance on unseen data can be compared fairly across the different models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in a performance gain of more than 3% over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the parameter counts and mean average error rates of the high-accuracy models, in order to find the model that consumes the least memory and computation time for training and testing. Although AlexNet is less complex than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
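
As a rough sketch of the loss/optimizer sweep described above (Keras, with a small placeholder network instead of the paper's AlexNet/VGGNet/ResNet, and without the LivDet data; Center Loss and Cosine Proximity are not built-in Keras losses, so only two built-in losses are swept here):

    from tensorflow.keras import layers, models

    def small_cnn(input_shape=(96, 96, 1)):
        # Placeholder network standing in for AlexNet/VGGNet/ResNet.
        return models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(16, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(32, 3, activation="relu"),
            layers.GlobalAveragePooling2D(),
            layers.Dense(1, activation="sigmoid"),  # live vs. spoof
        ])

    losses = ["binary_crossentropy", "hinge"]  # note: hinge expects labels in {-1, +1}
    optimizers = ["adam", "sgd", "rmsprop", "adadelta", "adagrad", "nadam"]

    for loss in losses:
        for opt in optimizers:
            model = small_cnn()
            model.compile(optimizer=opt, loss=loss, metrics=["accuracy"])
            # model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=10)
            print(f"configured model with loss={loss}, optimizer={opt}")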

Keywords: Anti-spoofing, CNN, fingerprint recognition, loss function, optimizer.

PDF Downloads: 381
36 Development of a Secured Telemedical System Using Biometric Feature

Authors: O. Iyare, A. H. Afolayan, O. T. Oluwadare, B. K. Alese

Abstract:

Access to advanced medical services has been one of the medical challenges faced by our present society, especially in distant geographical locations which may be inaccessible. Hence the need for telemedicine, through which live video of a doctor can be streamed to a patient located anywhere in the world at any time. Patients’ medical records contain very sensitive information which should not be made accessible to unauthorized people, in order to protect privacy, integrity and confidentiality. This research work focuses on a more robust security measure, biometrics (fingerprint), as a form of access control to patients’ data by the medical specialist/practitioner.

Keywords: Biometrics, telemedicine, privacy, patient information.

PDF Downloads: 1617
35 Adaptive WiFi Fingerprinting for Location Approximation

Authors: Mohd Fikri Azli bin Abdullah, Khairul Anwar bin Kamarul Hatta, Esther Jeganathan

Abstract:

WiFi has become an essential and widely used technology, largely because of its convenience for mobile devices and for Internet users worldwide. Many location-based services available today use Wireless Fidelity (WiFi) signal fingerprinting; a common example gaining popularity is Foursquare. In this work, the WiFi signal is used to estimate the user or client’s location. Similar to GPS, the fingerprinting method needs a floor plan to increase the accuracy of location estimation. Still, the inconsistency of the WiFi signal makes the estimate differ at different time intervals, so an adaptive method is needed to obtain the most accurate signal at all times. WiFi signals are heavily distorted by external factors such as physical objects, radio frequency interference, electrical interference, and environmental factors, to name a few. Due to these factors, this work reduces the signal noise and estimates location using the Nearest Neighbour method based on past activities of the signal, increasing the accuracy to more than 80%. The repository further increases the accuracy by using Artificial Neural Network (ANN) pattern matching; it acts as the server and supports the client-side application’s decision. Numerous previous works have adopted methods of collecting signal strengths in a repository over the years, but these were mostly static. In this work, we highlight proposed solutions for how the adaptive method matches the received signal to the data in the repository, so that location estimation can be done more accurately. Adaptive update allows the latest location fingerprint to be stored in the repository; any redundant location fingerprints are removed and only the updated version of the fingerprint is kept. How the location of the user can be predicted is discussed further in the proposed solution section. After studying previous works, we found that the Artificial Neural Network is the most feasible method for updating the repository and making it adaptive; its function is to perform pattern matching of the WiFi signal against the existing data available in the repository.
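
A minimal sketch of the nearest-neighbour step described above, assuming a hypothetical repository of RSSI fingerprints keyed by location (the access points, labels and values are invented):

    import math

    # Hypothetical repository: location label -> RSSI vector (dBm) from three access points.
    repository = {
        "room_A": [-45, -70, -82],
        "room_B": [-60, -52, -75],
        "corridor": [-55, -66, -60],
    }

    def nearest_location(observed, repo):
        """Return the stored location whose fingerprint is closest in Euclidean distance."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(repo, key=lambda loc: dist(observed, repo[loc]))

    print(nearest_location([-47, -68, -80], repository))  # -> room_A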

Keywords: Adaptive Repository, Artificial Neural Network, Location Estimation, Nearest Neighbour Euclidean Distance, WiFi RSSI Fingerprinting.

PDF Downloads: 3435
34 Gait Recognition System: Bundle Rectangle Approach

Authors: Edward Guillen, Daniel Padilla, Adriana Hernandez, Kenneth Barner

Abstract:

Biometric methods include recognition techniques such as fingerprint, iris, hand geometry, voice, face, ear and gait recognition. The gait recognition approach has some advantages: for example, it does not require the prior cooperation of the observed subject, and it can record many biometric features for deeper analysis; however, most research proposals incur a high computational cost. This paper shows a gait recognition system with feature subtraction on a bundle rectangle drawn over the observed person. Statistical results on a database of 500 videos are shown.

Keywords: Authentication, Biometrics, Gait Recognition, Human Identification, Security.

PDF Downloads: 1848
33 Multimodal Biometric Authentication Using Choquet Integral and Genetic Algorithm

Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara

Abstract:

The Choquet integral is a tool for information fusion that is very effective when the fuzzy measures associated with it are well chosen. In this paper, we propose a new approach for calculating the fuzzy measures associated with the Choquet integral in a context of data fusion in multimodal biometrics. The proposed approach is based on genetic algorithms. It has been validated on two databases: the first contains synthetic scores, and the second is a biometric database relating to face, fingerprint and palmprint. The results achieved attest to the robustness of the proposed approach.
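
For reference, a minimal sketch of a discrete Choquet integral over three match scores; the fuzzy measure below is simply hard-coded for illustration, whereas the paper learns it with a genetic algorithm.

    def choquet_integral(scores, mu):
        """Discrete Choquet integral of scores (dict criterion -> value) w.r.t. fuzzy measure mu
        (dict frozenset of criteria -> weight, with mu(full set) = 1)."""
        items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending by score
        total, prev = 0.0, 0.0
        remaining = set(scores)
        for criterion, value in items:
            total += (value - prev) * mu[frozenset(remaining)]
            prev = value
            remaining.remove(criterion)
        return total

    # Hypothetical fuzzy measure over {face, fingerprint, palmprint}.
    mu = {
        frozenset(["face", "fingerprint", "palmprint"]): 1.0,
        frozenset(["face", "fingerprint"]): 0.8,
        frozenset(["face", "palmprint"]): 0.7,
        frozenset(["fingerprint", "palmprint"]): 0.6,
        frozenset(["face"]): 0.5,
        frozenset(["fingerprint"]): 0.4,
        frozenset(["palmprint"]): 0.3,
    }
    scores = {"face": 0.70, "fingerprint": 0.90, "palmprint": 0.60}
    print(round(choquet_integral(scores, mu), 3))  # -> 0.76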

Keywords: Multimodal biometrics, data fusion, Choquet integral, fuzzy measures, genetic algorithm.

PDF Downloads: 2493
32 Rethinking the Analysis of Means-End Chain Data in Marketing Research

Authors: P. Puustinen, A. Kanto

Abstract:

This paper proposes a new procedure for analyzing means-end chain data in marketing research. Most commonly, the collected data are summarized in a Hierarchical Value Map (HVM) illustrating the main attribute-consequence-value linkages. This paper argues that a traditionally constructed HVM may give an erroneous impression of the results of a means-end study. To justify the argument, an alternative procedure to (1) determine the dominant attribute-consequence-value linkages and (2) construct the HVM in a precise manner is presented. The current approach contributes to means-end analysis, allowing marketers to address a set of marketing problems, such as advertising strategy.

Keywords: Means-end chain analysis, Laddering, Hierarchical Value Map.

PDF Downloads: 2762
31 Frequency- and Content-Based Tag Cloud Font Distribution Algorithm

Authors: Ágnes Bogárdi-Mészöly, Takeshi Hashimoto, Shohei Yokoyama, Hiroshi Ishikawa

Abstract:

The spread of Web 2.0 has caused an explosion of user-generated content. Users can tag resources to describe and organize them, and tag clouds provide a rough impression of the relative importance of each tag within the overall cloud in order to facilitate browsing among numerous tags and resources. The goal of our paper is to enrich the visualization of tag clouds. A font distribution algorithm is proposed that calculates a novel metric based on frequency and content, and classifies tags into classes from this metric based on a power law distribution and percentages. The suggested algorithm has been validated and verified on the tag cloud of a real-world thesis portal.
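
As a loose illustration of frequency-driven font sizing (only the frequency half of the paper's frequency-and-content metric), a sketch that maps tag counts to a fixed number of font classes on a logarithmic scale, which is a rough way to cope with power-law distributed frequencies:

    import math

    def font_classes(tag_counts, n_classes=5):
        """Assign each tag a font class 1..n_classes from the log of its frequency."""
        logs = {t: math.log(c + 1) for t, c in tag_counts.items()}
        lo, hi = min(logs.values()), max(logs.values())
        span = (hi - lo) or 1.0
        return {t: 1 + int((v - lo) / span * (n_classes - 1)) for t, v in logs.items()}

    # Hypothetical tag frequencies from a thesis portal.
    counts = {"fingerprint": 120, "biometrics": 45, "wavelet": 9, "usability": 3, "gait": 1}
    print(font_classes(counts))  # e.g. {'fingerprint': 5, 'biometrics': 4, 'wavelet': 2, ...}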

Keywords: Tag cloud, font distribution algorithm, frequency-based, content-based, power law.

PDF Downloads: 2053
30 Crystalline Model Approach for Studying the Nuclear Properties of Light Nuclei

Authors: A. Amar, O. Hemeda

Abstract:

A study of the structure of the nucleus based on an analogy with solid-state physics has been developed. We have used the binding energy to calculate R (a parameter that is proportional to the radius of the nucleus) for the deuteron, the alpha particle, and 8Be. The parameter r calculated from solid-state physics provides a probe for calculating nuclear radii. 8Be receives special attention, as it is a radioactive nucleus and the latest nucleus to be calculated with the crystalline model approach. The distribution of nucleons inside the nucleus is taken to be tetrahedral for 16O. The model fails to predict the radius of 9Be, which gives an indication of the modifications that should be made to the model in the near future. A comparison between our calculations and those from the literature has been made, and good agreement has been obtained.

Keywords: The structure of the nucleus, binding energy, crystalline model approach, nuclear radii, tetrahedral for 16O.

PDF Downloads: 461
29 Genetic Combined with a Simplex Algorithm as an Efficient Method for the Detection of a Depressed Ellipsoidal Flaw using the Boundary Element Method

Authors: Clio G. Vossou, Ioannis N. Koukoulis, Christopher G. Provatidis

Abstract:

The present work addresses the solution of the defect identification problem with the use of an evolutionary algorithm combined with a simplex method. In more detail, a Matlab implementation of genetic algorithms is combined with a simplex method in order to lead to successful identification of the defect. The influence of the location and the orientation of the depressed ellipsoidal flaw was investigated, as well as the use of different amounts of static data in the cost function. The results were evaluated according to the ability of the simplex method to locate the global optimum in each test case. In this way, a clear impression was obtained regarding the performance of the novel combination of the optimization algorithms and the influence of the geometrical parameters of the flaw in defect identification problems.
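
A rough sketch of the hybrid global-plus-simplex idea using SciPy, with a toy quadratic cost standing in for the boundary-element misfit of the paper; differential evolution is used here as a stand-in for the genetic algorithm.

    import numpy as np
    from scipy.optimize import differential_evolution, minimize

    def cost(x):
        # Toy stand-in for the BEM-based misfit between measured and simulated static data;
        # the "true" flaw parameters are assumed to be (1.0, -2.0, 0.5).
        target = np.array([1.0, -2.0, 0.5])
        return float(np.sum((np.asarray(x) - target) ** 2))

    bounds = [(-5, 5), (-5, 5), (-5, 5)]
    coarse = differential_evolution(cost, bounds, seed=0, maxiter=50)   # global stage
    refined = minimize(cost, coarse.x, method="Nelder-Mead")            # simplex refinement
    print(refined.x, refined.fun)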

Keywords: Defect identification, genetic algorithms, optimization.

PDF Downloads: 1267
28 Decision Support Framework in Managerial Learning Environment for Organization

Authors: M. Mazhar Manzoor, Nasar.A, A. Sattar

Abstract:

In the field of decision support systems, the mental impression formed of a manager's decision when it is aided by a decision support system has been treated as a subject of greater importance than the ordinary, well-known one. Much of this study is an attempt to understand the relation between decision support system usage and the decision outcomes that govern the system. For example, several researchers have proposed many different models to analyze the linkage between decision support system processes and the results of decision making. This study draws out the important relation between a manager's mental approach and the use of a decision support system. The findings of this paper are theoretical attempts to provide a Decision Support System (DSS) in a way that exhibits and promotes learning in a semi-structured area. The proposed model shows the points of one's learning improvements and maintains a theoretical approach in order to explore the contribution of the DSS to enhancing decision forming and governing the system.

Keywords: Decision Support System, Learning Organization.

PDF Downloads: 1430
27 Discovery of Time Series Event Patterns based on Time Constraints from Textual Data

Authors: Shigeaki Sakurai, Ken Ueno, Ryohei Orihara

Abstract:

This paper proposes a method that discovers time series event patterns from textual data with time information. The patterns are composed of sequences of events and each event is extracted from the textual data, where an event is characteristic content included in the textual data such as a company name, an action, and an impression of a customer. The method introduces 7 types of time constraints based on the analysis of the textual data. The method also evaluates these constraints when the frequency of a time series event pattern is calculated. We can flexibly define the time constraints for interesting combinations of events and can discover valid time series event patterns which satisfy these conditions. The paper applies the method to daily business reports collected by a sales force automation system and verifies its effectiveness through numerical experiments.
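
A minimal sketch of checking one simple "within N days" constraint between two extracted event types (the event tuples are invented; the paper defines seven constraint types of its own):

    from datetime import date

    # Hypothetical events extracted from daily reports: (timestamp, event label).
    events = [
        (date(2024, 3, 1), "visit:CompanyA"),
        (date(2024, 3, 4), "impression:positive"),
        (date(2024, 3, 20), "order:received"),
    ]

    def occurs_within(events, first, second, max_gap_days):
        """True if some 'second' event follows some 'first' event within max_gap_days."""
        firsts = [t for t, e in events if e == first]
        seconds = [t for t, e in events if e == second]
        return any(0 <= (t2 - t1).days <= max_gap_days for t1 in firsts for t2 in seconds)

    print(occurs_within(events, "visit:CompanyA", "impression:positive", 7))  # True
    print(occurs_within(events, "impression:positive", "order:received", 7))  # False (16-day gap)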

Keywords: Text mining, sequential mining, time constraints, daily business reports.

PDF Downloads: 1463
26 Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems

Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara

Abstract:

Multimodal biometric systems integrate the data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common, since it has proven more effective than the other fusion levels. However, the scores from different modalities are generally heterogeneous, so a normalization step is needed to transform these scores into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in the context of fusing three unimodal systems based on the face, the palmprint and the fingerprint. We also propose a new adaptive normalization method that takes into account the distribution of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique. The proposed normalization method gave the best results.
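
For concreteness, a sketch of two of the standard normalization techniques typically compared in such studies (min-max and z-score), applied to invented match scores before fusion; the paper's adaptive method additionally uses the client and impostor score distributions.

    import statistics

    def min_max(scores):
        lo, hi = min(scores), max(scores)
        span = (hi - lo) or 1.0
        return [(s - lo) / span for s in scores]

    def z_score(scores):
        mu, sigma = statistics.mean(scores), statistics.pstdev(scores) or 1.0
        return [(s - mu) / sigma for s in scores]

    # Hypothetical raw scores from one matcher (each modality would be normalized separately).
    face_scores = [0.61, 0.72, 0.43, 0.88]
    print(min_max(face_scores))
    print(z_score(face_scores))
    # Once mapped to a common domain, per-modality scores can be combined,
    # e.g. by a simple sum or weighted-sum fusion rule.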

Keywords: Multibiometrics, Fusion, Score level, Score normalization, Adaptive normalization.

PDF Downloads: 3519
25 Skin Effect: A Natural Phenomenon for Minimization of Ground Bounce in VLSI RC Interconnect

Authors: Shilpi Lavania

Abstract:

As the frequency of operation has attained the GHz range and signal rise time continues to increase, interconnect technology is suffering from various high-frequency effects as well as the ground bounce problem. In some recent studies, a high-frequency effect, namely the skin effect, has been modeled and its drawbacks have been discussed. This paper strives to highlight the advantages of modeling the skin effect for an interconnect line. The proposed method considers a CMOS driver with an RC interconnect. Delay and noise are discussed, considering the ground bounce problem and the skin effect. The simulation results reveal an advantage of considering the skin effect for minimization of the ground bounce problem during the operation of the model. Noise and delay variations with temperature are also presented.
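
As a back-of-the-envelope companion to the discussion above, the standard skin-depth formula delta = sqrt(2*rho / (omega*mu)) evaluated for a copper conductor (textbook constants, not the paper's interconnect model):

    import math

    rho = 1.68e-8              # copper resistivity, ohm*m
    mu0 = 4 * math.pi * 1e-7   # permeability of free space, H/m (copper is essentially non-magnetic)

    def skin_depth(freq_hz):
        """Skin depth in metres at a given frequency."""
        omega = 2 * math.pi * freq_hz
        return math.sqrt(2 * rho / (omega * mu0))

    for f in (1e9, 5e9, 10e9):  # 1, 5 and 10 GHz
        print(f"{f/1e9:.0f} GHz: {skin_depth(f)*1e6:.2f} um")  # ~2.06 um at 1 GHz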

Keywords: Interconnect, Skin effect, Ground Bounce, Delay, Noise.

PDF Downloads: 3105
24 Improving Human Hand Localization in Indoor Environment by Using Frequency Domain Analysis

Authors: Wipassorn Vinicchayakul, Pichaya Supanakoon, Sathaporn Promwong

Abstract:

Human hand localization is revisited by using radar cross section (RCS) measurements with a minimum root mean square (RMS) error matching algorithm on a touchless keypad mock-up model. RCS and frequency transfer function measurements are carried out in an indoor environment over the frequency range from 3.0 to 11.0 GHz to cover Federal Communications Commission (FCC) standards. The touchless keypad model is tested at two different distances between the hand and the keypad. The initial distance of 19.50 cm is identical to the heights of the transmitting (Tx) and receiving (Rx) antennas, while the second distance is 29.50 cm from the keypad. Moreover, the effects of Rx angles relative to the human hand are considered. The RCS input parameters are compared with power loss parameters at each frequency. The results show that the performance of the RCS input parameters at the second distance, 29.50 cm, at 3 GHz is better than the others.
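
A minimal sketch of minimum-RMS-error matching between an observed parameter vector and stored reference vectors (invented numbers, standing in for the per-key RCS fingerprints):

    import math

    # Hypothetical reference vectors: key label -> parameter measured at three frequencies.
    references = {
        "key_1": [0.82, 0.40, 0.15],
        "key_2": [0.31, 0.76, 0.22],
        "key_3": [0.12, 0.35, 0.90],
    }

    def rms_error(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

    def match(observation, refs):
        """Return the reference key with minimum RMS error to the observation."""
        return min(refs, key=lambda k: rms_error(observation, refs[k]))

    print(match([0.80, 0.42, 0.18], references))  # -> key_1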

Keywords: Radar cross section (RCS), fingerprint-based localization, minimum root mean square (RMS) error matching algorithm, touchless keypad model.

PDF Downloads: 1326
23 A Parametric Study of an Inverse Elastostatics Problem (IESP) Using Simulated Annealing, Hooke & Jeeves and Sequential Quadratic Programming in Conjunction with Finite Element and Boundary Element Methods

Authors: Ioannis N. Koukoulis, Clio G. Vossou, Christopher G. Provatidis

Abstract:

The aim of the current work is to present a comparison among three popular optimization methods for the inverse elastostatics problem (IESP) of flaw detection within a solid. In more detail, the performance of a simulated annealing, a Hooke & Jeeves and a sequential quadratic programming algorithm was studied in the test case of one circular flaw in a plate, solved by both the boundary element method (BEM) and the finite element method (FEM). The optimization methods use a cost function based on the displacements of the static response. The methods were ranked according to the required number of iterations to converge and their ability to locate the global optimum. Hence, a clear impression was obtained regarding the performance of the aforementioned algorithms in flaw identification problems. Furthermore, the coupling of BEM or FEM with these optimization methods was investigated in order to track differences in their performance.

Keywords: Elastostatic, inverse problem, optimization.

PDF Downloads: 1846
22 A Multilanguage Source Code Retrieval System Using Structural-Semantic Fingerprints

Authors: Mohamed Amine Ouddan, Hassane Essafi

Abstract:

Source code retrieval is of immense importance in the software engineering field. The complex tasks of retrieving and extracting information from source code documents are vital in the development cycle of large software systems. The two main subtasks which result from these activities are code duplication prevention and plagiarism detection. In this paper, we propose a source code retrieval system based on a two-level fingerprint representation, capturing respectively the structural and the semantic information within a source code. A sequence alignment technique is applied to these fingerprints in order to quantify the similarity between source code portions. The specific purpose of the system is to detect plagiarism and duplicated code between programs written in different programming languages belonging to the same class, such as C, C++, Java and C#. These four languages are supported by the current version of the system, which is designed so that it may be easily adapted to any programming language.
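
A minimal sketch of scoring the similarity of two token-level "fingerprints" with a sequence-matching ratio (Python's difflib here, not the system's own alignment engine; the token categories are invented):

    from difflib import SequenceMatcher

    # Hypothetical structural fingerprints: sequences of abstract token categories.
    fp_a = ["func", "loop", "assign", "call", "return"]
    fp_b = ["func", "assign", "loop", "call", "return"]

    similarity = SequenceMatcher(None, fp_a, fp_b).ratio()  # 1.0 means identical sequences
    print(f"similarity = {similarity:.2f}")
    # A high ratio between the fingerprints of two programs would flag possible duplication or plagiarism.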

Keywords: Source code retrieval, plagiarism detection, clone detection, sequence alignment.

PDF Downloads: 1761
21 Cellulose Nanocrystals Suspensions as Water-Based Lubricants for Slurry Pump Gland Seals

Authors: Mohammad Javad Shariatzadeh, Dana Grecov

Abstract:

Tribological tests were performed on a new tribometer in order to measure the coefficient of friction of a gland seal packing material on stainless steel shafts in the presence of a Cellulose Nanocrystal (CNC) suspension as a sustainable, environmentally friendly, water-based lubricant. To simulate the real situation in slurry pumps, silica sand was used as the slurry particles. The surface profiles after the tests were measured by an interferometer microscope to characterize the surface wear. Moreover, the coefficient of friction and surface wear were measured between a stainless steel shaft and a chrome steel ball to investigate the tribological effects of CNC in the boundary lubrication region. Alignment of the nanoparticles in the CNC suspensions is the main reason for the friction and wear reduction. The homogeneous concentrated suspensions showed the fingerprint patterns of a chiral nematic liquid crystal. These properties make CNC a very good lubricant additive in water.

Keywords: Gland seal, lubricant additives, nanocrystalline cellulose, water-based lubricants.

PDF Downloads: 785
20 Seismic Base Shear Force Depending on Building Fundamental Period and Site Conditions: Deterministic Formulation and Probabilistic Analysis

Authors: S. Dorbani, M. Badaoui, D. Benouar

Abstract:

The aim of this paper is to investigate the effect of the building fundamental period of reinforced concrete buildings of 6, 9, and 12 storeys, with different floor plans: symmetric, mono-symmetric, and unsymmetric. These structures are erected at different epicentral distances. Using data from the Boumerdes, Algeria (2003) earthquake, we focused primarily on establishing a deterministic formulation linking the base shear force to two parameters: the fundamental period, which represents the numerical fingerprint of the structure, and the epicentral distance, which represents the impact of the earthquake on this force. In a second step, to highlight the effect of uncertainty in these parameters on the analyzed response, the parameters are modeled as random variables with a log-normal distribution. Varying the coefficients of variation of the chosen uncertain parameters showed that the effect of the fundamental period uncertainty on the statistics of the seismic base shear force is low compared to the influence of the epicentral distance uncertainty.
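
A sketch of the probabilistic step: sampling the two parameters as log-normal random variables and propagating them through a deterministic relation; the power-law relation and all numbers below are placeholders, not the formulation established in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Log-normal samples for fundamental period T (s) and epicentral distance R (km); illustrative only.
    T = rng.lognormal(mean=np.log(0.6), sigma=0.10, size=n)
    R = rng.lognormal(mean=np.log(30.0), sigma=0.25, size=n)

    # Hypothetical deterministic relation V = a * T**b * R**c standing in for the fitted formula.
    a, b, c = 1200.0, -0.8, -1.1
    V = a * T**b * R**c

    print(f"mean base shear ~ {V.mean():.1f}, CV ~ {V.std() / V.mean():.2f}")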

Keywords: Base shear force, fundamental period, epicentral distance, uncertainty, lognormal variable, statistics.

PDF Downloads: 1266
19 The Concept of Place and Sense of Place In Architectural Studies

Authors: Mina Najafi, Mustafa Kamal Bin Mohd Shariff

Abstract:

Place is a 'where' dimension formed by people's relationships with physical settings, individual and group activities, and meanings. 'Place Attachment', 'Place Identity' and 'Sense of Place' are some concepts that can describe the quality of people's relationships with a place. The concept of Sense of Place is used in studying human-place bonding, attachment and place meaning. Sense of Place is usually defined as an overarching impression encompassing the general ways in which people feel about places, sense them, and assign concepts and values to them. Sense of Place is highlighted in this article as one of the prevailing concepts in place-based research. Considering the dimensions of Sense of Place has always been beneficial for investigating public place attachment and pro-environmental attitudes towards these places. The creation or preservation of Sense of Place is important in maintaining the quality of the environment as well as the integrity of human life within it. While many scholars have argued that Sense of Place is a vague concept, this paper summarizes and analyzes the existing seminal literature. Therefore, in this paper, the concept of Sense of Place and its characteristics are first examined; afterwards, the scales of Sense of Place are reviewed, the factors that contribute to forming Sense of Place are evaluated, and finally Place Attachment, as an objective dimension for measuring Sense of Place, is described.

Keywords: Place, Place Attachment, Sense of place

PDF Downloads: 18733
18 Fingerprint Compression Using Multiwavelets

Authors: Sudhakar.R, Jayaraman.S

Abstract:

Large volumes of fingerprints are collected and stored every day in a wide range of applications, including forensics, access control, etc. This is evident from the database of the Federal Bureau of Investigation (FBI), which contains more than 70 million fingerprints. Compression of such a database is very important because of its high volume. The performance of existing image coding standards generally degrades at low bit-rates because of the underlying block-based Discrete Cosine Transform (DCT) scheme. Over the past decade, the success of wavelets in solving many different problems has contributed to their unprecedented popularity. Due to implementation constraints, scalar wavelets do not possess all the properties needed for better compression performance. A new class of wavelets called 'multiwavelets', which possess more than one scaling filter, overcomes this problem. The objective of this paper is to develop an efficient compression scheme and to obtain better quality and a higher compression ratio through the multiwavelet transform and embedded coding of multiwavelet coefficients with the Set Partitioning In Hierarchical Trees (SPIHT) algorithm. A comparison of the best known multiwavelets is made against the best known scalar wavelets. Both quantitative and qualitative measures of performance are examined for fingerprints.
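
As a simplified illustration of transform-domain compression (scalar wavelets and plain coefficient thresholding via PyWavelets, standing in for the multiwavelet transform and SPIHT coder of the paper):

    import numpy as np
    import pywt

    # Random 64x64 block standing in for a real fingerprint image.
    img = np.random.default_rng(0).random((64, 64))

    coeffs = pywt.wavedec2(img, "db4", level=3)               # 2-D wavelet decomposition
    arr, slices = pywt.coeffs_to_array(coeffs)

    keep = 0.05                                               # keep only the largest 5% of coefficients
    thresh = np.quantile(np.abs(arr), 1 - keep)
    arr_kept = np.where(np.abs(arr) >= thresh, arr, 0.0)

    rec = pywt.waverec2(pywt.array_to_coeffs(arr_kept, slices, output_format="wavedec2"), "db4")
    print("reconstruction RMSE:", float(np.sqrt(np.mean((img - rec[:64, :64]) ** 2))))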

Keywords: Multiwavelet, Modified SPIHT Algorithm, SPIHT, Wavelet.

PDF Downloads: 1583
17 Pattern Recognition Techniques Applied to Biomedical Patterns

Authors: Giovanni Luca Masala

Abstract:

Pattern recognition is the research area of Artificial Intelligence that studies the operation and design of systems that recognize patterns in data. Important application areas are image analysis, character recognition, fingerprint classification, speech analysis, DNA sequence identification, man and machine diagnostics, person identification and industrial inspection. The interest in improving the classification systems of data analysis is independent of the context of application. In fact, in many studies it is often necessary to recognize and distinguish groups of various objects, which requires valid instruments capable of performing this task. The objective of this article is to show several Artificial Intelligence methodologies for data classification applied to biomedical patterns. In particular, this work deals with the realization of a Computer-Aided Detection system (CADe) that is able to assist the radiologist in identifying types of mammary tumor lesions. As an additional biomedical application of the classification systems, we present a study conducted on blood samples which shows how these methods may help to distinguish between carriers of Thalassemia (or Mediterranean Anaemia) and healthy subjects.

Keywords: Computer Aided Detection, mammary tumor, pattern recognition, dissimilarity

PDF Downloads: 2328
16 Artificial Intelligence Techniques Applied to Biomedical Patterns

Authors: Giovanni Luca Masala

Abstract:

Pattern recognition is the research area of Artificial Intelligence that studies the operation and design of systems that recognize patterns in data. Important application areas are image analysis, character recognition, fingerprint classification, speech analysis, DNA sequence identification, man and machine diagnostics, person identification and industrial inspection. The interest in improving the classification systems of data analysis is independent of the context of application. In fact, in many studies it is often necessary to recognize and distinguish groups of various objects, which requires valid instruments capable of performing this task. The objective of this article is to show several Artificial Intelligence methodologies for data classification applied to biomedical patterns. In particular, this work deals with the realization of a Computer-Aided Detection system (CADe) that is able to assist the radiologist in identifying types of mammary tumor lesions. As an additional biomedical application of the classification systems, we present a study conducted on blood samples which shows how these methods may help to distinguish between carriers of Thalassemia (or Mediterranean Anaemia) and healthy subjects.

Keywords: Computer Aided Detection, mammary tumor, pattern recognition, thalassemia.

PDF Downloads: 1401
15 Corporate Cautionary Statement: A Genre of Professional Communication

Authors: Chie Urawa

Abstract:

Cautionary statements or disclaimers in corporate annual reports need to be carefully designed, because clear cautionary statements may protect a company in the case of legal disputes but may undermine positive impressions. This study compares the language of cautionary statements using two corpora, Sony’s cautionary statement corpus (S-corpus) and Panasonic’s cautionary statement corpus (P-corpus), illustrating the differences and similarities in the use of meaningful cautionary statements and critically analyzing why practitioners write them the way they do. The findings describe the distinct differences between the two companies in the presentation of risk factors and the way they make the statements. The word ability is used more for legal protection in the S-corpus, whereas the word possibility is used more to convey a better impression in the P-corpus. The main similarities are identified in the use of lexical words and pronouns, and in almost the same wording over eight years. The findings show how each company makes its statements unique in the presentation of risk factors, and the characteristics of this specific genre of professional communication. Important implications of this study are that a more comprehensive approach can be applied in other contexts and can be used by companies to reflect upon their cautionary statements.

Keywords: Cautionary statements, corporate annual reports, corpus, risk factors.

PDF Downloads: 826
14 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

Authors: Ramon Santana

Abstract:

The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, raises great concern about the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system, six of which allow obtaining the minutiae template in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed. However, vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of data using two levels of security. The first level is the data transformation level: it generates data invariant to rotation and translation, and the further transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating the known vulnerabilities of previously proposed models, basing its security on the infeasibility of polynomial reconstruction.
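
A minimal sketch of the idea behind the first security level: re-expressing minutiae relative to a reference minutia so that the resulting features do not change under rotation or translation of the print (illustrative geometry only, not the authors' transformation, which is additionally irreversible):

    import math

    # Hypothetical minutiae: (x, y, angle_in_radians).
    minutiae = [(100, 120, 0.3), (140, 90, 1.1), (80, 160, 2.0)]

    def invariant_features(points, ref_index=0):
        """(distance, relative bearing, relative angle) of each minutia w.r.t. a reference minutia;
        these triples are unchanged by rotating or translating the whole print."""
        rx, ry, ra = points[ref_index]
        feats = []
        for i, (x, y, a) in enumerate(points):
            if i == ref_index:
                continue
            d = math.hypot(x - rx, y - ry)
            bearing = (math.atan2(y - ry, x - rx) - ra) % (2 * math.pi)
            rel_angle = (a - ra) % (2 * math.pi)
            feats.append((round(d, 2), round(bearing, 3), round(rel_angle, 3)))
        return feats

    print(invariant_features(minutiae))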

Keywords: Fingerprint, template protection, bio-cryptography, minutiae protection.

PDF Downloads: 803
13 Identity Management in Virtual Worlds Based on Biometrics Watermarking

Authors: S. Bader, N. Essoukri Ben Amara

Abstract:

With technological development and the rise of virtual worlds, these spaces are becoming more and more attractive to cybercriminals hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage of this to gain unauthorized access and engage in cybercrime. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to install a strong security mechanism to protect the virtual identities represented by avatars. Thus, only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. During registration, a user fingerprint is enrolled and then encrypted into a watermark using a cancelable and non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds, and we have obtained promising performance results in terms of authentication accuracy and acceptance and rejection rates.

Keywords: Identity management, security, biometrics authentication and authorization, avatar, virtual world.

PDF Downloads: 1619
12 Sensory Characterization of Cookies with Chestnut Flour

Authors: Ljubica Dokić, Ivana Nikolić, Dragana Šoronja–Simović, Biljana Pajin, Nils Juul

Abstract:

In this work, the sensory characteristics of cookies with different amounts of chestnut flour were determined by sensory and instrumental methods. The wheat flour for the cookies was substituted with chestnut flour at three different levels (20, 40 and 60%), and the dough moisture was 22%. The control sample contained 100% wheat flour. The sensory quality of the cookies was described using quantitative descriptive analysis (QDA) by six trained members of a descriptive panel. Instrumental evaluation included texture characterization by a texture analyzer, colour measurements (CIE L*a*b* system) and determination by videometer.

The samples with 20% chestnut flour had the highest weighted score for overall sensory impression (17.6), which is very close to the score for the control sample (18). An increase in the amount of chestnut flour caused a decrease in the scores for all sensory properties, and thus the overall sensory score also decreased. Compared to the control sample, and with increasing amounts of chestnut flour, instrumental determination of the samples confirmed the sensory analysis results: the hardness of the cookies increased, as did the values of the red (a*) and yellow (b*) colour coordinates, while the values for lightness (L*) decreased. Also, the values evaluated by videometer at a defined wavelength were the highest for the control cookies and decreased with increasing amounts of chestnut flour.

Keywords: Cookies, chestnut flour, sensory characteristics.

PDF Downloads: 2788
11 Differential Analysis: Crew Resource Management and Profiles on the Balanced Inventory of Desirable Responding

Authors: Charalambos C. Cleanthous, Ryan Sain, Tabitha Black, Stephen Vera, Suzanne Milton

Abstract:

A concern when administering questionnaires is whether the participant is providing information that is accurate. The results may be invalid because the person is trying to present themselves in an unrealistically positive manner, referred to as ‘faking good’, or in an unrealistically negative manner, known as ‘faking bad’. The Balanced Inventory of Desirable Responding (BIDR) was used to assess commercial pilots’ responses on the two subscales of the BIDR, impression management (IM) and self-deceptive enhancement (SDE), each of which can yield high or low scores. Thus, the BIDR produces four valid profiles: IM low and SDE low, IM high and SDE low, IM low and SDE high, and IM high and SDE high. These profiles were used to compare the respondents’ answers to crew resource management (CRM) items developed from the USA Federal Aviation Administration’s (FAA) guidelines for CRM composition and training. Of particular interest were the results on the IM subscale. The comparisons between those scoring high (lying or faking) and those scoring low on IM suggest that there were significant differences in their views of the various dimensions of CRM. One of the more disconcerting conclusions is that the high IM scores suggest that the pilots were trying to impress rather than honestly answer the questions regarding their CRM training and practice.

Keywords: USA commercial pilots, crew resource management, faking, social desirability.

PDF Downloads: 911