Search results for: AI algorithm internal audit
5323 Existence of Financial Service Authority Prior to 2045
Authors: Syafril Hendrik Hutabarat, Hartiwiningsih, Pujiyono Suwadi
Abstract:
The Financial Service Authority (FSA) was formed in response to the 1997 monetary crisis and the 2008 financial crisis, so it was more defensive in nature, while developments in information and communication technology have required state policies to be more offensive to keep up with the times. Reconstruction of the authorities of the FSA's investigators is intended to keep the agency worthy of being part of an integrated criminal justice system in Indonesia, which has implications for expanding its authority in line with efforts to protect and increase the welfare of the people. This research method is empirical. The results show that internal synergy between sub-sectors in the financial services sector is not optimised, and some are even left behind, so that the FSA is not truly an authority in the financial services sector. The goal of synergy must begin with internal synergy, which has its moment as Indonesia receives a demographic bonus in the 2030s and becomes an international logistics hub supported by the national financial services sector.
Keywords: reconstruction, authorities, FSA investigators, synergy, demography
Procedia PDF Downloads 76
5322 An Improved OCR Algorithm on Appearance Recognition of Electronic Components Based on Self-adaptation of Multifont Template
Authors: Zhu-Qing Jia, Tao Lin, Tong Zhou
Abstract:
Optical Character Recognition has been extensively utilized, yet it is rarely employed specifically for the recognition of electronic components. This paper proposes a highly effective algorithm for appearance identification of integrated circuit components based on existing methods of character recognition, and analyzes its pros and cons.
Keywords: optical character recognition, fuzzy page identification, mutual correlation matrix, confidence self-adaptation
Procedia PDF Downloads 540
5321 A Proposed Algorithm for Obtaining the Map of Subscribers’ Density Distribution for a Mobile Wireless Communication Network
Authors: C. Temaneh-Nyah, F. A. Phiri, D. Karegeya
Abstract:
This paper presents an algorithm for obtaining the map of subscribers' density distribution for a mobile wireless communication network, based on actual subscriber traffic data obtained from the base station. This is useful in the statistical characterization of the mobile wireless network.
Keywords: electromagnetic compatibility, statistical analysis, simulation of communication network, subscriber density
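The step from per-subscriber traffic records to a density map can be sketched as a simple weighted binning over a spatial grid. This is an illustrative scheme only, not the authors' algorithm; the function name, inputs and grid layout are all hypothetical:

```python
import numpy as np

def density_map(traffic, positions, grid_shape, bounds):
    """Accumulate per-subscriber traffic into a 2D density grid.

    traffic   : array of traffic volumes, one per subscriber (hypothetical input)
    positions : (N, 2) array of estimated subscriber coordinates
    bounds    : ((xmin, xmax), (ymin, ymax)) of the service area
    """
    (xmin, xmax), (ymin, ymax) = bounds
    rows, cols = grid_shape
    grid = np.zeros(grid_shape)
    # Map each position to a grid cell index, clipping points on the border.
    xi = np.clip(((positions[:, 0] - xmin) / (xmax - xmin) * cols).astype(int), 0, cols - 1)
    yi = np.clip(((positions[:, 1] - ymin) / (ymax - ymin) * rows).astype(int), 0, rows - 1)
    # Sum the traffic weights falling into each cell.
    np.add.at(grid, (yi, xi), traffic)
    # Normalise so the grid integrates to 1, i.e. a density distribution.
    return grid / grid.sum()
```

A real system would derive `positions` from base-station measurements rather than take them as given.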
Procedia PDF Downloads 309
5320 An Efficient Algorithm of Time Step Control for Error Correction Method
Authors: Youngji Lee, Yonghyeon Jeon, Sunyoung Bu, Philsu Kim
Abstract:
The aim of this paper is to construct an algorithm of time step control for the error correction method most recently developed by one of the authors for solving stiff initial value problems. It is achieved with the generalized Chebyshev polynomial and the corresponding error correction method. The main idea of the proposed scheme is the use of duplicated node points in the generalized Chebyshev polynomials of two different degrees, adding the necessary sample points instead of re-sampling all points. At each integration step, the proposed method comprises two equations, for the solution and for the error, respectively. The constructed algorithm controls both the error and the time step size simultaneously and performs well in computational cost compared to the original method. Two stiff problems are numerically solved to assess the effectiveness of the proposed scheme.
Keywords: stiff initial value problem, error correction method, generalized Chebyshev polynomial, node points
Procedia PDF Downloads 573
5319 A Context-Sensitive Algorithm for Media Similarity Search
Authors: Guang-Ho Cha
Abstract:
This paper presents a context-sensitive media similarity search algorithm. One of the central problems in media search is the semantic gap between the low-level features computed automatically from media data and the human interpretation of them. This is because the notion of similarity is usually based on high-level abstraction, but the low-level features sometimes do not reflect human perception. Many media search algorithms have used the Minkowski metric to measure similarity between image pairs. However, those functions cannot adequately capture the characteristics of the human visual system, nor the nonlinear relationships in the contextual information given by images in a collection. Our search algorithm tackles this problem by employing a similarity measure and a ranking strategy that reflect the nonlinearity of human perception and the contextual information in a dataset. Similarity search in an image database based on this contextual information shows encouraging experimental results.
Keywords: context-sensitive search, image search, similarity ranking, similarity search
Procedia PDF Downloads 365
5318 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces
Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet
Abstract:
In the present work we developed an image processing algorithm to measure water droplet characteristics during dropwise condensation on pillared surfaces. The main difficulty in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies a corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometrical and intensity properties. The information related to droplet evolution over time, including mean radius and number of drops per unit area, is then extracted from the binary images. The developed image processing algorithm is verified using manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.
Keywords: dropwise condensation, textured surface, image processing, watershed
Procedia PDF Downloads 223
5317 Stress Distribution in Axisymmetric Indentation of an Elastic Layer-Substrate Body
Authors: Kotaro Miura, Makoto Sakamoto, Yuji Tanabe
Abstract:
We focus on the internal stress and displacement of an elastic axisymmetric contact problem for indentation of a layer-substrate body. An elastic layer is assumed to be perfectly bonded to an elastic semi-infinite substrate. The elastic layer is smoothly indented with a flat-ended cylindrical indenter. Analytical and exact solutions were obtained by solving an infinite system of simultaneous equations, using a method that expresses the normal contact stress at the upper surface of the elastic layer as an appropriate series. This paper presents numerical results for the internal stress and displacement distributions of a hard-coating system with constant values of Poisson's ratio and elastic layer thickness.
Keywords: indentation, contact problem, stress distribution, coating materials, layer-substrate body
Procedia PDF Downloads 156
5316 Analysis of Cooperative Learning Behavior Based on the Data of Students' Movement
Authors: Wang Lin, Li Zhiqiang
Abstract:
The purpose of this paper is to analyze cooperative learning behavior patterns based on data on students' movement. The study first reviews cooperative learning theory and its research status, and briefly introduces the k-means clustering algorithm. It then uses the clustering algorithm and mathematical statistics to analyze the activity rhythm of individual students and of groups in different functional areas, according to the movement data provided by 10 first-year graduate students. It also focuses on the analysis of students' behavior in the learning area and explores the law of cooperative learning behavior. The results show that the cooperative learning behavior analysis method based on movement data proposed in this paper is feasible: from the data analysis, the characteristics of students' behavior and their cooperative learning behavior patterns could be found.
Keywords: behavior pattern, cooperative learning, data analysis, k-means clustering algorithm
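The k-means step the study relies on can be sketched in plain Lloyd-iteration form. This is the generic textbook algorithm, not the authors' code, and it omits an empty-cluster guard for brevity; the sample movement points in the test are invented for illustration:

```python
import numpy as np

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's k-means: assign each point to its nearest
    centroid, then move each centroid to the mean of its points.
    (No empty-cluster handling; fine for well-separated data.)"""
    rng = np.random.default_rng(seed)
    # Initialise centroids as k distinct data points.
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Distance from every point to every centroid, shape (N, k).
        d = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned points.
        new = np.array([points[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids
```

Clustering students' (x, y) positions over time in this way yields the per-area activity groups the abstract describes.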
Procedia PDF Downloads 187
5315 An Algorithm for Determining the Arrival Behavior of a Secondary User to a Base Station in Cognitive Radio Networks
Authors: Danilo López, Edwin Rivas, Leyla López
Abstract:
This paper presents the development of an algorithm that predicts the arrival of a secondary user (SU) at a base station (BS) in an infrastructure-based cognitive network, requesting a Best Effort (BE) or Real Time (RT) type of service with a determined bandwidth (BW), implemented with neural networks. The algorithm dynamically uses a neural network construction technique based on the geometric pyramid topology and trains a Multilayer Perceptron Neural Network (MLPNN) on the historical arrivals of an SU to estimate future requests. This allows the information in the BS to be managed efficiently, since the prediction precedes the arrival of the SUs at the stage of selecting the best channel in the CRN. As a result, the software application determines the probability of arrival at a future time point and calculates performance metrics to measure the effectiveness of the predictions made.
Keywords: cognitive radio, base station, best effort, MLPNN, prediction, real time
Procedia PDF Downloads 331
5314 Through Additive Manufacturing. A New Perspective for the Mass Production of Made in Italy Products
Authors: Elisabetta Cianfanelli, Paolo Pupparo, Maria Claudia Coppola
Abstract:
Recent evolutions in innovation processes and in the intrinsic tendencies of the product development process lead to new considerations on the design flow. The instability and complexity that describe contemporary life define new problems in the production of products, stimulating at the same time the adoption of new solutions across the entire design process. The advent of Additive Manufacturing, but also of IoT and AI technologies, continuously puts us in front of new paradigms regarding design as a social activity. From the point of view of application, the totality of these technologies describes a whole series of problems and considerations immanent to design thinking. Addressing these problems may require some initial intuition and the use of a provisional set of rules or plausible strategies, i.e., heuristic reasoning. At the same time, however, the evolution of digital technology and the computational speed of new design tools describe a new and contrary design framework in which to operate. It is therefore interesting to understand the opportunities and boundaries of the new man-algorithm relationship. The contribution investigates the man-algorithm relationship starting from the state of the art of the Made in Italy model: the best-known fields of application are described, with a focus on specific cases in which the mutual relationship between man and AI becomes a new driving force of innovation for entire production chains. On the other hand, the use of algorithms could engulf many design phases, such as the definition of shape, dimensions, proportions, materials, static verifications, and simulations. Operating in this context therefore becomes a strategic action, capable of defining fundamental choices for the design of product systems in the near future. If there is a human-algorithm combination within a new integrated system, quantitative values can be controlled in relation to qualitative and material values.
The trajectory described therefore becomes a new design horizon in which to operate, where it is interesting to highlight the good practices that already exist. In this context, the designer developing new forms can experiment with ways still unexpressed in the project and can define a new synthesis and simplification of algorithms, so that each artifact carries a signature that defines it in all its parts, emotional and structural. This signature of the designer, a combination of values and design culture, will be internal to the algorithms and able to relate to digital technologies, creating a generative dialogue for design purposes. The result envisaged indicates a new vision of digital technologies, no longer understood only as custodians of vast quantities of information, but also as valid integrated tools in close relationship with design culture.
Keywords: decision making, design heuristics, product design, product design process, design paradigms
Procedia PDF Downloads 119
5313 Effect of Diazepam on Internal Organs of Chrysomya megacephala Using Micro-Computed Tomograph
Authors: Sangkhao M., Butcher B. A.
Abstract:
Diazepam (known as Valium) is a medication with a calming effect. Many reports on suicide cases show that diazepam is frequently used for this purpose. This research aims to study the effect of diazepam on the development of forensically important blowflies, Chrysomya megacephala (Diptera: Calliphoridae), using micro-computed tomography (micro-CT). In this study, four rabbits were treated with three different lethal doses of diazepam and one control (LD₀, LD₅₀, LD₁₀₀ and LC). The rabbits' livers were removed for rearing the blowflies. Pupae were sampled at two stages of development (ages; S1: 24 h and S2: 120 h). After specimen preparation, all samples were scanned by micro-CT using a SkyScan 1172. The results show the effect of diazepam on internal organs and tissues such as the brain, body cavity, gas bubble, meconium and especially the fat body. In the control group in series 1 (LCS1), the fat body was equally dispersed in the head, thorax and abdomen; development of internal organs was not complete, but the brain, thoracic muscle, wings, legs and rectum were observable at 24 h after entering the pupal stage. Development of each organ in the control group in series 2 was complete. In the treatment groups LD₀, LD₅₀ and LD₁₀₀ (series 1 and series 2), tissues differed: a gas bubble was observed in LD₀S1 due to rapid morphological changes during the metamorphosis of the blowfly's pupa in this treatment, and meconium was observed in the LD₅₀S2 group because excretion of metabolic waste was not complete. All samples in the treatment groups showed differentiated fat bodies because metabolic activities were not complete, and these changes affected the functions of every internal system.
The discovery of differentiated fat bodies is an important result because the insect fat body functions like the liver in humans; it shows that toxin elimination from the blowfly's body and the homeostatic maintenance of hemolymph proteins, lipids and carbohydrates in each treatment group are abnormal.
Keywords: forensic toxicology, forensic entomology, Diptera, diazepam
Procedia PDF Downloads 127
5312 Liaison Psychiatry in Baixo Alentejo, Portugal: Reality and Perspectives
Authors: Mariana Mangas, Yaroslava Martins, M. Suárez, Célia Santos, Ana Matos Pires
Abstract:
Baixo Alentejo is a region of Portugal characterized by an aging population, geographic isolation, social deprivation and a lack of medical staff. It is one of the most problematic regions with regard to mental health, particularly due to the factors mentioned. The aim of this study is to present liaison psychiatry at Hospital José Joaquim Fernandes: a sample of the work done, the current situation and future perspectives, through a retrospective study of internal psychiatric emergencies from January 1st, 2016 to August 31st, 2016. Liaison psychiatry at the Department of Psychiatry and Mental Health (Psychiatry Service) of ULSBA includes the following activities: internal psychiatric emergencies, HIV consultation (comprised in the general consultation) and liaison psychology (oncology and pain), amounting to a total of 111 internal psychiatric emergencies during the identified period. Gender distribution was uniform. The most prevalent age group was 71-80 years, and 66.6% of patients were 60 years old or over. The majority of emergency observations were requested by the hospital services of medicine (56.8%) and surgery (24.3%). The most frequent reasons for admission were respiratory disease (18.0%), tumors (15.3%), other surgical and orthopedic pathology (14.5%) and stroke (11.7%). The most frequent psychiatric diagnoses were neurotic and organic depression (24.3%), delirium (26.1%) and adjustment reaction (14.5%). Major psychiatric pathology (schizophrenia and affective disorders) was found in 10.8%. Antidepressive medication was prescribed in 37.8% of patients and antipsychotics in 34.2%. In 9.9% of cases, no psychotropic drug was prescribed, and 5.4% of patients received psychological support. Regarding hospital discharge, 42.4% of patients were referred to the general practitioner or to a medical specialist, 22.5% to outpatient gerontopsychiatry and 17.1% to psychiatric outpatient care, while 14.4% were deceased.
A future perspective is to start liaison work in the areas of HIV and psycho-oncology with a multidisciplinary approach, and to improve collaboration with colleagues of other specialties to refine psychiatric referrals.
Keywords: psychiatry, liaison, internal emergency, psychiatric referral
Procedia PDF Downloads 249
5311 Image Reconstruction Method Based on L0 Norm
Authors: Jianhong Xiang, Hao Xiang, Linyu Wang
Abstract:
Compressed sensing (CS) has a wide range of applications in sparse signal reconstruction. Aiming at the low recovery accuracy and long reconstruction time of existing reconstruction algorithms in medical imaging, this paper proposes a corrected smoothed L0 algorithm based on compressed sensing (CSL0). First, an approximate hyperbolic tangent function (AHTF) that is closer to the L0 norm is proposed to approximate the L0 norm. Secondly, in view of the "sawtooth phenomenon" in the steepest descent method and the sensitivity of the modified Newton method to the choice of initial value, the steepest descent method and the modified Newton method are jointly optimized to improve the reconstruction accuracy. Finally, the CSL0 algorithm is simulated on various images. The results show that the proposed algorithm improves the reconstruction accuracy of the test images by 0-0.98 dB.
Keywords: smoothed L0, compressed sensing, image processing, sparse reconstruction
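The idea of a smooth surrogate for the L0 norm can be illustrated with a hyperbolic-tangent function. The paper's exact AHTF is not given in the abstract, so the form below is an assumption chosen only to show the limiting behaviour: as sigma shrinks, each term tends to 1 for nonzero entries and 0 for zero entries, so the sum approaches the exact nonzero count:

```python
import numpy as np

def approx_l0(x, sigma):
    """Smooth surrogate for the L0 norm (illustrative form, not the
    paper's AHTF): tanh(x_i^2 / (2*sigma^2)) -> 1 for x_i != 0 and
    -> 0 for x_i == 0 as sigma -> 0, so the sum counts nonzeros."""
    return np.sum(np.tanh(x ** 2 / (2 * sigma ** 2)))

x = np.array([0.0, 0.0, 3.0, -1.5, 0.0])  # exact L0 norm is 2
```

Because the surrogate is differentiable, it can be minimised with gradient-based schemes such as the jointly optimised steepest descent / modified Newton iteration the abstract describes.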
Procedia PDF Downloads 115
5310 FE Analysis of Blade-Disc Dovetail Joints Using Mortar Base Frictional Contact Formulation
Authors: Abbas Moradi, Mohsen Safajoy, Reza Yazdanparast
Abstract:
Analysis of blade-disc dovetail joints is one of the biggest challenges facing designers of aero-engines. To avoid comparatively expensive experimental full-scale tests, numerical methods can be used to simulate the loaded disc-blade assembly. The mortar method provides a powerful and flexible tool for solving frictional contact problems. In this study, 2D frictional contact in a dovetail has been analysed based on the mortar algorithm. To model the friction, the classical Coulomb law and the moving friction cone algorithm are applied. The solution is then obtained by solving the resulting set of nonlinear equations with an efficient numerical algorithm based on the Newton-Raphson method. The numerical results show that this approach has a better convergence rate and accuracy than other proposed numerical methods.
Keywords: computational contact mechanics, dovetail joints, nonlinear FEM, mortar approach
Procedia PDF Downloads 352
5309 Offset Dependent Uniform Delay Mathematical Optimization Model for Signalized Traffic Network Using Differential Evolution Algorithm
Authors: Tahseen Saad, Halim Ceylan, Jonathan Weaver, Osman Nuri Çelik, Onur Gungor Sahin
Abstract:
A new offset-dependent uniform delay mathematical optimization problem is derived as the main objective of this study, using a differential evolution algorithm, to control the coordination problem, which depends on offset selection, and to estimate uniform delay based on the offset choice in a traffic signal network. The assumption is a periodic sinusoidal function for arrival and departure patterns. The cycle time is optimized at the entry links, and the optimized value is used in the non-entry links as a common cycle time. The offset optimization algorithm is used to calculate the uniform delay at each link. The results are illustrated with a case study and are compared with the canonical uniform delay model derived by Webster and with the Highway Capacity Manual's model. The findings show the new model minimizes the total uniform delay to almost half that of the conventional models. The mathematical objective function is robust, and the algorithm convergence time is fast.
Keywords: area traffic control, traffic flow, differential evolution, sinusoidal periodic function, uniform delay, offset variable
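For reference, the canonical uniform delay against which the new model is compared is Webster's first term. A standard statement, with symbols following common traffic-engineering usage rather than taken from the paper (cycle length C, effective green ratio g/C, degree of saturation X), is:

```latex
d_1 = \frac{C \left(1 - g/C\right)^2}{2\left(1 - (g/C)\min(1, X)\right)}
```

This term assumes uniform arrivals over the cycle; the offset-dependent model above replaces that assumption with a sinusoidal arrival/departure pattern shifted by the offset.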
Procedia PDF Downloads 275
5308 Evaluation of the MCFLIRT Correction Algorithm in Head Motion from Resting State fMRI Data
Authors: V. Sacca, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone
Abstract:
In the last few years, resting-state functional MRI (rs-fMRI) has been widely used to investigate the architecture of brain networks through the Blood Oxygenation Level Dependent response. This technique represents an interesting, robust and reliable approach for comparing pathologic and healthy subjects in order to investigate the evolution of neurodegenerative diseases. On the other hand, the elaboration of rs-fMRI data is very prone to noise due to confounding factors, especially head motion. Head motion has long been known to be a source of artefacts in task-based functional MRI studies, but it has become a particularly challenging problem in recent studies using rs-fMRI. The aim of this work was to evaluate, in MS patients, a well-known motion correction algorithm from the FMRIB Software Library - MCFLIRT - that can be applied to minimize head motion distortions, allowing rs-fMRI results to be correctly interpreted.
Keywords: head motion correction, MCFLIRT algorithm, multiple sclerosis, resting state fMRI
Procedia PDF Downloads 212
5307 Apollo Quality Program: The Essential Framework for Implementing Patient Safety
Authors: Anupam Sibal
Abstract:
The Apollo Quality Program (AQP) was launched across the Apollo Group of Hospitals to address four patient safety areas: safety during clinical handovers, medication safety, surgical safety and the six International Patient Safety Goals (IPSGs) of JCI. A measurable, online quality dashboard covering 20 process and outcome parameters was devised for monthly monitoring. The expected outcomes were also defined and categorized into green, yellow and red ranges. An audit methodology was devised to check the processes behind the measurable dashboard. Documented clinical handovers were introduced for the first time at many locations for in-house patient transfer, nursing handover, and physician handover. Prototype forms using the SBAR format were made. Patient identifiers, read-back for verbal orders, safety of high-alert medications, site marking, time-outs and falls risk assessment were introduced for all hospitals irrespective of accreditation status. Measurement of Surgical Site Infection (SSI) for 30 days postoperatively was done. All hospitals now track the time of administration of antimicrobial prophylaxis before surgery. Situations with a high risk of retention of foreign bodies were delineated and precautionary measures instituted. Audit of medications prescribed in discharge summaries was made uniform. Formularies, prescription audits and other means of reducing medication errors were implemented. There has been a marked increase in compliance with processes and in patient safety outcomes. Compliance with read-back for verbal orders rose from 86.83% in April 2011 to 96.95% in June 2015; with the policy for high-alert medications from 87.83% to 98.82%; with measures to prevent wrong-site, wrong-patient, wrong-procedure surgery from 85.75% to 97.66%; with hand-washing from 69.18% to 92.54%; and with antimicrobial prophylaxis within one hour before incision from 79.43% to 93.46%.
The percentage of patients excluded from SSI calculation due to lack of follow-up for the requisite time frame decreased from 21.25% to 10.25%. The average AQP score for all Apollo Hospitals improved from 62 in April 2011 to 87.7 in June 2015.
Keywords: clinical handovers, international patient safety goals, medication safety, surgical safety
Procedia PDF Downloads 256
5306 Fixed Point of Lipschitz Quasi Nonexpansive Mappings
Authors: Maryam Moosavi, Hadi Khatibzadeh
Abstract:
The main purpose of this paper is to study the proximal point algorithm for quasi-nonexpansive mappings in Hadamard spaces. Δ-convergence and strong convergence of cyclic resolvents for a finite family of quasi-nonexpansive mappings to a fixed point of the mappings are established.
Keywords: fixed point, Hadamard space, proximal point algorithm, quasi-nonexpansive sequence of mappings, resolvent
Procedia PDF Downloads 91
5305 Performance Analysis of Proprietary and Non-Proprietary Tools for Regression Testing Using Genetic Algorithm
Authors: K. Hema Shankari, R. Thirumalaiselvi, N. V. Balasubramanian
Abstract:
The present paper addresses research in the area of regression testing, with emphasis on automated tools as well as prioritization of test cases. The uniqueness of regression testing and its cyclic nature are pointed out. The difference in approach between industry, with the business model as its basis, and academia, with its focus on data mining, is highlighted. Test metrics are discussed as a prelude to our formula for prioritization, and a case study is discussed to illustrate this methodology. An industrial case study is also described in the paper, where the number of test cases is so large that they have to be grouped into test suites. In such situations, a genetic algorithm proposed by us can be used to reconfigure these test suites in each cycle of regression testing. A comparison is made between a proprietary tool and an open source tool using the above-mentioned metrics. Our approach is clarified through several tables.
Keywords: APFD metric, genetic algorithm, regression testing, RFT tool, test case prioritization, selenium tool
Procedia PDF Downloads 436
5304 Efficient Feature Fusion for Noise Iris in Unconstrained Environment
Authors: Yao-Hong Tsai
Abstract:
This paper presents an efficient fusion algorithm for iris images to generate stable features for recognition in unconstrained environments. Recently, iris recognition systems have focused on real scenarios in our daily life without the subject's cooperation. Under large variation in the environment, the objective of this paper is to combine information from multiple images of the same iris. The result of image fusion is a new image which is more stable for further iris recognition than each original noisy iris image. A wavelet-based approach for multi-resolution image fusion is applied in the fusion process. Detection of the iris image is based on the AdaBoost algorithm, and a local binary pattern (LBP) histogram is then applied to texture classification with a weighting scheme. Experiments showed that the features generated by the proposed fusion algorithm can improve the performance of verification systems through iris recognition.
Keywords: image fusion, iris recognition, local binary pattern, wavelet
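The LBP histogram step rests on the basic 8-neighbour LBP code, which can be sketched as below. This is the classic unweighted formulation, not the paper's weighted scheme:

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour local binary pattern: threshold each pixel's
    ring of neighbours against the centre pixel and pack the eight
    comparison bits into one byte per interior pixel."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Clockwise neighbour offsets starting at the top-left corner.
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            c = img[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offs):
                if img[i + di, j + dj] >= c:
                    code |= 1 << bit
            out[i - 1, j - 1] = code
    return out
```

A 256-bin histogram of these codes over the normalised iris image is the texture feature the classifier then compares.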
Procedia PDF Downloads 367
5303 A Parallel Implementation of k-Means in MATLAB
Authors: Dimitris Varsamis, Christos Talagkozis, Alkiviadis Tsimpiris, Paris Mastorocostas
Abstract:
The aim of this work is the parallel implementation of k-means in MATLAB, in order to reduce the execution time. Specifically, a new MATLAB function for the serial k-means algorithm is developed which meets all the requirements for conversion to a MATLAB function with parallel computations. Additionally, two different variants for the definition of initial values are presented. The parallel approach is then presented. Finally, performance tests for the computation times with respect to the numbers of features and classes are illustrated.
Keywords: k-means algorithm, clustering, parallel computations, MATLAB
Procedia PDF Downloads 385
5302 Oil Pollution Analysis of the Ecuadorian Rainforest Using Remote Sensing Methods
Authors: Juan Heredia, Naci Dilekli
Abstract:
The Ecuadorian Rainforest has been polluted for almost 60 years with little to no oversight, law, or regulation. The consequences have been vast environmental damage, such as pollution and deforestation, as well as sickness and the death of many people and animals. The aim of this paper is to quantify and localize the polluted zones, which has not been done before and is the first step toward remediation. To approach this problem, multi-spectral remote sensing imagery was utilized with a novel algorithm developed for this study, based on four normalized indices available in the literature. The algorithm classifies pixels as polluted or healthy. The results of this study include the new pixel classification algorithm and the quantification of the polluted area in the selected image. These results were finally validated using ground control points found in the literature. The main conclusion of this work is that, using hyperspectral images, it is possible to identify polluted vegetation. Future work includes environmental remediation, in-situ tests, and more extensive results to inform new policymaking.
Keywords: remote sensing, oil pollution quantification, Amazon forest, hyperspectral remote sensing
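A normalised index of the kind such an algorithm combines can be illustrated with NDVI. The abstract does not name the four indices used, so this single index and its threshold are placeholders, not the study's actual classifier:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from near-infrared and
    red band reflectances; healthy vegetation scores high."""
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids 0/0

def classify_pixels(nir, red, threshold=0.3):
    """Label pixels healthy (True) where NDVI exceeds a threshold.
    Low NDVI over areas known to be vegetated can flag stressed or
    oil-affected cover; the threshold here is illustrative."""
    return ndvi(nir, red) > threshold
```

A multi-index version would compute several such ratios per pixel and combine their votes before deciding polluted vs. healthy.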
Procedia PDF Downloads 163
5301 Verification & Validation of Map Reduce Program Model for Parallel K-Mediod Algorithm on Hadoop Cluster
Authors: Trapti Sharma, Devesh Kumar Srivastava
Abstract:
This paper is essentially an analysis of a MapReduce implementation, verifying and validating the MapReduce solution model for the parallel K-Mediod algorithm on a Hadoop cluster. MapReduce is a programming model that permits the processing of huge amounts of data in parallel on a large number of devices. It is especially well suited to constant or moderately changing sets of data, since the startup cost of each pass is usually high. MapReduce has gradually become the framework of choice for "big data": the model allows systematic and rapid processing of large-scale data on a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications, namely wordcount, grep, terasort and the parallel K-Mediod clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
Keywords: hadoop, mapreduce, k-mediod, validation, verification
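The wordcount application mentioned above is the canonical MapReduce example. A single-process sketch of its map, shuffle and reduce phases (plain Python, no Hadoop) looks like this:

```python
from collections import defaultdict

def map_phase(chunk):
    # Map: emit a (word, 1) pair for every word in this input chunk.
    return [(w, 1) for w in chunk.split()]

def shuffle(pairs):
    # Shuffle: group intermediate pairs by key, as the framework would
    # when routing them to reducers.
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {k: sum(vs) for k, vs in groups.items()}

chunks = ["to be or", "not to be"]          # stand-ins for input splits
pairs = [p for c in chunks for p in map_phase(c)]
counts = reduce_phase(shuffle(pairs))
```

On a real cluster, each chunk would be a file split processed by a separate mapper, and the shuffle would move data across nodes; the logic per phase is the same.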
Procedia PDF Downloads 369
5300 Fingerprint Image Encryption Using a 2D Chaotic Map and Elliptic Curve Cryptography
Authors: D. M. S. Bandara, Yunqi Lei, Ye Luo
Abstract:
Fingerprints are suitable as long-term markers of human identity since they provide detailed and unique individual features which are difficult to alter and durable over a lifetime. In this paper, we propose an algorithm to encrypt and decrypt fingerprint images using a specially designed Elliptic Curve Cryptography (ECC) procedure based on block ciphers. In addition, to increase the confusion effect of fingerprint encryption, we also utilize a chaotic method called the Arnold Cat Map (ACM) for a 2D scrambling of pixel locations. Experimental results are reported for various types of efficiency and security analyses. As a result, we demonstrate that the proposed fingerprint encryption/decryption algorithm is advantageous in several different aspects, including efficiency, security and flexibility. In particular, using this algorithm, we achieve a margin of about 0.1% in the Number of Pixel Changing Rate (NPCR) test compared to state-of-the-art performance.
Keywords: Arnold cat map, biometric encryption, block cipher, elliptic curve cryptography, fingerprint encryption, Koblitz's encoding
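The ACM scrambling stage can be sketched as follows. This uses the classical parameter-free form of the cat map, which may differ from the paper's keyed variant; the map is a bijection on an N x N grid, and iterating it eventually restores the original image (the period depends on N, e.g. 3 for N = 4):

```python
import numpy as np

def arnold_cat(img, iterations=1):
    """Scramble pixel positions of a square image with the classical
    Arnold cat map: (x, y) -> ((x + y) mod N, (x + 2y) mod N).
    The transform matrix [[1, 1], [1, 2]] has determinant 1, so the
    map is a permutation of the pixel grid."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out
```

In an encryption pipeline the iteration count (kept below the period) can serve as part of the key, with the ECC block cipher then encrypting the scrambled pixel values.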
Procedia PDF Downloads 204
5299 HR MRI CS Based Image Reconstruction
Authors: Krzysztof Malczewski
Abstract:
A Magnetic Resonance Imaging (MRI) reconstruction algorithm using compressed sensing is presented in this paper. It is shown that the offered approach improves the spatial resolution of MR images when highly undersampled k-space trajectories are applied. Compressed Sensing (CS) aims at reconstructing signals and images from significantly fewer measurements than were conventionally assumed necessary. MRI is a fundamental medical imaging method that struggles with an inherently slow data acquisition process. Applying CS to MRI has the potential for significant scan-time reductions, with visible benefits for patients and health-care economics. In this study, the objective is to combine a super-resolution image enhancement algorithm with the benefits of the CS framework to achieve a high-resolution MR output image. Both methods emphasize maximizing image sparsity in a known sparse transform domain while minimizing the data-fidelity error. The presented algorithm also accounts for cardiac and respiratory movements. Keywords: super-resolution, MRI, compressed sensing, sparse-sense, image enhancement
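The sparsity-versus-fidelity trade-off at the heart of CS can be demonstrated with a small sketch, assuming a generic iterative soft-thresholding (ISTA) solver and a random Gaussian sensing matrix standing in for the paper's actual k-space sampling and sparse transform.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=1000):
    """Iterative Soft-Thresholding: min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - A.T @ (A @ x - y) / L              # gradient step: data-fidelity term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage: sparsity term
    return x

rng = np.random.default_rng(0)
n, m, k = 128, 64, 5                               # signal length, measurements, nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k) + 2.0
A = rng.standard_normal((m, n)) / np.sqrt(m)       # random Gaussian "sensing" matrix
y = A @ x_true                                     # undersampled measurements (m < n)
x_hat = ista(A, y)                                 # sparse reconstruction
```

With half as many measurements as unknowns, the sparse signal is still recovered to small error, which is the effect the abstract exploits to shorten MR acquisition.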
Procedia PDF Downloads 430
5298 Triangulations via Iterated Largest Angle Bisection
Authors: Yeonjune Kang
Abstract:
A triangulation of a planar region is a partition of that region into triangles. In the finite element method, triangulations are often used as the grid underlying a computation. In order to be suitable as a finite element mesh, a triangulation must have well-shaped triangles, according to criteria that depend on the details of the particular problem. For instance, most methods require that all triangles be small and as close to the equilateral shape as possible. Stated differently, one wants to avoid having either thin or flat triangles in the triangulation. There are many triangulation procedures, a particular one being the longest edge bisection algorithm described below. Starting with a given triangle, locate the midpoint of the longest edge and join it to the opposite vertex of the triangle. Two smaller triangles are formed; apply the same bisection procedure to each of them. Continuing in this manner, after n steps one obtains a triangulation of the initial triangle into 2^n smaller triangles. The longest edge algorithm was first considered in the late 1970s. It was shown by various authors that this triangulation has the desirable properties for the finite element method: independently of the number of iterations, the angles of these triangles cannot get too small; moreover, the size of the triangles decays exponentially. In the present paper we consider a related triangulation algorithm that we refer to as the largest angle bisection procedure. As the name suggests, rather than bisecting the longest edge, at each step we bisect the largest angle. We study the properties of the resulting triangulation and prove that, while the general behavior resembles that of the longest edge bisection algorithm, there are several notable differences as well. Keywords: angle bisectors, geometry, triangulation, applied mathematics
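The longest edge bisection procedure described above can be sketched directly. This minimal Python version (with illustrative names) confirms that n refinement sweeps split one triangle into 2^n pieces.

```python
import math

def bisect_longest_edge(tri):
    """Split a triangle (three (x, y) vertices) across the midpoint of its longest edge."""
    a, b, c = tri
    # pair each edge with its opposite vertex, then pick the longest edge
    edges = [((a, b), c), ((b, c), a), ((c, a), b)]
    (p, q), r = max(edges, key=lambda e: math.dist(*e[0]))
    m = ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)  # midpoint of the longest edge
    return [(p, m, r), (m, q, r)]                   # join midpoint to the opposite vertex

def refine(triangles, steps):
    """Apply one bisection sweep to every triangle, `steps` times."""
    for _ in range(steps):
        triangles = [t for tri in triangles for t in bisect_longest_edge(tri)]
    return triangles
```

The largest angle bisection variant studied in the paper would differ only in the selection rule: choose the vertex with the largest angle and bisect that angle instead of the longest edge.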
Procedia PDF Downloads 401
5297 Traditional Drawing, BIM and Erudite Design Process
Authors: Maryam Kalkatechi
Abstract:
Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. The proposal of an erudite design process that combines digital and practical aspects within a strong methodological frame resulted from the dissertation research. The digital aspects are the progressive advancements in algorithmic design and simulation software. These aspects have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to embed construction and architecture knowledge within the algorithm to produce successful design processes. The erudite design process also involves ongoing improvements in applying the new method of 3D printing to construction. This is achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; it accommodates the architect's decisions about the algorithm. This paper introduces the erudite design process and its components, and summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes. Keywords: erudite, data-sketch, algorithm design in architecture, design process
Procedia PDF Downloads 275
5296 Facial Biometric Privacy Using Visual Cryptography: A Fundamental Approach to Enhance the Security of Facial Biometric Data
Authors: Devika Tanna
Abstract:
'Biometrics' means 'life measurement', but the term is usually associated with the use of unique physiological characteristics to identify an individual. It is important to secure the privacy of a digital face image stored in a central database. To impart privacy to such biometric face images, the digital face image is first split into two host face images such that each of them gives no idea of the existence of the original face image; then each cover image is stored in one of two databases that are geographically apart. Only when both cover images are simultaneously available can the original image be accessed. This can be achieved by using the XM2VTS and IMM face databases together with an adaptive algorithm for spatial greyscale. The algorithm helps to select the appropriate host images, which are most likely to be compatible with the secret image stored in the central database, based on its geometry and appearance. The encryption is done using GEVCS, which results in a reconstructed image identical to the original private image. Keywords: adaptive algorithm, database, host images, privacy, visual cryptography
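The two-share idea can be illustrated with a much simpler XOR-based secret-sharing sketch. This is not the GEVCS scheme used in the paper, only a minimal stand-in showing why a single share stored in one database reveals nothing about the face image.

```python
import numpy as np

def make_shares(secret, rng=None):
    """Split a greyscale image into two shares; each share alone is pure noise."""
    if rng is None:
        rng = np.random.default_rng()
    # share 1 is uniformly random, independent of the secret
    share1 = rng.integers(0, 256, size=secret.shape, dtype=np.uint8)
    # share 2 = secret XOR share 1, also indistinguishable from noise on its own
    share2 = np.bitwise_xor(secret, share1)
    return share1, share2

def reconstruct(share1, share2):
    """The secret is recoverable only when both shares are available."""
    return np.bitwise_xor(share1, share2)
```

Storing the two shares in geographically separate databases, as the abstract proposes, means an attacker who compromises either database alone learns nothing about the enrolled face.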
Procedia PDF Downloads 130
5295 Psychometrics of the Farsi Version of the Newcastle Nursing Care Satisfaction Scale in Patients Admitted to the Internal and General Surgery Departments of Hospitals Affiliated with Ardabil University of Medical Sciences in 2017
Authors: Mansoureh Karimollahi, Mehriar Adrmohammadi, Mohsen Mohammadi
Abstract:
Introduction: Patient satisfaction with nursing care is considered an important indicator of the quality and effectiveness of the health care system, and improving the quality of care is not possible without paying attention to patients' opinions and expectations. Considering that the scales for assessing satisfaction with nursing care in our country are not comprehensive and measure very few areas, this study examined the psychometric properties of the Persian version of the Newcastle Nursing Care Satisfaction Scale in patients hospitalized in internal medicine and general surgery wards. Methods: This cross-sectional study was conducted on 200 patients admitted to the surgery and internal medicine departments of hospitals affiliated with Ardabil University of Medical Sciences. To evaluate criterion validity, the Newcastle Nursing Care Satisfaction Scale was used for the first time in Iran in comparison with the good nursing care scale from the patients' point of view. The Newcastle scale was administered after translation and assessment of its validity and reliability. Results: Patients' satisfaction with and experience of nursing care were at a favorable level, with means of 111.8 ± 14.2 and 69.07 ± 14.8, respectively. The total CVI was estimated at 0.96 for the experience section, 0.95 for the satisfaction section, and 0.96 for the whole scale. The CVR was 0.95 for the experience section, 0.95 for the satisfaction section, and 0.95 for the whole scale. Criterion validity was estimated with a correlation of 0.725. Construct validity was also confirmed using goodness-of-fit indices (X² = 1932.05, p = 0.013, KMO = 0.913). Convergent validity was estimated at 0.99 for the experience subscale and 0.98 for the satisfaction subscale.
Reliability was 94% overall, 92% for the experience subscale, and 98% for the satisfaction subscale, indicating acceptable reliability of the questionnaire. Conclusion: The Persian version of the Newcastle Nursing Care Satisfaction Scale, as a comprehensive tool that can be easily completed by patients and is easy to interpret, has good validity and reliability, and its use in patient care centers, in surgery and internal medicine departments, is recommended. Keywords: psychometrics, Newcastle nursing care satisfaction scale, nursing care satisfaction, general surgery department
Procedia PDF Downloads 98
5294 A Non-Parametric Based Mapping Algorithm for Use in Audio Fingerprinting
Authors: Analise Borg, Paul Micallef
Abstract:
Over the past few years, online multimedia collections have grown at a fast pace. Several companies have shown interest in studying ways to organize this amount of audio information without the need for human intervention to generate metadata. In the past few years, many applications capable of identifying a piece of music in a short time have emerged on the market. Different audio effects and degradations make it much harder to identify the unknown piece. In this paper, an audio fingerprinting system that makes use of a non-parametric based algorithm is presented. Parametric analysis is also performed using Gaussian Mixture Models (GMMs). The feature extraction methods employed are the Mel Spectrum Coefficients and the MPEG-7 basic descriptors. Bin numbers replaced the extracted feature coefficients during the non-parametric modelling. The results show that the non-parametric analysis offers results comparable to those reported in the literature. Keywords: audio fingerprinting, mapping algorithm, Gaussian Mixture Models, MFCC, MPEG-7
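The replacement of feature coefficients with bin numbers can be sketched as a per-dimension quantization step. The abstract does not specify the binning rule, so uniform histogram bins are assumed here, and all names are illustrative.

```python
import numpy as np

def to_bin_numbers(features, n_bins=16):
    """Replace each feature coefficient (rows = frames, columns = coefficients)
    with the index of the uniform histogram bin it falls into, per dimension."""
    features = np.asarray(features, dtype=float)
    binned = np.empty(features.shape, dtype=int)
    for d in range(features.shape[1]):
        col = features[:, d]
        # n_bins equal-width bins spanning this coefficient's observed range
        edges = np.linspace(col.min(), col.max(), n_bins + 1)
        # digitize against interior edges yields indices 0..n_bins-1
        binned[:, d] = np.clip(np.digitize(col, edges[1:-1]), 0, n_bins - 1)
    return binned
```

Discretizing the coefficients this way makes fingerprints compact and comparable by simple symbol matching, with no distributional model of the features, which is what "non-parametric" refers to in contrast with the GMM baseline.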
Procedia PDF Downloads 421