Search results for: information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5171

4151 2.5D Face Recognition Using Gabor Discrete Cosine Transform

Authors: Ali Cheraghian, Farshid Hajati, Soheila Gheisari, Yongsheng Gao

Abstract:

In this paper, we present a novel 2.5D face recognition method based on the Gabor Discrete Cosine Transform (GDCT). In the proposed method, the Gabor filter is applied to extract feature vectors from the texture and the depth information. Then, the Discrete Cosine Transform (DCT) is used for dimensionality and redundancy reduction to improve computational efficiency. The system combines texture and depth information at the decision level, which yields higher performance than methods that use texture and depth information separately. The proposed algorithm is evaluated on the publicly available Bosphorus database, which includes models with pose variation. The experimental results show that the proposed method outperforms the benchmark.
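
As a rough illustration of the GDCT pipeline described above, the sketch below applies a small Gabor filter bank to one channel (texture or depth) and compresses each response with a 2D DCT, keeping only a low-frequency block of coefficients. This is not the authors' implementation; the kernel parameters, number of orientations, retained coefficient block and the score-level fusion comment are illustrative assumptions.

import numpy as np
from scipy.signal import fftconvolve
from scipy.fft import dctn

def gabor_kernel(ksize=31, sigma=4.0, theta=0.0, lam=10.0, gamma=0.5):
    # Real-valued Gabor kernel; parameter values are illustrative only.
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gdct_features(channel, n_orient=4, keep=8):
    # Gabor responses -> 2D DCT -> keep a small low-frequency block as a compact descriptor.
    feats = []
    for k in range(n_orient):
        response = fftconvolve(channel, gabor_kernel(theta=k * np.pi / n_orient), mode="same")
        coeffs = dctn(response, norm="ortho")
        feats.append(coeffs[:keep, :keep].ravel())
    return np.concatenate(feats)

# Texture and depth channels are processed separately; their matching scores would then be
# fused at the decision level (e.g. a weighted sum of per-channel similarities).
texture, depth = np.random.rand(128, 128), np.random.rand(128, 128)
f_texture, f_depth = gdct_features(texture), gdct_features(depth)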

Keywords: Gabor filter, discrete cosine transform, 2.5D face recognition, pose.

4150 Automatic Classification of Lung Diseases from CT Images

Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari

Abstract:

Pneumonia is a lung disease that creates congestion in the chest, and severe congestion can lead to loss of life. Pneumonic lung disease is caused by viral pneumonia, bacterial pneumonia, or COVID-19 induced pneumonia. Early prediction and classification of such lung diseases help reduce the mortality rate. In this paper, we propose an automatic Computer-Aided Diagnosis (CAD) system based on a deep learning approach. The proposed CAD system takes raw computed tomography (CT) scans of the patient's chest as input and automatically predicts the disease class. We designed a Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are first pre-processed to enhance their quality for further analysis. We then apply a hybrid model that consists of automatic feature extraction and classification. We propose a robust 2D Convolutional Neural Network (CNN) model to extract features automatically from each pre-processed CT image; this CNN model performs feature learning and produces an effective 1D feature vector per image. The output of the 2D CNN model is then normalized using the Min-Max technique. The second step of the proposed hybrid model concerns training and classification using different classifiers. Simulation results on a publicly available dataset demonstrate the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
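
As a minimal sketch of the second stage of the hybrid model, the 1D feature vectors produced by the CNN can be Min-Max normalized and passed to a conventional classifier. The CNN itself is omitted here, and the feature dimensionality and the choice of an SVM classifier are illustrative assumptions rather than the paper's exact configuration.

import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Stand-in for CNN-extracted 1D feature vectors (one row per CT scan) and disease labels.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 256))
labels = rng.integers(0, 3, size=200)   # e.g. viral / bacterial / COVID-19 pneumonia

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.2, random_state=0)

# Min-Max normalization: x' = (x - min) / (max - min), fitted on the training data only.
scaler = MinMaxScaler().fit(X_tr)
clf = SVC(kernel="rbf").fit(scaler.transform(X_tr), y_tr)
print("accuracy:", clf.score(scaler.transform(X_te), y_te))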

Keywords: CT scans, COVID-19, deep learning, image processing, pneumonia, lung disease.

4149 An Approach for Integration of Industrial Robot with Vision System and Simulation Software

Authors: Ahmed Sh. Khusheef, Ganesh Kothapalli, Majid Tolouei-Rad

Abstract:

Utilization of various sensors has made it possible to extend the capabilities of industrial robots. Among these are vision sensors that are used to provide visual information to assist robot controllers. This paper presents a method of integrating a vision system and a simulation program with an industrial robot. The vision system is employed to detect a target object and compute its location in the robot environment. Then, the target object's information is sent to the robot controller via a parallel communication port. The robot controller uses the extracted object information and the simulation program to control the robot arm for approaching, grasping and relocating the object. This paper presents technical details of the system components and describes the methodology used for this integration. It also provides a case study to prove the validity of the methodology developed.

Keywords: industrial robot, integration, simulation, vision system

4148 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising

Authors: Jianwei Ma, Diriba Gemechu

Abstract:

In seismic data processing, attenuation of random noise is a basic step in improving data quality for further use in exploration and development in the oil and gas industry. The signal-to-noise ratio also largely determines the quality of seismic data and affects the reliability and accuracy of the seismic signal during interpretation. To use seismic data for further application and interpretation, the signal-to-noise ratio must be improved while random noise is attenuated effectively and the important features of the seismic signal are preserved. To this end, we introduce an anisotropic total fractional order denoising algorithm. The anisotropic total fractional order variation model, defined in the space of fractional order bounded variation, is proposed as a regularization term in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised results are compared with F-X deconvolution and the non-local means denoising algorithm.
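
For orientation, a plausible form of the regularized denoising problem and its split Bregman treatment is sketched below; the exact discretization of the fractional-order operators and the parameter choices in the paper may differ. Here $f$ denotes the noisy data, $\nabla_x^{\alpha}$ and $\nabla_y^{\alpha}$ are fractional-order difference operators of order $\alpha$ along the two directions, and $\lambda$, $\mu$ are positive parameters:

\[
\min_{u}\; \|\nabla_x^{\alpha} u\|_1 + \|\nabla_y^{\alpha} u\|_1 + \frac{\lambda}{2}\,\|u - f\|_2^2 .
\]

Split Bregman introduces auxiliary variables $d_x \approx \nabla_x^{\alpha} u$, $d_y \approx \nabla_y^{\alpha} u$ and Bregman variables $b_x$, $b_y$, and alternates

\[
(u^{k+1}, d^{k+1}) = \arg\min_{u,\,d}\; \|d_x\|_1 + \|d_y\|_1 + \frac{\lambda}{2}\|u - f\|_2^2
+ \frac{\mu}{2}\|d_x - \nabla_x^{\alpha}u - b_x^{k}\|_2^2
+ \frac{\mu}{2}\|d_y - \nabla_y^{\alpha}u - b_y^{k}\|_2^2,
\qquad
b^{k+1} = b^{k} + \nabla^{\alpha}u^{k+1} - d^{k+1},
\]

where the $d$-subproblem is solved by soft shrinkage and the $u$-subproblem by a linear solve.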

Keywords: Anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, Split Bregman Algorithm.

4147 Tag Broker Model for Protecting Privacy in RFID Environment

Authors: Sokjoon Lee, Howon Kim, Kyoil Chung

Abstract:

An RFID system, in which each item is given an identification number and detected by radio frequency, supports more varied services than a barcode system can. For example, a refrigerator with an RFID reader and an internet connection can automatically notify us when food passes its expiration date. However, in spite of its convenience, an RFID system has some security threats, because anybody can easily read the ID information of an item. One of the most critical threats is privacy invasion. Existing privacy protection schemes and systems defend normal users against attackers who try to obtain information from RFID tag values, but these systems still have a weakness: an attacker can obtain information by using an analogous value instead of the original tag value. In this paper, we describe this type of attack more precisely and propose the 'Tag Broker Model', which can defend against it. The tag broker in this model translates the original tag value into a random value, and the user can only obtain the random value. An attacker cannot exploit an analogous tag value, because the original value cannot be inferred from it.
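
A minimal sketch of the broker idea, assuming the broker keeps a private mapping between randomly generated pseudonyms and real EPC tag values and only reveals the original value to authorized parties; the class and method names below are hypothetical, not part of any RFID standard.

import secrets
from typing import Optional

class TagBroker:
    # Hypothetical broker that hides real EPC values behind random pseudonyms.
    def __init__(self):
        self._pseudonym_to_epc = {}           # private mapping, never exposed to readers

    def register(self, epc: str) -> str:
        # Issue a fresh random pseudonym for a real tag value.
        pseudonym = secrets.token_hex(12)
        self._pseudonym_to_epc[pseudonym] = epc
        return pseudonym

    def resolve(self, pseudonym: str, authorized: bool) -> Optional[str]:
        # Only authorized parties can recover the original EPC from a pseudonym.
        return self._pseudonym_to_epc.get(pseudonym) if authorized else None

broker = TagBroker()
p = broker.register("urn:epc:id:sgtin:0614141.107346.2017")   # example EPC value
print(broker.resolve(p, authorized=False))    # attacker learns nothing useful
print(broker.resolve(p, authorized=True))     # legitimate service recovers the EPC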

Keywords: Broker, EPC, Privacy, RFID.

4146 Collaborative Planning and Forecasting

Authors: Neha Asthana, Vishal Krishna Prasad

Abstract:

Collaborative Planning and Forecasting is a systematic approach to integrating and assimilating data into usable information. Changing and variable market dynamics have persuaded global business chains to adopt Collaborative Planning and Forecasting as an essential tool. Supply chains must therefore constantly improve, update their practices, and adapt to the changing global environment.

Keywords: Information transfer, Forecasting, Optimization.

4145 Nanotechnology in Military Development

Authors: Andrus Pedai, Igor Astrov

Abstract:

Nanotechnology is the new cyber, according to several major leaders in this field. Just as cyber is now entrenched across global society, nano is poised to be a major capabilities enabler of the coming decades. Expert members of the National Nanotechnology Initiative (in the U.S.), representing government and science disciplines, say nano has great significance for the military and the general public. It is predicted that within the next 15 years nanotechnology will replace information technology as the most significant economic technology platform. Nanotechnology also has even wider applications than information technology.

Keywords: Nanomaterials, nanowires, nanotechnology, sensors.

4144 Information Filtering using Index Word Selection based on the Topics

Authors: Takeru YOKOI, Hidekazu YANAGIMOTO, Sigeru OMATU

Abstract:

We propose an information filtering system that selects index words from a document set based on the topics included in that set. The method narrows the vocabulary down to the particularly characteristic words of the document set, where the topics are obtained by Sparse Non-negative Matrix Factorization (NMF). In information filtering, a document is often represented by a vector whose elements correspond to the weights of the index words, and the dimension of this vector grows as the number of documents increases. It is therefore possible that words which are useless as index words for information filtering are included, so the dimension needs to be reduced. Our proposal reduces the dimension by selecting index words based on the topics included in the document set, obtained by applying Sparse NMF to the document set. The filtering is carried out based on the centroid of the learning document set, which is regarded as the user's interest. The centroid is represented as a document vector whose elements consist of the weights of the selected index words. Using the English test collection MEDLINE, we confirm the effectiveness of our proposal: it improves recommendation accuracy over previous methods when an appropriate number of index words is selected. In addition, we discuss the index words selected by our proposal and find that it is able to select index words covering some minor topics included in the document set.
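
A rough sketch of the pipeline described above: factorize the document-term matrix into topics, keep the most characteristic index words of each topic, and rank documents against the centroid of the reduced learning set. Plain NMF from scikit-learn stands in for Sparse NMF here, and the toy corpus, topic count and number of words kept per topic are illustrative assumptions.

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF
from sklearn.metrics.pairwise import cosine_similarity

docs = ["heart disease treatment study", "gene expression analysis of tumors",
        "clinical trial of heart drugs", "protein structure prediction",
        "drug response in clinical patients"]

vec = TfidfVectorizer()
X = vec.fit_transform(docs)                      # document-term matrix of index-word weights
terms = np.array(vec.get_feature_names_out())

model = NMF(n_components=2, init="nndsvd", random_state=0)   # Sparse NMF would add a sparsity penalty
W = model.fit_transform(X)                       # document-topic weights
H = model.components_                            # topic-term weights

top = np.unique(np.argsort(H, axis=1)[:, -3:].ravel())       # characteristic index words per topic
print("selected index words:", terms[top])

profile = np.asarray(X[:, top].mean(axis=0))     # centroid of the learning set = user's interest
scores = cosine_similarity(X[:, top], profile)   # rank documents against the user profile
print("relevance scores:", scores.ravel())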

Keywords: Information Filtering, Sparse NMF, Index Word Selection, User Profile, Chi-squared Measure

4143 The Mutated Distance between Two Mixture Trees

Authors: Wan Chian Li, Justie Su-Tzu Juan, Yi-Chun Wang, Shu-Chuan Chen

Abstract:

The evolutionary tree is an important topic in bioinformatics. In 2006, Chen and Lindsay proposed a new method to build a mixture tree from DNA sequences. The mixture tree is a new type of evolutionary tree that carries two pieces of information in addition to those of an ordinary evolutionary tree: a time parameter and the set of mutated sites. In 2008, Lin and Juan proposed an algorithm to compute the distance between two mixture trees; their algorithm considers only the time parameter. In this paper, we propose a method to measure the similarity of two mixture trees that takes the set of mutated sites into account, and we develop two algorithms to compute the distance between two mixture trees. The time complexities of the two proposed algorithms are O(n² · max{h(T₁), h(T₂)}) and O(n²), respectively.

Keywords: evolutionary tree, mixture tree, mutated site, distance.

4142 Multimedia E-Books for Digital Mechanism and Gear Library

Authors: Rike Brecht, Heidi Krömker, Adrian Kühlewind

Abstract:

This paper presents a digital engineering library, the Digital Mechanism and Gear Library (DMG-Lib), providing a multimedia collection of e-books, pictures, videos and animations in the domain of mechanisms and machines. A specific characteristic of DMG-Lib is the enrichment and cross-linking of the different sources. DMG-Lib e-books not only present pages as pixel images but also augment selected figures with interactive animations. The presentation of animations in e-books increases the clarity of the information. To present the multimedia e-books and make them available in the DMG-Lib internet portal, a special e-book reader called StreamBook was developed; it provides an optimal presentation of digitized books and enables users to read the e-books as well as to work efficiently and individually with the enriched information. The objective is to support different user tasks ranging from information retrieval to the development and design of mechanisms.

Keywords: E-books, digital library, multimedia, enrichment and cross-linking

4141 A Decision Support System Based on Leprosy Scales

Authors: Dennys Robson Girardi, Hugo Bulegon, Claudia Maria Moro Barra

Abstract:

Leprosy is an infectious disease caused by Mycobacterium leprae. It generally compromises nerve fibers, leading to the development of disability. Disabilities are changes that limit the daily activities or social life of an individual. In leprosy, the study of disability considers functional limitation (physical disability), limitation of activity, and social participation, which are measured respectively by the EHF, SALSA and Participation scales. The objective of this work is to propose online monitoring of leprosy patients based on the information from the EHF, SALSA and Participation scales. The proposed system is expected to be applied in monitoring the patient during treatment and after the healing therapy of the disease. The correlations that the system establishes between the scales produce a variety of information describing the state of the patient and any progression or reduction in disability. The system provides reports with information from each of the scales and the relationships that exist between them. In this way, health professionals with access to the patient information can intervene with techniques for the prevention of disability. Through the automated scales, the system shows the patient's level and allows the patient, or the person responsible, to take preventive measures. With an online system, it is possible to take the assessments and monitor patients from anywhere.

Keywords: Leprosy, Medical Informatics, Decision Support System, Disability.

4140 The Service Failure and Recovery in the Information Technology Services

Authors: Jun Luo, Weiguo Zhang, Dabin Qin

Abstract:

It is important to retain customer satisfaction in information technology services. When a service failure occurs, companies need to take service recovery action to restore customer satisfaction. Although companies cannot avoid all problems and complaints, they should try to make up for them. Service failure and service recovery have therefore become an important and challenging issue for companies. In this paper, the literature and the problems in information technology services are reviewed. An integrated, profit-driven model of service failure and service recovery is established that considers the benefits to both the customer and the enterprise. Moreover, the interaction between service failure and the service recovery strategy is studied, and the results verify the principles for matching the service recovery strategy to the type of service failure. In addition, the relationship between the cost of service recovery and the customer's cumulative value of service after recovery is analyzed with the model. The results help managers decide on appropriate resource allocations for recovery strategies.

Keywords: service failure, service recovery, information technology services

4139 Development of Position Changing System for Obstructive Sleep Apnea Patient using HRV

Authors: Soo-Young Ye, Dong-Hyun Kim

Abstract:

Obstructive sleep apnea (OSA) can be relieved in 70 to 80 percent of patients simply by correcting the sleeping posture. The most important requirement for doing this is the detection of obstructive sleep apnea, which can be performed through heart rate variability (HRV) analysis using power spectral density (PSD) analysis. After the HRV analysis, the current position information is needed in order to correct the posture. Pressure sensors of the array type were used to obtain the position information of the subject. In addition, an air cylinder corrected the subject's position by lifting the bed, so the system can change the subject's position without waking them during sleep. Polysomnographic recordings were obtained from 10 patients. The HRV analysis showed that NLF and the LF/HF ratio increased, while NHF decreased, during OSA, and position changes were performed during these periods.
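
A minimal sketch of the HRV step described above: resample the RR-interval series onto a uniform time grid, estimate the power spectral density with Welch's method, and integrate the conventional LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands to obtain the LF/HF ratio. The resampling rate, band limits and synthetic data are standard or illustrative choices, not the paper's exact settings.

import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def lf_hf_ratio(rr_ms, fs=4.0):
    # RR intervals (ms) -> evenly sampled tachogram -> Welch PSD -> LF and HF band powers.
    t = np.cumsum(rr_ms) / 1000.0                         # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr_ms, kind="cubic")(grid)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
    return lf / hf

# Synthetic RR series for illustration; an increased LF/HF ratio would flag a likely OSA episode
# and trigger the position-changing actuator.
rr = 800 + 40 * np.sin(2 * np.pi * 0.01 * np.arange(600)) + 15 * np.random.randn(600)
print("LF/HF:", lf_hf_ratio(rr))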

Keywords: Obstructive sleep apnea, Heart rate variability, Air cylinder, PSD, RR interval, ANS

4138 Conceptual Multidimensional Model

Authors: Manpreet Singh, Parvinder Singh, Suman

Abstract:

Data are available in abundance in any business organization, including records for finance, maintenance, inventory, progress reports, etc. As time progresses, the data keep accumulating, and the challenge is to extract information from this data bank. Knowledge discovery from these large and complex databases is the key problem of this era. Data mining and machine learning techniques are needed which can scale to the size of the problems and can be customized to the business application. To derive accurate and relevant information for a particular problem, business analysts need to develop multidimensional models that provide reliable information on which the right decisions can be based. If the multidimensional model does not possess advanced features, accuracy cannot be expected. The present work involves the development of a multidimensional data model incorporating advanced features. The computation is based on data precision and includes a slowly changing time dimension. The final results are displayed in graphical form.

Keywords: Multidimensional, data precision.

4137 A Self Supervised Bi-directional Neural Network (BDSONN) Architecture for Object Extraction Guided by Beta Activation Function and Adaptive Fuzzy Context Sensitive Thresholding

Authors: Siddhartha Bhattacharyya, Paramartha Dutta, Ujjwal Maulik, Prashanta Kumar Nandi

Abstract:

A multilayer self organizing neural network (MLSONN) architecture for binary object extraction, guided by a beta activation function and characterized by backpropagation of errors estimated from the linear indices of fuzziness of the network output states, is discussed. Since the MLSONN architecture is designed to operate in a single point fixed/uniform thresholding scenario, it does not take into cognizance the heterogeneity of image information in the extraction process. The performance of the MLSONN architecture with representative values of the threshold parameters of the beta activation function employed is also studied. A three layer bidirectional self organizing neural network (BDSONN) architecture comprising fully connected neurons, for the extraction of objects from a noisy background and capable of incorporating the underlying image context heterogeneity through variable and adaptive thresholding, is proposed in this article. The input layer of the network architecture represents the fuzzy membership information of the image scene to be extracted. The second layer (the intermediate layer) and the final layer (the output layer) of the network architecture deal with the self supervised object extraction task by bi-directional propagation of the network states. Each layer except the output layer is connected to the next layer following a neighborhood based topology. The output layer neurons are in turn connected to the intermediate layer following a similar topology, thus forming a counter-propagating architecture with the intermediate layer. The novelty of the proposed architecture is that the assignment/updating of the inter-layer connection weights is done using the relative fuzzy membership values at the constituent neurons in the different network layers. Another interesting feature of the network lies in the fact that the processing capabilities of the intermediate and the output layer neurons are guided by a beta activation function, which uses image context sensitive adaptive thresholding arising out of the fuzzy cardinality estimates of the different network neighborhood fuzzy subsets, rather than resorting to fixed and single point thresholding. An application of the proposed architecture for object extraction is demonstrated using a synthetic and a real-life image. The extraction efficiency of the proposed network architecture is evaluated by a proposed system transfer index characteristic of the network.
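
The error measure mentioned above, the linear index of fuzziness of the network output states, has the standard closed form (2/n) Σ min(μᵢ, 1 − μᵢ). A small sketch with illustrative membership values:

import numpy as np

def linear_index_of_fuzziness(mu):
    # Linear index of fuzziness of membership values mu in [0, 1]:
    # zero for a crisp set, maximal when all memberships equal 0.5.
    mu = np.asarray(mu, dtype=float).ravel()
    return 2.0 * np.minimum(mu, 1.0 - mu).sum() / mu.size

print(linear_index_of_fuzziness([0.02, 0.97, 0.10, 0.95]))   # nearly crisp output states
print(linear_index_of_fuzziness([0.45, 0.55, 0.50, 0.48]))   # ambiguous states -> larger error signal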

Keywords: Beta activation function, fuzzy cardinality, multilayer self organizing neural network, object extraction,

4136 Providing On-Demand Path and Arrival Time Information Considering Realtime Delays of Buses

Authors: Yoshifumi Ishizaki, Naoki Kanatani, Masaki Ito, Toshihiko Sasama, Takao Kawamura, Kazunori Sugahara

Abstract:

This paper demonstrates a bus location system for route buses through an experiment in a real environment. A bus location system provides information such as bus delays and positions. The system uses the actual service and position data of buses, and this information must match the data in the database. The system has two potential problems. First, preparing devices to obtain bus positions can be costly. Second, it can be difficult to match the service data of buses. To avoid these problems, we developed the system at low cost and in a short time by using smartphones with GPS together with the bus route system. The system performs path planning that takes bus delays into account and displays the positions of buses on a map. The bus location system was demonstrated on route buses with smartphones for two months.

Keywords: Route Bus, Path Planning System, GPS, Smart Phone.

4135 Distributed Cost-Based Scheduling in Cloud Computing Environment

Authors: Rupali, Anil Kumar Jaiswal

Abstract:

Cloud computing can be defined as one of the prominent technologies that lets a user change, configure and access services online. It can be described as a computing paradigm that saves users cost and time; in practice, cloud computing is used in various fields such as education, health and banking. Cloud computing is an internet-dependent technology, so it is the major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in a cloud computing environment plays a vital role, as cloud providers need to schedule resources effectively to achieve maximum utilization and user satisfaction. Job scheduling for cloud computing is analyzed in this work, and CloudSim 3.0.3 is utilized to simulate the task computations and the distributed scheduling methods. This research discusses job scheduling for a distributed processing environment and, by exploring this issue, finds that it works with minimum time and lower cost. In this work two load balancing techniques have been employed, 'Throttled stack adjustment policy' and 'Active VM load balancing policy', with two brokerage services, 'Advanced Response Time' and 'Reconfigure Dynamically', to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
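
For intuition, the sketch below contrasts a round-robin assignment with a throttled policy that caps the number of concurrent jobs per virtual machine, written in plain Python rather than the CloudSim API; the VM count, concurrency limit and job stream are illustrative assumptions. A real evaluation would additionally record per-assignment cost and response time, which is what the VM_Cost, DC_Cost and Response Time metrics above measure.

from collections import deque
from itertools import cycle

class VM:
    def __init__(self, vm_id, limit=2):
        self.vm_id, self.limit, self.active = vm_id, limit, 0

def round_robin(jobs, vms):
    # Assign jobs in a fixed cyclic order, ignoring the current load of each VM.
    order = cycle(vms)
    return [(job, next(order).vm_id) for job in jobs]

def throttled(jobs, vms):
    # Assign each job to the first VM below its concurrency limit; otherwise queue it.
    assignments, waiting = [], deque()
    for job in jobs:
        target = next((vm for vm in vms if vm.active < vm.limit), None)
        if target is None:
            waiting.append(job)          # held until a VM frees capacity
        else:
            target.active += 1
            assignments.append((job, target.vm_id))
    return assignments, list(waiting)

vms = [VM(0), VM(1)]
print(round_robin(range(6), vms))
print(throttled(range(6), vms))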

Keywords: Physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model.

4134 Artifacts in Spiral X-ray CT Scanners: Problems and Solutions

Authors: Mehran Yazdi, Luc Beaulieu

Abstract:

Artifacts are among the most important factors degrading CT image quality and play an important role in diagnostic accuracy. In this paper, artifacts that typically appear in spiral CT are introduced. The different factors that cause artifacts, such as the patient, the equipment and the interpolation algorithm, are discussed, and new developments and image processing algorithms to prevent or reduce them are presented.

Keywords: CT artifacts, Spiral CT, Artifact removal.

4133 Modeling Peer-to-Peer Networks with Interest-Based Clusters

Authors: Bertalan Forstner, Dr. Hassan Charaf

Abstract:

In the world of Peer-to-Peer (P2P) networking, different protocols have been developed to make resource sharing and information retrieval more efficient. The SemPeer protocol is a new layer on Gnutella that transforms the connections of the nodes based on semantic information to make information retrieval more efficient. However, this transformation causes high clustering in the network, which decreases the number of nodes reached and therefore also the probability of finding a document. In this paper we describe a mathematical model for the Gnutella and SemPeer protocols that captures clustering-related issues, followed by a proposal to modify the SemPeer protocol to achieve moderate clustering. The modification is a form of link management for the individual nodes that allows the SemPeer protocol to be more efficient, because the probability of a successful query in the P2P network is appreciably increased. For the validation of the models, we ran a series of simulations that supported our results.

Keywords: Peer-to-Peer, model, performance, network management.

4132 Management Decision System for the Documentary Archives in the Library of a Public Moroccan Institution: Case of Sultan Moulay Slimane University, Beni Mellal

Authors: Jaouad Oukrich, Belaid Bouikhalene, Noureddine Askour

Abstract:

This paper deals with the problem of managing information resources in the libraries of the public institution Sultan Moulay Slimane University (SMSU), in order to analyze reader satisfaction and allow university leaders to make better strategic and timely decisions. To this end, the integration of a decision-support library management system is a priority program for higher education as part of the Digital Morocco plan, which has a proactive policy of developing the use of new information and communication technologies in higher education institutions. This operational information system can provide better services to students and to the leaders. Our approach is to integrate business intelligence (BI) tools into library management by using Power BI.

Keywords: PMB, integrated library management system, ILMS, document, SMSU, power BI, satisfaction.

4131 Time Series Forecasting Using Various Deep Learning Models

Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan

Abstract:

Time Series Forecasting (TSF) is used to predict target variables at a future time point based on learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based Transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and Transformer) along with a baseline method. The dataset (hourly) we used is the Beijing Air Quality Dataset from the website of the University of California, Irvine (UCI), which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window sizes and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131), for most of our single-step and multi-step predictions. The best size for the look-back window to predict 1 hour into the future appears to be one day, while 2 or 4 days perform best for predicting 3 hours into the future.
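
A small sketch of the look-back windowing and the two error metrics used above, applied to a synthetic hourly series with a simple persistence baseline; the window length and horizon are placeholders, not the paper's tuned values.

import numpy as np

def make_windows(series, look_back=24, horizon=1):
    # Turn a 1D series into (X, y) pairs: X holds `look_back` past values,
    # y holds the value `horizon` steps beyond the end of the window.
    X, y = [], []
    for i in range(len(series) - look_back - horizon + 1):
        X.append(series[i:i + look_back])
        y.append(series[i + look_back + horizon - 1])
    return np.array(X), np.array(y)

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

hourly = np.sin(np.arange(1000) * 2 * np.pi / 24) + np.random.randn(1000) * 0.1
X, y = make_windows(hourly, look_back=24, horizon=1)    # "one day" look-back, 1 hour ahead
naive = X[:, -1]                                        # persistence baseline: repeat the last value
print("MAE:", mae(y, naive), "RMSE:", rmse(y, naive))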

Keywords: Air quality prediction, deep learning algorithms, time series forecasting, look-back window.

4130 Fusion of Colour and Depth Information to Enhance Wound Tissue Classification

Authors: Darren Thompson, Philip Morrow, Bryan Scotney, John Winder

Abstract:

Patients with diabetes are susceptible to chronic foot wounds which may be difficult to manage and slow to heal. Diagnosis and treatment currently rely on the subjective judgement of experienced professionals. An objective method of tissue assessment is required. In this paper, a data fusion approach was taken to wound tissue classification. The supervised Maximum Likelihood and unsupervised Multi-Modal Expectation Maximisation algorithms were used to classify tissues within simulated wound models by weighting the contributions of both colour and 3D depth information. It was found that, at low weightings, depth information could show significant improvements in classification accuracy when compared to classification by colour alone, particularly when using the maximum likelihood method. However, larger weightings were found to have an entirely negative effect on accuracy.
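
As a generic illustration of weighting colour and depth contributions in a maximum likelihood classifier, the sketch below combines per-modality Gaussian log-likelihoods using a depth weight; the class names, feature dimensions and parameter values are invented for illustration and do not represent the paper's fitted model.

import numpy as np
from scipy.stats import multivariate_normal

def fused_classify(colour_x, depth_x, params, w_depth=0.2):
    # Combine per-class log-likelihoods from colour and depth with a depth weight.
    # `params` maps class -> ((mean_c, cov_c), (mean_d, cov_d)), fitted beforehand.
    best_cls, best_score = None, -np.inf
    for cls, ((mc, Sc), (md, Sd)) in params.items():
        score = ((1.0 - w_depth) * multivariate_normal.logpdf(colour_x, mc, Sc)
                 + w_depth * multivariate_normal.logpdf(depth_x, md, Sd))
        if score > best_score:
            best_cls, best_score = cls, score
    return best_cls

# Toy class parameters (purely illustrative RGB means and a 1D depth feature).
params = {
    "granulation": ((np.array([180.0, 60.0, 60.0]), np.eye(3) * 80.0),
                    (np.array([2.0]), np.array([[0.5]]))),
    "slough":      ((np.array([200.0, 190.0, 120.0]), np.eye(3) * 80.0),
                    (np.array([4.0]), np.array([[0.5]]))),
}
print(fused_classify(np.array([185.0, 70.0, 65.0]), np.array([2.2]), params))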

Keywords: Classification, data fusion, diabetic foot, stereophotogrammetry, tissue colour.

4129 Experimental Correlation for Erythrocyte Aggregation Rate in Population Balance Modeling

Authors: Erfan Niazi, Marianne Fenech

Abstract:

Red Blood Cells (RBCs), or erythrocytes, tend to form chain-like aggregates called rouleaux under low shear rates. This is a reversible process, and rouleaux disaggregate at high shear rates. Therefore, RBC aggregation occurs in the microcirculation, where low shear rates are present, but does not occur under normal physiological conditions in large arteries. Numerical modeling of RBC interactions is fundamental to analytical models of blood flow in the microcirculation. Population Balance Modeling (PBM) is particularly useful for studying problems where particles agglomerate and break up in two-phase flow systems in order to find flow characteristics. In this method, the elementary particles lose their individual identity due to continuous destruction and recreation by break-up and agglomeration. The aim of this study is to determine the RBC aggregation rate in a dynamic situation. A simplified PBM was used previously to find the aggregation rate from a static observation of RBC aggregation in a drop of blood under the microscope. To find the aggregation rate in a dynamic situation, we propose an experimental setup testing RBC sedimentation. In this test, RBCs interact and aggregate to form rouleaux. In this configuration, disaggregation can be neglected due to the low shear stress. A high-speed camera is used to acquire video-microscopic pictures of the process. The sizes of the aggregates and the sedimentation velocity are extracted using image processing techniques. Based on data collected from 5 healthy human blood samples, the aggregation rate was estimated as 2.7×10³ (±0.3×10³) s⁻¹.
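
The image processing step described above might look roughly like the sketch below, which segments dark aggregates in a grayscale frame by thresholding and measures their pixel areas with connected-component labelling; the threshold, the synthetic frame and the assumption that aggregates are darker than the background are illustrative, not the paper's calibrated pipeline. Tracking the areas and centroid positions across frames would give the aggregate growth and sedimentation velocity that feed the PBM aggregation-rate estimate.

import numpy as np
from scipy import ndimage

def aggregate_sizes(frame, threshold=0.5):
    # Segment dark aggregates in a grayscale frame (values in [0, 1]) and
    # return the pixel area of each connected component.
    mask = frame < threshold                       # aggregates assumed darker than the plasma
    labels, n = ndimage.label(mask)                # connected-component labelling
    return ndimage.sum(mask, labels, index=range(1, n + 1))

# Synthetic frame: a bright background with two dark blobs standing in for rouleaux.
frame = np.ones((64, 64))
frame[10:20, 10:14] = 0.2
frame[40:55, 30:33] = 0.3
print(aggregate_sizes(frame))                      # pixel areas of the detected aggregates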

Keywords: Red blood cell, Rouleaux, microfluidics, image processing, population balance modeling.

4128 Information Security Risk in Financial Institutions

Authors: James A. Nelson

Abstract:

The history of technology and banking is examined as it relates to risk and technological determinism. It is proposed that the services that banks offer are determined by technology and that banks must adopt new technologies to be competitive. The adoption of technologies paradoxically forces the adoption of further new technologies to protect the bank from the increased risk that technology brings. This cycle will lead bank examiners and regulators to focus on human behavior, not on the ever-changing technology.

Keywords: Banking, information security, risk, technological determinism.

4127 User Selections on Social Network Applications

Authors: C. C. Liang

Abstract:

MSN used to be the most popular application for communicating within social networks, but Facebook chat is now the most popular. Facebook and MSN have similar characteristics, including usefulness, ease of use, and a similar function, namely the exchange of information with friends, and Facebook outperforms MSN in both of these areas. However, the adoption of Facebook and the abandonment of MSN have occurred for other reasons. Functions can be improved, but users' willingness to use an application does not depend on functionality alone. Flow has been established as crucial to users' adoption of cyber applications and affects users' adoption of software applications. If users experience flow when using a software application, they will enjoy using it frequently, and may even change their preferred application from an old one to a new one. However, no investigation has examined choice behavior related to switching from MSN to Facebook based on a consideration of flow experiences and functions. This investigation discusses the flow experiences and functions of social-networking applications. Flow experience is found to affect perceived ease of use and perceived usefulness; perceived ease of use influences information exchange with friends and perceived usefulness; information exchange influences perceived usefulness, but information exchange has no effect on flow experience.

Keywords: Consumer behavior, social media, technology acceptance model.

4126 Omni: Data Science Platform for Evaluate Performance of a LoRaWAN Network

Authors: Emanuele A. Solagna, Ricardo S. Tozetto, Roberto dos S. Rabello

Abstract:

Nowadays, physical processes are becoming digitized through the evolution of communication, sensing and storage technologies, which promotes the development of smart cities. The evolution of these technologies has generated multiple challenges related to the generation of big data and the active participation of electronic devices in society. Devices can send information that is captured and processed over large areas, but there is no guarantee that all of the obtained data will be effectively stored and correctly persisted, because, depending on the technology used, there are parameters that have a huge influence on the complete delivery of information. This article characterizes the project, currently under development, of a platform that uses data science to evaluate the performance and effectiveness of an industrial network implementing LoRaWAN technology, considering the configuration of its main parameters and relating these parameters to information loss.

Keywords: Internet of Things, LoRa, LoRaWAN, smart cities.

4125 Learning Block Memories with Metric Networks

Authors: Mario Gonzalez, David Dominguez, Francisco B. Rodriguez

Abstract:

An attractor neural network on a small-world topology is studied. A learning pattern is presented to the network; then a stimulus carrying local information is applied to the neurons, and the retrieval of block-like structures is investigated. Synaptic noise decreases the memory capability. The change of stability from local to global attractors is shown to depend on the long-range character of the network connectivity.
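
A compact sketch of the kind of model described above: Hebbian couplings stored on a Watts-Strogatz small-world graph and a noisy pattern retrieved by iterated sign updates. The network size, degree, rewiring probability and noise level are illustrative choices, not the paper's parameters.

import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
N = 200
patterns = rng.choice([-1, 1], size=(3, N))      # stored binary patterns

# Small-world connectivity (Watts-Strogatz): local ring plus a few long-range shortcuts.
G = nx.watts_strogatz_graph(N, k=10, p=0.1, seed=1)
C = nx.to_numpy_array(G)                         # 0/1 adjacency mask

# Hebbian couplings restricted to the existing connections.
J = (patterns.T @ patterns) / N * C
np.fill_diagonal(J, 0.0)

# Retrieval: start from a noisy version of pattern 0 and iterate sign updates.
state = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)
for _ in range(20):
    state = np.sign(J @ state)
    state[state == 0] = 1
print("overlap with stored pattern:", (state @ patterns[0]) / N)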

Keywords: Hebbian learning, image recognition, small world, spatial information.

4124 Managing IT Departments in Higher Education Institutes: Coping with the Exponentially Growing Needs and Expectations

Authors: Balqees A. Al-Thuhli, Ali H. Al-Badi, Khamis Al-Gharbi

Abstract:

Information technology is changing rapidly, and users' expectations are also growing. Dealing with these changes in information technology while satisfying users' needs and expectations is a big challenge. IT managers need to explore new mechanisms and strategies to enable them to cope with such challenges.

The objectives of this research are to identify the significant challenges that IT managers in higher education institutes might face, given the high and ever-growing customer expectations, and to propose possible solutions for coping with such high-speed changes in information technology.

To achieve these objectives, interviews with IT professionals from different higher education institutes in Oman were conducted. In addition, documentation (printed and online) related to these institutions was studied, and an intensive review of the published literature was carried out.

The findings of this research are expected to give a better understanding of the challenges that might face the IT managers at higher education institutes. This acquired understanding is expected to highlight the importance of being adaptable and fast in keeping up with the ever-growing technological changes. Moreover, adopting different tools and technologies could assist IT managers in developing their organisations’ IT policies and strategies.

Keywords: Information technology, IT rapid changes, CIO roles, challenges, IT managers, coping mechanisms, users' expectations.

4123 The Success of E-Collaborative in E-Commerce: The Study of B2C Business in Thailand

Authors: Wanida Suwunniponth

Abstract:

The objectives of this research were to study the factors contributing to the success of e-collaboration in the e-commerce of B2C (Business-to-Customer) businesses in Bangkok, Thailand. The influencing factors included the organization, people, information technology, and the e-collaborative process. A questionnaire was used to collect data from 200 small e-commerce businesses, and path analysis was utilized as the tool for data analysis. The path analysis revealed that the factors concerning organization, people and information technology influenced the e-collaborative process and the success of e-collaboration, whereas the e-collaborative process factor determined its success. The findings suggest that B2C e-commerce businesses in Thailand should adopt an improvement approach in terms of managerial structure, leadership, staff skills and knowledge, and investment in information technology in order to enable a more efficient e-collaborative process that would result in profit and competitive advantage.

Keywords: E-collaborative, E-commerce, B2C.

4122 Customer Value Creation by CRM System in Electronic Device Companies

Authors: Hideki Kobayashi, Hiroshi Osada

Abstract:

The service industry accounts for about 70% of Japan's GDP, and the importance of service innovation has been pointed out. The importance of system use and support services is increasing in the information system business, which is one of the service industries. However, because CRM systems are often not used enough, the purpose for which they were originally intended frequently cannot be achieved. To promote the use of such systems, an effective service method is needed. Building a service model and clarifying the success factors are considered necessary to improve the operation service of the CRM system. In this research, a model of the operation service for CRM systems is developed.

Keywords: Information system, Operation service, Service innovation, Solution
