Search results for: computer virus classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4924

3784 Human Computer Interaction Using Computer Vision and Speech Processing

Authors: Shreyansh Jain Jeetmal, Shobith P. Chadaga, Shreyas H. Srinivas

Abstract:

The Internet of Things (IoT) is seen as the next major step in the ongoing revolution of the Information Age. It is predicted that in the near future, billions of embedded devices will communicate with each other to perform a plethora of tasks with or without human intervention. One of the major hotbeds of ongoing research activity in IoT is Human Computer Interaction (HCI). HCI is used to facilitate communication between an intelligent system and a user. An intelligent system typically consists of various sensors, actuators, and embedded controllers which communicate with each other to monitor data collected from the environment. The user typically communicates with the system by voice. One of the major applications of HCI is in home automation as a personal assistant. The prime objective of our project is to implement a use case of HCI for home automation. Our system is designed to detect and recognize the users and personalize the appliances in the house according to their individual preferences. Our HCI system can also speak with the user when certain commands are spoken, such as searching the web for information and controlling appliances. The system also monitors the environment in the house, such as air quality and gas leakages, for added safety.
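
As a rough illustration of the command-routing layer such a system needs, the sketch below maps recognized utterances to appliance actions. The device names and phrasing are hypothetical, and in the actual system the transcript would come from a speech-to-text front end; this is a minimal sketch, not the authors' implementation.

```python
# Minimal sketch of the command-routing idea, assuming hypothetical device
# names and a speech-to-text front end that already yields a lowercase
# transcript (e.g. "turn on the light").

def handle_command(transcript: str, appliances: dict) -> str:
    """Route a recognized utterance to a home-automation action."""
    if "search" in transcript:
        query = transcript.split("search for", 1)[-1].strip()
        return f"searching the web for: {query}"
    for name, device in appliances.items():
        if name in transcript:
            device["on"] = "on" in transcript and "off" not in transcript
            return f"{name} turned {'on' if device['on'] else 'off'}"
    return "command not recognized"

appliances = {"light": {"on": False}, "fan": {"on": False}}
print(handle_command("turn on the light", appliances))  # light turned on
```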

Keywords: human computer interaction, internet of things, computer vision, sensor networks, speech to text, text to speech, android

Procedia PDF Downloads 356
3783 Concussion Prediction for Speed Skater Impacting on Crash Mats by Computer Simulation Modeling

Authors: Yilin Liao, Hewen Li, Paula McConvey

Abstract:

Concussions in speed skaters often occur when skaters fall on the ice and impact the crash mats during practices and competition races. Gaining insight into these impact interactions is of essential interest, as they are directly related to skaters’ potential health risks and injuries. Precise concussion measurement is challenging, making computer simulation the only reliable way to analyze such accidents. This research aims to create a multi-body model of the crash mat and skater using SolidWorks, develop a computer simulation model of the skater-mat impact using ANSYS software, and predict the skater’s degree of concussion by evaluating the head injury criterion (HIC) from the resulting accelerations. The developed method and results help clarify the relationship between impact parameters and concussion risk for speed skaters, and inform the design of crash mats and skating rink layouts with athletes’ health risks specifically in mind.
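
For reference, the HIC mentioned above is a standard functional of the resultant head acceleration. Below is a minimal sketch of its evaluation, assuming a uniformly sampled acceleration trace in units of g; in the study the trace itself would come from the ANSYS simulation, and the pulse here is synthetic.

```python
# Brute-force evaluation of HIC = max over [t1, t2] of
# (t2 - t1) * (mean acceleration over [t1, t2]) ** 2.5,
# with the window capped at 36 ms (HIC36). Acceleration in g, time in s.
import numpy as np

def hic(t, a, max_window=0.036):
    # Trapezoidal cumulative integral of a(t) for fast window averages.
    cum = np.concatenate(([0.0], np.cumsum((a[1:] + a[:-1]) / 2 * np.diff(t))))
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > max_window:
                break
            avg = (cum[j] - cum[i]) / dt
            best = max(best, dt * avg ** 2.5)
    return best

t = np.linspace(0, 0.1, 1001)               # 100 ms impact, 0.1 ms sampling
a = 80 * np.exp(-((t - 0.05) / 0.01) ** 2)  # synthetic acceleration pulse (g)
print(f"HIC36 = {hic(t, a):.0f}")           # ~1000 is a commonly cited threshold
```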

Keywords: computer simulation modeling, concussion, impact, speed skater

Procedia PDF Downloads 134
3782 Fatigue Life Estimation of Tubular Joints - A Comparative Study

Authors: Jeron Maheswaran, Sudath C. Siriwardane

Abstract:

In fatigue analysis, the structural detail of the tubular joint has received great attention among engineers. DNV-RP-C203 covers this topic quite well for simple and clear joint cases. For complex joints and geometries, where joint classification is not available and the non-dimensional geometric parameters fall outside their validity range, engineers face real challenges. Joint classification is important to carry through the fatigue analysis; these joint configurations are identified by the connectivity and load distribution of the tubular joints. To overcome these problems to some extent, this paper compares the fatigue life of tubular joints in an offshore jacket obtained using the stress concentration factors (SCF) of DNV-RP-C203 with that obtained using the finite element method in Abaqus/CAE. The paper presents the geometric details, material properties, and considered load history of the jacket structure, and describes the global structural analysis and the identification of critical tubular joints for fatigue life estimation. Fatigue life is first determined based on the guidelines provided by the design codes; as the next major step, fatigue analysis of the tubular joints is conducted using the finite element method in Abaqus/CAE [4]. Finally, the obtained SCFs and fatigue lives are compared and their significance is discussed.
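
To make the code-based part of the comparison concrete, the sketch below evaluates fatigue life from a one-slope S-N curve of the form log N = log a - m·log S, as used in DNV-RP-C203, with the hot-spot stress taken as the SCF times the nominal stress range. All numerical values are illustrative placeholders, not the jacket's actual data.

```python
# Minimal sketch of a code-based fatigue life check: hot-spot stress from
# an SCF, allowable cycles from a one-slope S-N curve, life in years.
import math

def fatigue_life_years(nominal_stress_range, scf, log_a, m, cycles_per_year):
    """Hot-spot stress = SCF * nominal range; life from log N = log a - m log S."""
    hot_spot = scf * nominal_stress_range              # MPa
    n_allow = 10 ** (log_a - m * math.log10(hot_spot)) # allowable cycles
    return n_allow / cycles_per_year

# Placeholder inputs for illustration only (not from the paper):
print(fatigue_life_years(nominal_stress_range=40.0, scf=2.5,
                         log_a=12.48, m=3.0, cycles_per_year=5e6))
```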

Keywords: fatigue life, stress-concentration factor, finite element analysis, offshore jacket structure

Procedia PDF Downloads 443
3781 Chronic Hepatitis C Virus Screening: The Role, Strategies, and Challenges of Primary Healthcare in Augmenting Screening and Identifying Asymptomatic Infected Patients

Authors: Tarek K. Jalouta, Jolietta R. Holliman, Kathryn R. Burke, Kathleen M. Bewley-Thomas

Abstract:

Background: Chronic hepatitis C virus (HCV) infection is one of the leading causes of liver cirrhosis and hepatocellular carcinoma. In the United States, HCV screening awareness, treatment, and linkage to care continue to improve. However, millions of people remain asymptomatically infected and undiagnosed. Through this community mission, we sought to identify the best and newest strategies for finding those infected people, educating them, linking them to care, and curing them. Methods: We identified patients without a prior HCV screening in our electronic medical record (EMR) across all our hospital locations (south suburban Chicago and northern, western, and central Indiana). Education was provided to all primary care, gastroenterology, and infectious diseases providers and clinic staff to increase awareness of HCV screening. Health-related quality of life, chronic clinical complications, and demographic data were collected for each patient. All HCV antibody-reactive and HCV RNA-positive results were identified and statistically analyzed. Results: From July 2016 to July 2018, we screened 35,720 individuals of the birth cohort in our Franciscan health medical centers. Of the screened population, 986 (2.7%) individuals were HCV antibody-reactive. Of those, 319 (1%) patients were HCV RNA-positive, and 264 patients were counseled and linked to providers. 34 patients initiated anti-HCV therapy with successful treatment. Conclusions: Our HCV screening augmentation project is considered the largest screening program in the Midwest. Augmenting the HCV screening process by creating a Best Practice Alert (BPA) in the EMR (Epic Sys.) and point-of-care testing could be helpful. Although continued work is required, our team is working on increasing screening by adding the HCV test to CBC panels in emergency department settings and by phone calls to all birth cohort individuals through a robo-calling system, aiming to reach 75,000 individuals by 2019. However, a better linkage-to-care and referral monitoring system for all HCV RNA-positive patients is still needed, and access to therapy, especially for uninsured patients, remains challenging.

Keywords: chronic hepatitis C, chronic hepatitis C treatment, chronic hepatitis C screening, chronic hepatitis C prevention, liver cancer

Procedia PDF Downloads 120
3780 Investigating the Causes of Human Error-Induced Incidents in the Maintenance Operations of Petrochemical Industry by Using Human Factors Analysis and Classification System

Authors: Omid Kalatpour, Mohammadreza Ajdari

Abstract:

This article studied the possible causes of human error-induced incidents in petrochemical industry maintenance activities by using the Human Factors Analysis and Classification System (HFACS). The purpose of the study was to anticipate and identify these causes and to propose corrective and preventive actions. The maintenance department of a petrochemical company was selected for the research. A checklist of human error-induced incidents was developed based on the four HFACS main levels and nineteen sub-groups. The hierarchical task analysis (HTA) technique was used to identify maintenance activities and tasks. The main causes of possible incidents were identified by the checklist and recorded, and corrective and preventive actions were defined depending on priority. Analyzing the worksheets of 444 activities across the four HFACS levels showed that 37.6% of the causes were at the level of unsafe acts, 27.5% at the level of unsafe supervision, 20.9% at the level of preconditions for unsafe acts, and 14% at the level of organizational effects. Among the HFACS sub-groups, errors (24.36%), inadequate supervision (14.89%), and violations (13.26%) were the most frequent. According to the findings of this study, increasing the training effectiveness of operators and improving supervision are, respectively, the most important measures for decreasing human error-induced incidents in petrochemical industry maintenance.
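
The level-wise percentages above are simple frequency ratios over the checklist findings. A minimal sketch of that tally, using hypothetical per-level counts chosen only to be consistent with the 444 worksheets and the reported percentages:

```python
# Tally checklist findings per HFACS level and report percentage shares.
from collections import Counter

findings = (["unsafe acts"] * 167 + ["unsafe supervision"] * 122 +
            ["preconditions for unsafe acts"] * 93 +
            ["organizational effects"] * 62)   # hypothetical counts, total 444

counts = Counter(findings)
total = sum(counts.values())
for level, n in counts.most_common():
    print(f"{level:32s} {n:4d}  {100 * n / total:5.1f}%")
```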

Keywords: human error, petrochemical industry, maintenance, HFACS

Procedia PDF Downloads 230
3779 Stimulating Young Children's Social Interaction Behaviour through Computer Play Activities: The Role of Teacher and Parent Support

Authors: Mahani Razali, Nordin Mamat

Abstract:

The purpose of this study is to explore how computer technology is integrated into pre-school activities and its relationship with children's social interaction behaviour in the pre-school classroom. The major question of interest is the social interaction behaviour of children when using computers in the Malaysian pre-school classroom. The research is based on three main objectives: to identify children's social interaction during computer play activities, the teachers' role, and the parents' participation in developing children's social interaction. This qualitative study was carried out among 25 pre-school children, three teachers, and three parents as the research sample. Parents' support was assessed through their discussions, supervision, and communication at home. The data collection procedures involved structured observation, which identified social interaction behaviour among the pre-school children during computer play activities, and semi-structured interviews, which captured the perceptions of the teachers and parents on the children's acquired social interaction behaviour. Documentation analysis was used to triangulate the acquired information with the observations and interviews. The qualitative data were tabulated in a descriptive manner in frequency and percentage format. The study primarily focused on the elements of social interaction behaviour among the pre-school children. Findings revealed that the children showed positive social interaction behaviour during their computer play. The research concludes that the teachers' role and parents' support can improve children's social interaction behaviour through computer play activities. As a whole, this research highlights the significance of computer play activities in stimulating social interaction behaviour among pre-school children.

Keywords: early childhood, emotional development, parent support, play

Procedia PDF Downloads 356
3778 SCNet: A Vehicle Color Classification Network Based on Spatial Cluster Loss and Channel Attention Mechanism

Authors: Fei Gao, Xinyang Dong, Yisu Ge, Shufang Lu, Libo Weng

Abstract:

Vehicle color recognition plays an important role in traffic accident investigation. However, due to the influence of illumination, weather, and noise, it still faces challenges. In this paper, a vehicle color classification network based on a spatial cluster loss and a channel attention mechanism (SCNet) is proposed for vehicle color recognition. A channel attention module is applied to extract the features of color-representative regions of the vehicle and to reduce the weight of non-representative color regions in the channels. The proposed loss function, called the spatial clustering loss (SC-loss), consists of two channel-specific components: a concentration component and a diversity component. The concentration component forces all feature channels belonging to the same class to be concentrated through channel clustering. The diversity component imposes additional constraints on the channels through a mean distance coefficient, making them mutually exclusive in the spatial dimensions. In comparison experiments, the proposed method achieves state-of-the-art performance on the public VCD and VeRi datasets, with accuracies of 96.1% and 96.2%, respectively. In addition, an ablation experiment further shows that the SC-loss effectively improves the accuracy of vehicle color recognition.
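
The paper's exact attention module is not spelled out in the abstract, so the sketch below shows a generic squeeze-and-excitation style channel attention block of the kind described: it learns per-channel weights that can emphasize color-representative channels and suppress the rest. This is an illustrative stand-in, not SCNet's actual module or the SC-loss.

```python
# A squeeze-and-excitation style channel attention block (illustrative).
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):                  # x: (N, C, H, W)
        w = x.mean(dim=(2, 3))             # squeeze: global average pool per channel
        w = self.fc(w).unsqueeze(-1).unsqueeze(-1)
        return x * w                       # reweight channels by learned attention

feats = torch.randn(8, 64, 28, 28)
print(ChannelAttention(64)(feats).shape)   # torch.Size([8, 64, 28, 28])
```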

Keywords: feature extraction, convolutional neural networks, intelligent transportation, vehicle color recognition

Procedia PDF Downloads 170
3777 3-Dimensional (3D) Assessment of the Hippocampus in Alzheimer’s Disease

Authors: Mehmet Bulent Ozdemir, Sultan Çagirici, Sahika Pinar Akyer, Fikri Turk

Abstract:

Neuroanatomical appearance can be correlated with clinical or other characteristics of an illness. With the introduction of diagnostic imaging machines that produce 3D images of anatomic structures, calculating correlations between subjects and structural patterns has become possible. The aim of this study is to examine the 3D structure of the hippocampus in cases of Alzheimer's disease at different dementia severities. For this purpose, the MR scans of 62 female and 38 male patients (age range 52 to 88) were imported to the computer. A 3D model of each right and left hippocampus was developed with the computer-aided programme SURFdriver 3.5, and every reconstruction was performed by the same investigator. The hippocampi showed a range of appearances, from normal to abnormal. In conclusion, these results might improve the understanding of the correlation between morphological changes in the hippocampus and clinical staging in Alzheimer's disease.

Keywords: Alzheimer disease, hippocampus, computer-assisted anatomy, 3D

Procedia PDF Downloads 476
3776 DeepNIC: A Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (convolutional neural networks)? Would DL then become the universal tool for data classification? Current solutions consist of repositioning the variables in a 2D matrix using their correlation proximity, thereby obtaining an image whose pixels are the variables. We implement a technology, DeepNIC, that instead produces one image per variable, each analyzable by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision tree, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in 3 dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 super-parameters used in the Neurops; by varying these 2 super-parameters, we obtain a matrix of probabilities for each NIC. We can combine the 10 NICs with the functions AND, OR, and XOR, for a total number of combinations greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels, where the intensity of each pixel is proportional to the probability of the associated NIC and the color depends on the associated NIC. This image contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omics data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison over several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format, which opens up great perspectives in the analysis of metadata.
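
As a rough, illustrative sketch of the probability-to-pixel mapping described above: each NIC value is a probability that becomes a grey level, and the grid of probabilities obtained by varying the two super-parameters becomes an image per variable. The grid size, random values, and three-channel color coding below are placeholders, not the actual NIC computation.

```python
# Map a grid of NIC probabilities to a CNN-ready image (illustrative).
import numpy as np

rng = np.random.default_rng(0)
nic_probs = rng.random((64, 64))                   # one NIC over a super-parameter grid
grey = np.round(255 * nic_probs).astype(np.uint8)  # probability -> grey level
image = np.stack([grey] * 3, axis=-1)              # color-code per NIC (placeholder)
print(image.shape, image.dtype)                    # (64, 64, 3) uint8
```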

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 111
3775 A Mechanical Diagnosis Method Based on Vibration Fault Signal Down-Sampling and an Improved One-Dimensional Convolutional Neural Network

Authors: Bowei Yuan, Shi Li, Liuyang Song, Huaqing Wang, Lingli Cui

Abstract:

Convolutional neural networks (CNNs) have received extensive attention in the field of fault diagnosis, and many fault diagnosis methods use CNNs for fault type identification. However, when the amount of raw data collected by sensors is massive, the neural network faces a time-consuming classification task. In this paper, a mechanical fault diagnosis method based on vibration signal down-sampling and an improved one-dimensional convolutional neural network is proposed. Through robust principal component analysis, the low-rank feature matrix of a large amount of raw data can be separated, realizing down-sampling and reducing the subsequent computation. In the improved one-dimensional CNN, a smaller convolution kernel is used to reduce the number of parameters and the computational complexity, and regularization is introduced before the fully connected layer to prevent overfitting. In addition, the multiple connected layers generalize classification results well without cumbersome parameter adjustments. The effectiveness of the method is verified on signals from a centrifugal pump test bench, where the average test accuracy is above 98%. Compared with the traditional deep belief network (DBN) and support vector machine (SVM) methods, this method performs better.
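
The sketch below shows the improved 1D-CNN ingredients named above, small convolution kernels and regularization (here dropout) before the fully connected layers. The layer sizes, class count, and input length are illustrative, not the paper's architecture.

```python
# A compact 1D CNN with small (size-3) kernels and dropout regularization.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
    nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
    nn.Flatten(),
    nn.Dropout(0.5),                   # regularization before the FC layers
    nn.Linear(32 * 256, 64), nn.ReLU(),
    nn.Linear(64, 4),                  # e.g. four fault classes
)
x = torch.randn(8, 1, 1024)            # batch of down-sampled vibration signals
print(model(x).shape)                  # torch.Size([8, 4])
```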

Keywords: fault diagnosis, vibration signal down-sampling, 1D-CNN

Procedia PDF Downloads 125
3774 An Anthropometric and Postural Risk Assessment of Students in Computer Laboratories of a State University

Authors: Sarah Louise Cruz, Jemille Venturina

Abstract:

Ergonomics considers the capabilities and limitations of a person as they interact with tools, equipment, facilities, and tasks in their work environment. The workplace is one example of a physical work environment, be it a workbench or a desk. In school laboratories, sitting is the most common working posture of the students, who maintain a static sitting posture as they perform different computer-aided activities. The College of Engineering and the College of Information and Communication Technology of a state university comprise twenty-two computer laboratories. Students are not usually aware of the importance of sustaining proper sitting posture during their long computer laboratory activities. This study evaluates the perceived discomfort and working postures of students exposed to the current workplace design of the computer laboratories, utilizing Rapid Upper Limb Assessment (RULA), a body discomfort chart with Borg's CR-10 scale rating, and the Quick Exposure Checklist to assess posture and the current working conditions. The results of the study may help minimize the body discomfort experienced by the students. The researchers redesigned the individual workstations, including the working desk, sitting stool, and other workplace design components. The economic viability of each alternative was also considered, given that the study focused on improving the facilities of a state university.

Keywords: computer workstation, ergonomics, posture, students, workplace

Procedia PDF Downloads 304
3773 Combination of an Artificial Neural Network Model and a Geographic Information System for Predicting Water Quality

Authors: Sirilak Areerachakul

Abstract:

Water quality has prompted serious management efforts in many countries. Artificial neural network (ANN) models are developed as forecasting tools to predict water quality trends from historical data. This study endeavors to classify water quality automatically. The water quality classes are evaluated using six factor indices: pH value (pH), dissolved oxygen (DO), biochemical oxygen demand (BOD), nitrate nitrogen (NO3N), ammonia nitrogen (NH3N), and total coliform (T-Coliform). The methodology applies data mining techniques using multilayer perceptron (MLP) neural network models. The data cover 11 sites on the Saen Saep canal in Bangkok, Thailand, obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, during 2007-2011. The multilayer perceptron neural network achieves a high classification accuracy of 94.23% for the water quality of the Saen Saep canal. This encouraging result could subsequently be combined with GIS data to improve the classification accuracy significantly.
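
A minimal sketch of the MLP classification step described above, with the six water-quality indices as inputs; the data below are synthetic stand-ins for the Saen Saep monitoring records, and the toy labeling rule is purely illustrative.

```python
# Train an MLP classifier on six water-quality features (synthetic data).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.random((500, 6))                  # pH, DO, BOD, NO3N, NH3N, T-Coliform (scaled)
y = (X[:, 1] - X[:, 2] > 0).astype(int)   # toy quality class from DO vs. BOD

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2%}")
```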

Keywords: artificial neural network, geographic information system, water quality, computer science

Procedia PDF Downloads 334
3772 Quantitative Structure–Activity Relationship Analysis of Some Benzimidazole Derivatives by Linear Multivariate Method

Authors: Strahinja Z. Kovačević, Lidija R. Jevrić, Sanja O. Podunavac Kuzmanović

Abstract:

The relationship between the antibacterial activity of eighteen differently substituted benzimidazole derivatives and their molecular characteristics was studied using a chemometric QSAR (quantitative structure-activity relationships) approach. The QSAR analysis was carried out on inhibitory activity towards Staphylococcus aureus, using molecular descriptors as well as the minimal inhibitory concentration (MIC). Molecular descriptors were calculated from the optimized structures. Principal component analysis (PCA) followed by hierarchical cluster analysis (HCA) and multiple linear regression (MLR) was performed to select the molecular descriptors that best describe the antibacterial behavior of the investigated compounds and to determine the similarities between molecules. The HCA grouped the molecules into separate clusters of similar inhibitory activity. The PCA showed a classification of molecules very similar to the HCA and displayed which descriptors contribute to that classification. MLR equations that represent MIC as a function of the in silico molecular descriptors were established. The statistical significance of the estimated models was confirmed by standard statistical measures and cross-validation parameters (SD = 0.0816, F = 46.27, R = 0.9791, R2CV = 0.8266, R2adj = 0.9379, PRESS = 0.1116). These parameters indicate the possibility of applying the established chemometric models in predicting the antibacterial behaviour of the studied derivatives and of structurally very similar compounds.
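
A minimal sketch of the workflow described above: PCA on the descriptor matrix, then an MLR model for activity with leave-one-out cross-validation and a PRESS statistic. The descriptor values and coefficients below are synthetic placeholders, not the paper's data.

```python
# PCA + MLR with leave-one-out cross-validation on a synthetic QSAR set.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(18, 5))              # 18 derivatives x 5 descriptors
y = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.1]) + rng.normal(0, 0.1, 18)  # activity

print(PCA(n_components=2).fit(X).explained_variance_ratio_)
y_cv = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
press = np.sum((y - y_cv) ** 2)           # cross-validated PRESS statistic
print(f"PRESS = {press:.3f}")
```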

Keywords: antibacterial, benzimidazole, molecular descriptors, QSAR

Procedia PDF Downloads 358
3771 A Computer-Aided System for Tooth Shade Matching

Authors: Zuhal Kurt, Meral Kurt, Bilge T. Bal, Kemal Ozkan

Abstract:

Shade matching and reproduction is the most important element of success in prosthetic dentistry. Until recently, the shade matching procedure was implemented through the dentist's visual perception with the help of shade guides. Since many factors influence visual perception, tooth shade matching using visual devices (shade guides) is highly subjective and inconsistent. The subjective nature of this process has led to the development of instrumental devices: nowadays, colorimeters, spectrophotometers, spectroradiometers, and digital image analysis systems are used for instrumental shade selection. Instrumental devices have the advantage that readings are quantifiable and can be obtained more rapidly, simply, objectively, and precisely. However, these devices have noticeable drawbacks. For example, the translucent structure and irregular surfaces of teeth lead to measurement defects, and results acquired by devices with different measurement principles may be inconsistent. It is therefore necessary to search for new methods for the dental shade matching process. The digital camera, a computer-aided system device, has developed rapidly; advances in image processing and computing have resulted in the extensive use of digital cameras for color imaging, a much cheaper approach than traditional contact-type color measurement devices. Digital cameras can take the place of contact-type instruments for shade selection and overcome their disadvantages, as images of teeth show their morphology and color texture. In recent decades, a method was recommended to compare the color of shade tabs captured by a digital camera using color features; it showed that visual and computer-aided shade matching systems should be used in combination. Recent feature extraction techniques are mostly based on shape description and do not use color information. However, color is an essential property in depicting and extracting features from objects in the world around us. When local feature descriptors are extended with color information, by concatenating a color descriptor with a shape descriptor, the resulting descriptor is effective for visual object recognition and classification tasks. Because the color descriptor used in combination with a shape descriptor does not need to contain any spatial information, local histograms can be used. This local color histogram method remains reliable under photometric changes, geometrical changes, and variations in image quality. Hence, color-based local feature extraction methods are used to extract features, and the Scale Invariant Feature Transform (SIFT) descriptor is used for shape description in the proposed method. After combining these descriptors, the state-of-the-art descriptor known as Color-SIFT is used in this study. Finally, the image feature vectors obtained from a quantization algorithm are fed to classifiers such as k-Nearest Neighbor (kNN), Naive Bayes, or Support Vector Machines (SVM) to determine the label(s) of the visual object category or the match. In this study, SVMs are used as the classifiers for color determination and shade matching, and the experimental results are compared with other recent studies. It is concluded that the proposed method is a remarkable development in computer-aided tooth shade determination.
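
The sketch below illustrates the general Color-SIFT + SVM idea: SIFT descriptors computed per color channel are combined with a local color histogram and fed to an SVM. It requires opencv-python; the simple mean-pooling of descriptors stands in for the paper's quantization step, and the tooth images and labels are assumed placeholders.

```python
# Per-channel SIFT + color histogram features, classified with an SVM.
import cv2
import numpy as np
from sklearn.svm import SVC

def color_sift_features(image_bgr, n_bins=8):
    sift = cv2.SIFT_create()
    channel_descs = []
    for c in cv2.split(image_bgr):                    # SIFT on each color channel
        _, desc = sift.detectAndCompute(c, None)
        channel_descs.append(desc.mean(axis=0) if desc is not None
                             else np.zeros(128))
    hist = cv2.calcHist([image_bgr], [0, 1, 2], None,
                        [n_bins] * 3, [0, 256] * 3).flatten()
    hist /= hist.sum() + 1e-9                          # local color histogram
    return np.concatenate(channel_descs + [hist])

# Hypothetical usage, assuming tooth_images and shade_labels exist:
# X = np.array([color_sift_features(img) for img in tooth_images])
# clf = SVC(kernel="rbf").fit(X, shade_labels)         # shade matching
```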

Keywords: classifiers, color determination, computer-aided system, tooth shade matching, feature extraction

Procedia PDF Downloads 424
3770 A Study on the Impacts of Computer Aided Design on the Architectural Design Process

Authors: Halleh Nejadriahi, Kamyar Arab

Abstract:

Computer-aided design (CAD) tools have been used extensively by architects for several decades. CAD has evolved from a simple drafting tool into intelligent architectural software and a powerful means of communication for architects. It plays an essential role in the profession of architecture and is a basic tool for any architectural firm; given the high demand and competition in the architectural industry, it is not possible for a firm to compete without taking advantage of computer software. The aim of this study is to evaluate the impacts of CAD on the architectural design process from the conceptual level to the final product, particularly in architectural practice. It examines the range of benefits of integrating CAD into the industry and discusses the possible drawbacks limiting architects. The method of this study is qualitative, based on data collected from professionals' perspectives. The identified benefits and limitations of CAD in the architectural design process will raise professionals' awareness of the potential of CAD and its proper utilization in the industry, which would result in higher productivity along with better quality in architectural offices.

Keywords: architecture, architectural practice, computer aided design (CAD), design process

Procedia PDF Downloads 351
3769 Feature Selection Approach for the Classification of Hydraulic Leakages in Hydraulic Final Inspection Using Machine Learning

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Manufacturing companies are facing global competition and enormous cost pressure. The use of machine learning applications can help reduce production costs and create added value. Predictive quality secures product quality through data-supported predictions, using machine learning models as a basis for decisions on test results. Machine learning methods are able to process large amounts of data, deal with unfavourable row-column ratios, detect dependencies between the covariates and the given target, and assess the multidimensional influence of all input variables on the target. Real production data are often subject to highly fluctuating boundary conditions and unbalanced data sets; changes in production data manifest themselves in trends, systematic shifts, and seasonal effects. Machine learning applications therefore require intensive pre-processing and feature selection. Data pre-processing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets. Within the real data set of Bosch hydraulic valves used here, the comparability of the same production conditions within certain time periods can be identified by applying the concept drift method. Furthermore, a classification model is developed to evaluate the feature importance in different subsets within the identified time periods. By selecting comparable and stable features, the number of features used can be significantly reduced without a strong decrease in predictive power. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predicting the quality characteristics of workpieces. In this research, the AdaBoost classifier is used to predict the leakage of hydraulic valves based on geometric gauge blocks from machining, mating data from the assembly, and hydraulic measurement data from end-of-line testing. In addition, the most suitable methods are selected and accurate quality predictions are achieved.
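
A minimal sketch of the feature-selection step described above: fit an AdaBoost classifier on cross-process features and keep only the most important ones. The feature matrix and leakage rule below are synthetic stand-ins, not the Bosch data.

```python
# AdaBoost-based feature ranking for a leakage classifier (synthetic data).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(7)
X = rng.normal(size=(1000, 30))        # machining, assembly and end-of-line features
y = (X[:, 3] + 0.5 * X[:, 17] + rng.normal(0, 0.5, 1000) > 0).astype(int)  # leakage

clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = np.argsort(clf.feature_importances_)[::-1]
keep = ranked[:10]                     # reduced, most important feature subset
print("top features:", keep[:5], "training accuracy:", clf.score(X, y))
```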

Keywords: classification, machine learning, predictive quality, feature selection

Procedia PDF Downloads 157
3768 Roughness Discrimination Using Bioinspired Tactile Sensors

Authors: Zhengkun Yi

Abstract:

Surface texture discrimination using artificial tactile sensors has attracted increasing attention in the past decade, as it can endow technical and robot systems with a key missing ability. However, roughness, a major component of texture, has rarely been explored. This paper presents an approach for tactile surface roughness discrimination that includes two parts: (1) the design and fabrication of a bioinspired artificial fingertip, and (2) tactile signal processing for tactile surface roughness discrimination. The bioinspired fingertip comprises two polydimethylsiloxane (PDMS) layers, a polymethyl methacrylate (PMMA) bar, and two perpendicular polyvinylidene difluoride (PVDF) film sensors. This artificial fingertip mimics human fingertips in three aspects: (1) the elastic properties of the epidermis and dermis of human skin are replicated by the two PDMS layers of different stiffness, (2) the PMMA bar serves a role analogous to that of a bone, and (3) the PVDF film sensors emulate Meissner's corpuscles in both their location and their response to vibratory stimuli. Various extracted features and classification algorithms, including support vector machines (SVM) and k-nearest neighbors (kNN), are examined for tactile surface roughness discrimination. Eight standard rough surfaces with roughness values (Ra) of 50 μm, 25 μm, 12.5 μm, 6.3 μm, 3.2 μm, 1.6 μm, 0.8 μm, and 0.4 μm are explored. The highest classification accuracy of (82.6 ± 10.8)% is achieved using just one PVDF film sensor with a kNN (k = 9) classifier and the standard deviation feature.
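
A minimal sketch of the best-performing configuration reported above: the standard deviation of one PVDF sensor's signal as the single feature, classified with kNN (k = 9). The sensor traces below are synthetic placeholders whose spread merely grows with Ra.

```python
# Single std-dev feature + kNN (k = 9) over eight roughness classes.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

ra_values = [50, 25, 12.5, 6.3, 3.2, 1.6, 0.8, 0.4]    # eight surface classes (um)
rng = np.random.default_rng(3)
X, y = [], []
for label, ra in enumerate(ra_values):
    for _ in range(20):                                  # 20 trials per surface
        signal = rng.normal(0, 0.05 + 0.01 * np.log1p(ra), 2000)  # synthetic PVDF trace
        X.append([signal.std()])                         # single std-dev feature
        y.append(label)

scores = cross_val_score(KNeighborsClassifier(n_neighbors=9), X, y, cv=5)
print(f"accuracy: {scores.mean():.1%} +/- {scores.std():.1%}")
```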

Keywords: bioinspired fingertip, classifier, feature extraction, roughness discrimination

Procedia PDF Downloads 305
3767 2D and 3D Unsteady Simulation of the Heat Transfer in the Sample during Heat Treatment by Moving Heat Source

Authors: Zdeněk Veselý, Milan Honner, Jiří Mach

Abstract:

The aim of this work is to establish 2D and 3D models of the direct unsteady task of sample heat treatment by a moving heat source, employing a computer model based on the finite element method. The complex boundary condition on the heat-loaded sample surface is the essential feature of the task. The computer model describes the heat treatment of the sample as the heat source moves over the sample surface. The 2D task of the sample cross section serves as the basic model, and the possibilities of extending it from 2D to 3D are discussed. The effect of adding the third model dimension on the temperature distribution in the sample is shown, and the influence of various model parameters on the sample temperatures is compared. The influence of the heat source motion on the depth of material heat treatment is shown for several velocities of the movement. The presented computer model is prepared for use in the laser treatment of machine parts.
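
To illustrate the unsteady 2D task, the sketch below solves it with an explicit finite-difference scheme rather than the paper's FEM: a Gaussian surface heat source moves across the top edge of the sample cross section. All parameters are illustrative placeholders, not the laser-treatment values.

```python
# Explicit finite-difference solution of 2D unsteady heat conduction
# with a moving Gaussian surface source (illustrative stand-in for FEM).
import numpy as np

nx, ny, dx, dt, steps = 100, 40, 1e-3, 1e-3, 400    # grid spacing (m), time step (s)
alpha, v = 1e-5, 0.05                                # diffusivity (m^2/s), source speed (m/s)
T = np.zeros((ny, nx))                               # temperature rise field (K)
x = np.arange(nx) * dx

for n in range(steps):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    source = np.zeros_like(T)
    source[0, :] = 5e3 * np.exp(-((x - v * n * dt) / 0.003) ** 2)  # moving top source (K/s)
    T += dt * (alpha * lap + source)
    T[:, 0] = T[:, -1] = T[-1, :] = 0.0              # fixed far-field boundaries

print(f"peak temperature rise: {T.max():.1f} K")
```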

Keywords: computer simulation, unsteady model, heat treatment, complex boundary condition, moving heat source

Procedia PDF Downloads 386
3766 Information Technology Approaches to Literature Text Analysis

Authors: Ayse Tarhan, Mustafa Ilkan, Mohammad Karimzadeh

Abstract:

Science was considered part of philosophy in ancient Greece. By the nineteenth century, it was understood that philosophy was very inclusive and that social and human sciences such as literature, history, and psychology should be separated and perceived as autonomous branches of science. The computer, too, was first seen as a tool of mathematical science. Over time, computer science has grown to encompass every area in which technology exists, and its growth compelled the division of computer science into different disciplines, just as philosophy had been divided into different branches of science. Now there is almost no branch of science in which computers are not used. One of the newer autonomous disciplines of computer science is the digital humanities, and one of its areas is literature. The material of literature is words, and thanks to software tools created with computer programming languages, results that would take a literature researcher months to produce can be achieved quickly and objectively. This article introduces three different tools that literary researchers can use in their work. These tools were created with the computer programming languages Python and R and brought to the world of literature. The purpose of introducing them is to set an example for the future development of special tools or programs for Ottoman language and literature and to support such initiatives. The first example is a stylometry tool developed with the R language. The second is the Metrical Tool, developed with Python, which is used to measure data in poems. The third is Voyant Tools, a multifunctional and easy-to-use literature analysis tool.

Keywords: DH, literature, information technologies, stylometry, the metrical tool, voyant tools

Procedia PDF Downloads 144
3765 Management and Evaluation of Developing Medical Device Software in Compliance with Rules

Authors: Arash Sepehri bonab

Abstract:

One of the areas of critical development in medical devices has been software - as an integral component of a medical device, as a standalone device, and, more recently, as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is in itself not a criterion for its qualification or not as a medical device. It is, therefore, essential to clarify the criteria for the qualification of standalone software as a medical device. The number of software products and medical apps is continuously increasing, and so too is their use in health institutions (e.g., in hospitals and doctors' surgeries) for diagnosis and treatment. Within the last decade, the use of information technology in healthcare has taken on a growing role. Indeed, the adoption of an increasing number of computing devices has brought several benefits to the process of patient care and has allowed easier access to social and health care resources. At the same time, this trend gave rise to new challenges related to the use of these new technologies. Software used in healthcare can be classified as a medical device depending on the way it is used and on its functional characteristics, and if it is classified as a medical device, it must satisfy specific regulations. The aim of this work is to present a software development framework that allows the production of safe, high-quality medical device software, and to highlight the correspondence between each software development stage and the appropriate standard and/or regulation.

Keywords: medical devices, regulation, software, development, healthcare

Procedia PDF Downloads 104
3764 Emerging Cyber Threats and Cognitive Vulnerabilities: Cyberterrorism

Authors: Oludare Isaac Abiodun, Esther Omolara Abiodun

Abstract:

The purpose of this paper is to demonstrate that cyberterrorism exists and poses a threat to computer security and national security. Nowadays, people have become intensely dependent on computers, phones, the Internet, and Internet of Things systems to share information, communicate, conduct searches, etc. However, these network systems are at risk from various sources, known and unknown. The risks come from malicious individuals, groups, organizations, or governments who take advantage of vulnerabilities in computer systems to harvest sensitive information from people, organizations, or governments. In doing so, they engage in computer threats, crime, and terrorism, thereby making the use of computers insecure for others. The threat of cyberterrorism takes various forms and varies from country to country. These threats include disrupting communications and information, stealing data, destroying data, leaking and breaching data, interfering with messages and networks, and, in some cases, demanding financial rewards for stolen data. This study identifies many ways in which cyberterrorists utilize the Internet as a tool to advance their malicious missions, negatively affecting computer security and safety. It also identifies causes of disparate anomalous behaviors and the theoretical, ideological, and current forms of the cyberterrorism threat. As a countermeasure, this paper proposes the use of previous and current computer security models found in the literature to help counter cyberterrorism.

Keywords: cyberterrorism, computer security, information, internet, terrorism, threat, digital forensic solution

Procedia PDF Downloads 91
3763 Preliminary Evaluation of Decommissioning Wastes for the First Commercial Nuclear Power Reactor in South Korea

Authors: Kyomin Lee, Joohee Kim, Sangho Kang

Abstract:

Kori Unit 1, the first commercial nuclear power reactor in South Korea and a 587 MWe pressurized water reactor in operation since 1978, was permanently shut down in June 2017 without an additional operating license extension, and is scheduled to become the first South Korean nuclear power unit to enter the decommissioning phase. In this study, a preliminary evaluation of the decommissioning wastes for Kori Unit 1 was performed through the following series of steps. First, the plant inventory was investigated on the basis of various documents (i.e., equipment/component lists, construction records, and general arrangement drawings). Second, the radiological conditions of systems, structures, and components (SSCs) were established to estimate the amount of radioactive waste by waste classification. Third, the waste management strategies for Kori Unit 1, including waste packaging, were established. Fourth, the proper decontamination and dismantling (D&D) technologies were selected considering various factors. Finally, the amount of decommissioning waste by classification for Kori Unit 1 was estimated using the DeCAT program, developed by KEPCO-E&C for decommissioning cost estimation. The preliminary evaluation shows that the expected amounts of radioactive decommissioning waste are less than about 2% and 8% of the total waste generated (i.e., the sum of clean waste and radwaste) before and after waste processing, respectively, and that the majority of the contaminated material is carbon or alloy steel and stainless steel. In addition, within the availability of information, the results were compared with data from various decommissioning experiences and from international/national decommissioning studies. The comparison shows that the radioactive waste amounts from the Kori Unit 1 decommissioning are much less than those from plants decommissioned in the U.S. and comparable to those from plants in Europe; this difference results from the different disposal costs and clearance criteria (i.e., free release levels) between the U.S. and other countries. The preliminary evaluation performed using the methodology established in this study will provide useful information for decommissioning planning, covering the decommissioning schedule and the waste management strategy, including the transportation, packaging, handling, and disposal of radioactive wastes.

Keywords: characterization, classification, decommissioning, decontamination and dismantling, Kori 1, radioactive waste

Procedia PDF Downloads 206
3762 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features

Authors: Kyi Pyar Zaw, Zin Mar Kyu

Abstract:

Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system, and the recognition rate may be low or high depending on the extracted features. In this paper, 25 features per character are used for recognition. Character recognition comprises three basic steps: character segmentation, feature extraction, and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method for character segmentation. In the feature extraction step, features are extracted in two ways. First, 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. Second, the input character is divided into 16 blocks; for each block, the 8 feature values obtained through the eight-direction chain code frequency extraction method are summed to define a single feature per block, yielding 16 more features. The number-of-holes feature is used to cluster similar characters. With these features, almost all common Myanmar characters in various font sizes can be recognized. All 25 features are used in both the training part and the testing part. In the classification step, characters are classified by matching all the features of the input character against the already trained features of the characters.
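
A minimal sketch of the eight-direction chain code frequency feature described above: walk an ordered character contour, record the direction of each unit step among the 8 neighbourhood directions, and use the normalized frequency of each direction as a feature. The contour here is a toy square, not a Myanmar glyph.

```python
# Eight-direction chain code frequency extraction from an ordered contour.
import numpy as np

# 8 neighbourhood directions: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
DIRS = {(0, 1): 0, (-1, 1): 1, (-1, 0): 2, (-1, -1): 3,
        (0, -1): 4, (1, -1): 5, (1, 0): 6, (1, 1): 7}

def chain_code_frequencies(contour_points):
    """contour_points: ordered (row, col) pixels along the character contour."""
    freq = np.zeros(8)
    for (r0, c0), (r1, c1) in zip(contour_points, contour_points[1:]):
        freq[DIRS[(r1 - r0, c1 - c0)]] += 1
    return freq / max(freq.sum(), 1)       # the 8 normalized direction features

square = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0), (0, 0)]
print(chain_code_frequencies(square))      # equal E/S/W/N frequencies: 0.25 each
```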

Keywords: chain code frequency, character recognition, feature extraction, features matching, segmentation

Procedia PDF Downloads 311
3761 Partial Least Squares Regression for High-Dimensional and Highly Correlated Data

Authors: Mohammed Abdullah Alshahrani

Abstract:

This research investigates the use of partial least squares (PLS) methodology for addressing the challenges associated with high-dimensional, correlated data. Recent technological advancements have led to experiments producing data characterized by a large number of variables compared to observations, with substantial inter-variable correlations. Such data patterns are common in chemometrics, where near-infrared (NIR) spectrometer calibrations record chemical absorbance levels across hundreds of wavelengths, and in genomics, where copy number alterations (CNA) of thousands of genomic regions are recorded from cancer patients. PLS is a widely used method for analyzing high-dimensional data, functioning as a regression tool in chemometrics and a classification method in genomics; it handles data complexity by creating latent variables (components) from the original variables. However, applying PLS can present challenges. The study investigates key areas to address them, including unifying the interpretations across the three main PLS algorithms and exploring the unusual negative shrinkage factors encountered during model fitting. The research presents an alternative approach to the challenge of interpreting the predictor weights associated with PLS: sparse estimation of the predictor weights, using a penalty function that combines a lasso penalty for sparsity with a Cauchy distribution-based penalty to account for variable dependencies. The results demonstrate sparse and grouped weight estimates, aiding interpretation and prediction tasks in genomic data analysis. High-dimensional scenarios, where predictors outnumber observations, are common in regression applications, and ordinary least squares (OLS) regression, the standard method, performs inadequately with high-dimensional and highly correlated data. Since copy number alterations in key genes have been linked to disease phenotypes, accurate classification of gene expression data using regularized methods like PLS is important for regression and classification in bioinformatics and biology.
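
A minimal sketch of the setting described above: many highly correlated predictors (p much larger than n), reduced by PLS to a few latent components. The spectra-like data below are synthetic placeholders; OLS would be ill-posed on the same matrix.

```python
# PLS regression on synthetic high-dimensional, highly correlated data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n, p = 40, 500                          # far more variables than observations
latent = rng.normal(size=(n, 3))        # three true latent components
X = latent @ rng.normal(size=(3, p)) + 0.05 * rng.normal(size=(n, p))  # correlated
y = latent @ np.array([1.0, -0.5, 0.2]) + 0.1 * rng.normal(size=n)

pls = PLSRegression(n_components=3).fit(X, y)
print(f"R^2 on training data: {pls.score(X, y):.3f}")
```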

Keywords: partial least square regression, genetics data, negative filter factors, high dimensional data, high correlated data

Procedia PDF Downloads 44
3760 The Use of Computer-Aided Design in Small Contractors in a Local Area of Korea

Authors: Myunghoun Jang

Abstract:

A survey of small-size contractors in Jeju was conducted to investigate college graduates' computer-aided design (CAD) competence. Most small-size contractors use CAD software to review and update drawings submitted by an architect. This research also analyzed the architectural engineering curricula of several national universities. The CAD classes run 4 or 6 hours per week and use AutoCAD primarily. This paper proposes that a CAD class should run 6 hours per week, that 2D drawing should be the main theme of the curriculum, and that exercises in making 3D models should also be included. An improved method of evaluating reports and exercise results, for example via Internet cafes and real-time feedback using smartphones, is also necessary.

Keywords: CAD (Computer Aided Design), CAD education, education improvement, small-size contractor

Procedia PDF Downloads 264
3759 Classification of Multiple Cancer Types with Deep Convolutional Neural Network

Authors: Nan Deng, Zhenqiu Liu

Abstract:

Thousands of patients with metastatic tumors are diagnosed with cancers of unknown primary sites each year. The inability to identify the primary cancer site may lead to inappropriate treatment and unexpected prognosis. Nowadays, a large amount of genomic and transcriptomic cancer data has been generated by next-generation sequencing (NGS) technologies, and The Cancer Genome Atlas (TCGA) database has accrued thousands of human cancer tumors and healthy controls, providing an abundant resource for differentiating cancer types. Meanwhile, deep convolutional neural networks (CNNs) have shown high classification accuracy across large numbers of image object categories. Here, we take 25 primary tumor types and 3 normal tissues from TCGA, convert their RNA-Seq gene expression profiles to color images, and train, validate, and test a CNN classifier directly on these images. The performance results show that our CNN classifier can achieve >80% test accuracy on most of the tumors and normal tissues. Since the gene expression pattern of distant metastases is similar to that of their primary tumors, the CNN classifier may provide a potential computational strategy for identifying the unknown primary origin of metastatic cancer in order to plan appropriate treatment for patients.
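
A minimal sketch of the data preparation step described above: one sample's RNA-Seq expression vector is log-scaled and reshaped into a color image that a standard CNN can consume. The gene count, image size, and reshaping scheme are illustrative assumptions, not the paper's exact conversion.

```python
# Convert a gene-expression profile into a CNN-ready color image.
import numpy as np

def expression_to_image(profile, side=100):
    """Map one sample's gene-expression vector to a (side, side, 3) uint8 image."""
    padded = np.zeros(side * side * 3)
    values = np.log1p(np.asarray(profile, dtype=float))   # compress dynamic range
    values = values / (values.max() + 1e-9)               # scale to [0, 1]
    padded[:values.size] = values[:padded.size]           # pad or truncate to fit
    return np.round(255 * padded).astype(np.uint8).reshape(side, side, 3)

profile = np.random.default_rng(2).gamma(2.0, 100.0, 20000)  # ~20k genes (TCGA-like)
img = expression_to_image(profile)
print(img.shape, img.dtype)             # (100, 100, 3) uint8, ready for a CNN
```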

Keywords: bioinformatics, cancer, convolutional neural network, deep learning, gene expression pattern

Procedia PDF Downloads 296
3758 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning

Authors: Pooja Khanal, Huaming Zhang

Abstract:

Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when reporters report a bug, they try to assign some predefined label to it. Issues are reported per project, and each project is a repository on GitHub/GitLab containing multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned in different repositories may depend on various factors such as human instinct, generalization of labels, the label assignment policy followed by the reporter, etc. While one reporter may instinctively give an issue a certain label, another person reporting the same issue may label it differently. Thus, it is not known mathematically whether a label in one repository is similar to or different from the label in another repository. Hence, the primary goal of this research is to find the semantic differences between the bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first, using text features from the reported issues; an optimal classifier may be a combination of multiple classifiers stacked together. Those classifiers are then used to cross-test other repositories, which allows the similarity of labels to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels across repositories.
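
A minimal sketch of the cross-testing idea described above: a text classifier trained on one repository's labeled issues is evaluated on another repository's, so label agreement can be quantified. The issue texts, labels, and pipeline below are toy placeholders, not the project's stacked classifiers.

```python
# Train on repository A, cross-test on repository B (toy data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

repo_a = (["crash when clicking the settings button"] * 5 +
          ["token leaked in the debug logs"] * 5)
labels_a = ["User Interface"] * 5 + ["Security"] * 5

repo_b = ["settings dialog misaligned on resize",
          "auth token stored in plain text logs"]
labels_b = ["User Interface", "Security"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(repo_a, labels_a)
print("cross-repository agreement:", clf.score(repo_b, labels_b))
```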

Keywords: bug classification, bug labels, GitHub issues, semantic differences

Procedia PDF Downloads 191
3757 Computer-Aided Teaching of Transformers for Undergraduates

Authors: Rajesh Kumar, Roopali Dogra, Puneet Aggarwal

Abstract:

In the era of technological advancement, the use of computer technology has become inevitable, and integrating software methods into the engineering curriculum is needed to boost pedagogy. Simulation software is a great help to graduates of disciplines such as electrical engineering. Since electrical engineering deals with high voltages and heavy instruments, extra care must be taken while operating them, which calls for appropriate control. Appropriate control can only be designed well if engineers know the kinds of waveforms associated with the system. Though these waveforms can be plotted manually, doing so consumes a lot of time; simulation helps engineers understand the steady state of the system and thus achieve better performance. In this paper, computer-aided teaching of the transformer is carried out using MATLAB/Simulink. The tests carried out on the transformer are the open-circuit test and the short-circuit test, and the transformer parameters are then calculated in Simulink from the values obtained in these tests.
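
For reference, the parameter calculations that follow the two tests use standard relations: the open-circuit test gives the core (shunt) branch, and the short-circuit test gives the equivalent series impedance. A minimal sketch with illustrative test readings (not from the paper):

```python
# Transformer equivalent-circuit parameters from OC and SC test data.
import math

# Open-circuit test (LV side): Voc (V), Ioc (A), Poc (W)
Voc, Ioc, Poc = 230.0, 0.45, 42.0
Rc = Voc**2 / Poc                         # core-loss resistance
theta = math.acos(Poc / (Voc * Ioc))      # no-load power-factor angle
Xm = Voc / (Ioc * math.sin(theta))        # magnetizing reactance

# Short-circuit test (HV side): Vsc (V), Isc (A), Psc (W)
Vsc, Isc, Psc = 95.0, 8.7, 310.0
Zeq = Vsc / Isc                           # equivalent impedance
Req = Psc / Isc**2                        # equivalent winding resistance
Xeq = math.sqrt(Zeq**2 - Req**2)          # equivalent leakage reactance

print(f"Rc={Rc:.1f} ohm, Xm={Xm:.1f} ohm, Req={Req:.2f} ohm, Xeq={Xeq:.2f} ohm")
```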

Keywords: computer aided teaching, open circuit test, short circuit test, simulink, transformer

Procedia PDF Downloads 367
3756 COVID-19 Lockdown Experience of Elderly Females as Reflected in Their Artwork

Authors: Liat Shamri-Zeevi, Neta Ram-Vlasov

Abstract:

Today the world as a whole is attempting to cope with COVID-19, which has affected all facets of personal and social life, from country-wide confinement to maintaining social distance and taking protective hygiene measures. One of the populations facing the most severe restrictions is seniors. Various studies have shown that creativity plays a crucial role in dealing with crisis events: painting, regardless of media, allows emotional and cognitive processing of these situations and enables experiences to be expressed in a tangible, creative way that conveys and endows meaning. The current study was conducted in Israel immediately after a six-week lockdown and was designed to examine the impact of the COVID-19 pandemic on the quality of life of elderly women as reflected in their artworks. The sample was composed of 21 Israeli women aged 60-90 in good mental health (without diagnosed dementia or Alzheimer's), all of whom were Hebrew-speaking and retired with extended families, and all of whom indicated that they painted and had engaged in artwork on an ongoing basis throughout the lockdown (from March 12 to May 30, 2020). The participants' artworks were collected, and a semi-structured in-depth interview lasting one to two hours was conducted; the participants were asked about their feelings during the pandemic and the artworks they produced during this time, and they completed a questionnaire on well-being and mental health. The initial analysis of the interviews and artworks revealed themes related to the specific role of each piece of artwork. The first theme included notions that the artwork was an activity and a framework for doing, which supported positive emotions and provided a sense of vitality during the closure; most participants painted images of nature and growth, to which they ascribed concrete and symbolic meaning. The second theme was that the artwork enabled the processing of difficult and/or conflicting emotions related to the situation, including anxiety about death and loneliness, expressed symbolically in images such as the coronavirus and respiratory machines. The third theme suggested that the time and space prompted by the lockdown allowed a gathering together of the self and freed up time for creative activities; many participants stated that they painted more, and more frequently, during the lockdown. Additional themes and findings will be presented at the conference.

Keywords: Corona virus, artwork, quality of life of elderly

Procedia PDF Downloads 136
3755 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on an encoding scheme (e.g., the Fisher Vector or the Vector of Locally Aggregated Descriptors) over low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, the deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. For scene classification, scenes contain scattered objects of different sizes, categories, layouts, and numbers, so it is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales, producing dense local activations. Analyzing the performance of the different CNNs at multiple scales shows that each CNN works better in a different scale range; a scale-wise CNN adaptation is therefore reasonable, since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and these are merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences; hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, a simple yet efficient way to classify the scene categories. Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
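
A minimal sketch of the scale-wise normalization and pooling step described above: one Fisher-vector-like global descriptor per scale is normalized per scale and then average-pooled into a single representation. The per-scale encodings below are random placeholders standing in for the actual Fisher Vector aggregation.

```python
# Scale-wise normalization followed by average pooling over scales.
import numpy as np

rng = np.random.default_rng(4)
scales = [rng.normal(size=4096) * s for s in (1.0, 5.0, 0.2)]   # one encoding per scale

normalized = [v / (np.linalg.norm(v) + 1e-12) for v in scales]  # scale-wise L2 norm
pooled = np.mean(normalized, axis=0)                            # average pooling
print(pooled.shape, np.linalg.norm(pooled))   # single 4096-d scene representation
```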

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 324