Search results for: comparative morphological features
6038 Effect of Amine-Functionalized Carbon Nanotubes on the Properties of CNT-PAN Composite Nanofibers
Authors: O. Eren, N. Ucar, A. Onen, N. Kızıldag, O. F. Vurur, N. Demirsoy, I. Karacan
Abstract:
PAN nanofibers were reinforced with amine-functionalized carbon nanotubes (CNT). The effects of amine functionalization and of CNT concentration on the conductivity and on the mechanical and morphological properties of the composite nanofibers were examined. The PAN/CNT nanofiber loaded with 1% CNT-NH2 showed the best mechanical properties. Conductivity increased with the incorporation of carbon nanotubes. While an increase in CNT concentration increases the nanofiber diameter, the use of functionalized CNT results in a decrease in nanofiber diameter.
Keywords: amine functionalized carbon nanotube, electrospinning, nanofiber, polyacrylonitrile
Procedia PDF Downloads 307
6037 Next Generation of Tunnel Field Effect Transistor: NCTFET
Authors: Naima Guenifi, Shiromani Balmukund Rahi, Amina Bechka
Abstract:
The Tunnel FET is one of the most suitable alternative FET devices to conventional CMOS technology for low-power electronics and applications. Owing to its lower subthreshold swing (SS), it is a strong candidate for low-power applications. It is a quantum FET device that relies on band-to-band (B2B) tunneling of charge carriers; because of this transport mechanism, the Tunnel FET suffers from a lower switching current than the conventional metal-oxide-semiconductor field-effect transistor (MOSFET). To improve the device characteristics and overcome these limitations, the newly introduced negative-capacitance concept of ferroelectric materials is implemented in the conventional Tunnel FET structure, popularly known as the NC TFET. The present research work implements the idea of a high-k gate dielectric combined with a ferroelectric material on a double-gate Tunnel FET to realize negative capacitance. It has been observed that negative capacitance further improves device features such as the SS value, and it helps to reduce power dissipation and switching energy. An extensive investigation of the digital, analog/RF, and linearity features of the double-gate NC TFET has been carried out. Several essential design parameters for analog/RF performance and linearity, such as transconductance (gm), the transconductance generation factor (gm/IDS), its higher-order derivatives (gm2, gm3), cut-off frequency (fT), and gain-bandwidth product (GBW), have been investigated for low-power RF applications. The VIP₂, VIP₃, IMD₃, IIP₃, distortion characteristics (HD2, HD3), 1-dB compression point, delay, and power-delay product have also been thoroughly studied.
Keywords: analog/digital, ferroelectric, linearity, negative capacitance, Tunnel FET, transconductance
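As background for why negative capacitance steepens switching: in a conventional MOS-type device, the subthreshold swing is bounded by the body factor. A standard textbook form (added here for orientation, not taken from the abstract) is:

```latex
SS = \frac{\partial V_{G}}{\partial \log_{10} I_{D}}
   = \ln(10)\,\frac{kT}{q}\left(1 + \frac{C_{s}}{C_{ins}}\right)
   \;\ge\; 60\ \mathrm{mV/dec}\quad \text{at } T = 300\,\mathrm{K},
```

where $C_{s}$ is the semiconductor (depletion) capacitance and $C_{ins}$ the gate-insulator capacitance. A ferroelectric layer in the gate stack can contribute an effective negative capacitance, so the factor $1 + C_{s}/C_{ins}$ can drop below 1; combined with B2B tunneling transport, which is not bound by the thermal limit at all, this is what allows the NC TFET to reach very low SS values.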
Procedia PDF Downloads 193
6036 Local Texture and Global Color Descriptors for Content Based Image Retrieval
Authors: Tajinder Kaur, Anu Bala
Abstract:
An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. A new algorithm for content-based image retrieval (CBIR) is presented in this paper. The proposed method combines color and texture features, which are extracted from the global and local information of the image. The local texture feature is extracted using local binary patterns (LBP), which are evaluated by taking into consideration the local difference between the center pixel and its neighbors. For the global color feature, the color histogram (CH) is used, calculated over the RGB (red, green, and blue) channels separately. The performance of the proposed method is tested on the Corel 1000 database, a database of natural images. The results show a significant improvement in the evaluation measures as compared to LBP and CH alone.
Keywords: color, texture, feature extraction, local binary patterns, image retrieval
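As a concrete illustration of the two descriptors named above, here is a minimal NumPy sketch (not the authors' implementation; the bin counts, gray conversion, and array shapes are assumptions) of an LBP texture histogram combined with per-channel RGB color histograms:

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbor local binary pattern codes for a 2-D grayscale array."""
    g = np.asarray(gray).astype(np.int32)
    c = g[1:-1, 1:-1]                                   # center pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]        # 8 neighbors, clockwise
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit       # thresholded neighbor -> one bit
    return code

def color_histogram(rgb, bins=16):
    """Global color feature: per-channel RGB histograms, concatenated, L1-normalized."""
    hists = [np.histogram(rgb[..., ch], bins=bins, range=(0, 256))[0] for ch in range(3)]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()

def descriptor(rgb):
    """Combined descriptor: global color histogram + local LBP texture histogram."""
    lbp = lbp_image(np.asarray(rgb).mean(axis=2))       # LBP on a simple gray conversion
    lbp_hist = np.histogram(lbp, bins=256, range=(0, 256))[0].astype(float)
    return np.concatenate([color_histogram(rgb), lbp_hist / lbp_hist.sum()])
```

Retrieval would then rank database images by a histogram distance (e.g. chi-square) between their descriptors and the query's.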
Procedia PDF Downloads 365
6035 Clustering Based Level Set Evaluation for Low Contrast Images
Authors: Bikshalu Kalagadda, Srikanth Rangu
Abstract:
The main objective of image segmentation is to extract objects with respect to some input features. One important method for image segmentation is the level set method. Medical images and synthetic images often have low-contrast pixel profiles, which makes it difficult to locate the features of interest. A conventional level set function develops irregularities while evolving the object contour, which destroys the stability of the evolution process. As a remedy for this problem, a new hybrid algorithm, Clustering Level Set Evolution, is proposed. Kernel fuzzy particle swarm optimization clustering is combined with the Distance Regularized Level Set (DRLS) and the Selective Binary and Gaussian Filtering Regularized Level Set (SBGFRLS) methods. The ability to identify different regions improves, with increased speed. The efficiency of the modified method can be evaluated by comparison with the previous method under similar specifications, using medical and synthetic images.
Keywords: segmentation, clustering, level set function, re-initialization, kernel fuzzy, swarm optimization
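For reference, the distance-regularized level set evolution that the hybrid method builds on is commonly written as (this is the standard DRLSE formulation from the literature, sketched here for context, not an equation from the abstract):

```latex
\frac{\partial \phi}{\partial t}
  = \mu\,\operatorname{div}\!\big(d_{p}(|\nabla\phi|)\,\nabla\phi\big)
  + \lambda\,\delta_{\varepsilon}(\phi)\,
    \operatorname{div}\!\left(g\,\frac{\nabla\phi}{|\nabla\phi|}\right)
  + \alpha\, g\,\delta_{\varepsilon}(\phi),
```

where the first term is the distance-regularization term that keeps $|\nabla\phi|$ close to 1 and removes the need for re-initialization (exactly the irregularity and stability problem described above), $g$ is an edge-indicator function, and $\delta_{\varepsilon}$ is a smoothed Dirac delta. In the proposed hybrid, the kernel fuzzy PSO clustering supplies the initial regions for this evolution.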
Procedia PDF Downloads 351
6034 Intelligent Production Machine
Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan
Abstract:
In this study of production machines, the aim is for the machine to automatically perceive cutting data and alter the cutting parameters. The two most important parameters to be checked in the machine control unit are the feed rate and the spindle speed, and these parameters are to be controlled through the sounds of the machine. The features of the optimum sound are introduced to the computer. During the process, real-time data is received and converted into numerical values by Matlab software. According to these values, the feed and speed are decreased or increased at a certain rate until the optimum sound is acquired, and the cutting process is then carried out at the optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of material being cut, the cutting parameters, and the machine used all affect various quantities. Instead of measuring quantities such as temperature, vibration, and tool wear that emerge during the cutting process, detailed analysis of the sound emitted during cutting provides detection of various data involved in the cutting process in a much easier and more economical way. The relation between cutting parameters and sound is being identified.
Keywords: cutting process, sound processing, intelligent lathe, sound analysis
Procedia PDF Downloads 332
6033 Work-Integrated Learning Practices: Comparative Case Studies across Three Countries
Authors: Shairn Hollis-Turner
Abstract:
The changing demands of workplace practice in the field of business information and administration have placed considerable pressure on educators to prepare students for the world of work. In this paper, we argue that appropriate forms of work-integrated learning (WIL) could enhance learning experiences in higher education and support educators to meet industry needs for changing times. The study aims to enhance business information and administration education from a practice perspective. The guiding research question is: How can a systematic understanding of work-integrated learning practices enhance learning experiences in higher education? The research design comprised comparative case studies across three countries and was framed by Activity Theory. Analysis of the findings highlighted the similarities across WIL systems for higher education practices and the differences within the activity systems. The findings showed similarities in program practice, content, placement, and in the struggles of students to find placements. The findings also showed misalignments between WIL preparation, delivery, and future focus of WIL at these institutions. The findings suggest that employment requirements vary across countries and that systems could be improved to meet the demands of workplace practice for changing times for the benefit of students’ learning and employability.
Keywords: business administration, business information, knowledge, post graduate diploma
Procedia PDF Downloads 50
6032 Demetallization of Crude Oil: Comparative Analysis of Deasphalting and Electrochemical Removal Methods of Ni and V
Authors: Nurlan Akhmetov, Abilmansur Yeshmuratov, Aliya Kurbanova, Gulnar Sugurbekova, Murat Baisariyev
Abstract:
Extraction of vanadium and nickel compounds is complex due to the high stability of porphyrins. Nickel is a catalytic poison that deactivates catalysts during the catalytic cracking of oil, while vanadyl is abrasive and vanadium is a valuable metal. Thus, the high concentration of Ni and V in crude oil makes their removal relevant. Two methods of demetallization of crude oil were tested; the present research is therefore a comparative analysis of deasphalting with organic solvents (cyclohexane, carbon tetrachloride, chloroform) and an electrochemical method. The percentage of Ni extraction reached a maximum of approximately 55% using the electrochemical method in an electrolysis cell, which was developed for this research and consists of three sections: an oil and protonating agent (EtOH) solution between two conducting membranes, which separate it from two capsules of 10% sulfuric acid, and two graphite electrodes that connect all three parts in an electrical circuit. Ions of the metals pass through the membranes and remain in the acid solutions. The best result was obtained in 60 minutes with an ethanol-to-oil ratio of 25% to 75%, with the current in the range from 0.3 A to 0.4 A and the voltage changing from 12.8 V to 17.3 V. The maximum efficiency of deasphalting, with cyclohexane as the solvent, in a Soxhlet extractor was 66.4% for Ni and 51.2% for V. Applying voltammetry, ICP-MS (inductively coupled plasma mass spectrometry), and AAS (atomic absorption spectroscopy), these two types of metal extraction methods were compared in this paper.
Keywords: electrochemistry, deasphalting of crude oil, demetallization of crude oil, petroleum engineering
Procedia PDF Downloads 232
6031 The Results of the Archaeological Excavations at the Site of Qurh in Al Ula Region
Authors: Ahmad Al Aboudi
Abstract:
The Department of Archaeology at King Saud University has conducted long-term excavations since 2004 at the archaeological site of Qurh in the Al-Ula area. The history of the site goes back to the eighth century AD. The main aim of the excavations is to train students in archaeological fieldwork and the associated scientific skills of exploring, surveying, classifying, documenting, and other skills necessary in field archaeology. During the 12th season of excavations, an area of 20 × 40 m of the site was excavated, to a depth of 2-3 m. Many architectural features of a residential area in the northern part of the site were uncovered this season: circular walls made of mud-brick, brick column drums, and tiles made of clay were revealed. Additionally, many finds such as gemstones, jars, ceramic plates, metal, glass, and fabric, as well as some jewelry and coins, were discovered. This paper deals with the main results of this field project, including the architectural features and phenomena and their interpretation, the classification of the excavated material culture remains, and the stratigraphy.
Keywords: Islamic architecture, Islamic art, excavations, early Islamic city
Procedia PDF Downloads 273
6030 Securing the Electronic Commerce - The Way Forward: A Comparative Analysis
Authors: Sarthak Mishra, Astha Sinha
Abstract:
There is no doubt about the convenience of making commercial and business transactions over the Internet under the new business model known as e-Commerce. The term 'electronic commerce' or e-Commerce refers to the use of an electronic medium to carry out commercial transactions. E-Commerce is one part of the Information Science framework, and its use is gradually becoming popular. Thus, the threat of security issues in Information Science has now become an important subject of discussion amongst the concerned users. These two issues, security and privacy, need to be examined from social, organizational, technical, and economic perspectives. The current paper analyses the effect of these two issues in the arena of e-commerce. No particular specification is discussed here; rather, an attempt has been made to provide a general overview. Further, the security and privacy issues are discussed in relation to e-Commerce financial transactions. We shall also discuss the different steps required to be taken before online shopping, as well as the purpose of security and privacy in e-Commerce and why they have currently become the need of the hour. Lastly, an attempt has been made to discuss the plausible future course of development of this practice and its impact upon the global economy, and whether any changes should be brought about to ensure a smooth evolution of the practice. This paper adopts a descriptive methodology, wherein the major source of information has been secondary resources. The study is also of a comparative nature, wherein the positions of various national regimes are compared with regard to the research question.
Keywords: business-business transaction (B2B), business-consumer transaction (B2C), e-commerce, online transaction, privacy and security threats
Procedia PDF Downloads 230
6029 Integrating Cyber-Physical System toward Advance Intelligent Industry: Features, Requirements and Challenges
Authors: V. Reyes, P. Ferreira
Abstract:
In response to high levels of competitiveness, industrial systems have evolved to improve productivity. As a consequence, a rapid increase in production volume and, simultaneously, a customization process require lower costs, more variety, and accurate product quality. Reducing production cycle time, enabling customizability, and ensuring continuous quality improvement are key features of the advanced intelligent industry. In this scenario, customers and producers will be able to participate in the ongoing production life cycle through real-time interaction. To achieve this vision, transparency, predictability, and adaptability are key features that give industrial systems the capability to adapt to customer demands, modifying the manufacturing process through an autonomous response and acting preventively to avoid errors. The industrial system incorporates a diversified number of components that, in the advanced industry, are expected to be decentralized, communicating end to end, and capable of making their own decisions through feedback. The evolution toward the advanced intelligent industry defines a set of stages to endow components with intelligence and enhance efficiency up to the decision-making stage. The integrated system follows an industrial cyber-physical system (CPS) architecture whose real-time integration, based on a set of enabler technologies, links the physical and virtual worlds, generating the digital twin (DT). This instance allows incorporating sensor data from the real world into the virtual world and provides the transparency required for real-time monitoring and control, contributing to important features of the advanced intelligent industry while simultaneously improving sustainability.
Assuming the industrial CPS as the core technology toward the latest advanced intelligent industry stage, this paper reviews and highlights the correlations and contributions of the enabler technologies for the operationalization of each stage on the path toward the advanced intelligent industry. From this research, a real-time integration architecture for a cyber-physical system with applications to collaborative robotics is proposed, and the functionalities and issues required to endow the industrial system with adaptability are identified.
Keywords: cyber-physical systems, digital twin, sensor data, system integration, virtual model
Procedia PDF Downloads 116
6028 A Machine Learning-Based Model to Screen Antituberculosis Compound Targeted against LprG Lipoprotein of Mycobacterium tuberculosis
Authors: Syed Asif Hassan, Syed Atif Hassan
Abstract:
Multidrug-resistant tuberculosis (MDR-TB) is an infection caused by resistant strains of Mycobacterium tuberculosis (MTB) that do not respond to isoniazid or rifampicin, the most important anti-TB drugs. The increasing occurrence of drug-resistant strains of MTB calls for an intensive search for novel target-based therapeutics. In this context, LprG (Rv1411c), a lipoprotein from MTB, plays a pivotal role in the immune evasion of Mtb, leading to the survival and propagation of the bacterium within the host cell. Therefore, a machine learning method will be developed to generate a computational model that can predict the potential anti-LprG activity of novel antituberculosis compounds. The present study will utilize a dataset from the PubChem database maintained by the National Center for Biotechnology Information (NCBI). The dataset involves compounds screened against MTB, categorized as active or inactive based on the PubChem activity score. PowerMV, a molecular descriptor generator and visualization tool, will be used to generate the 2D molecular descriptors for the active and inactive compounds in the dataset. The 2D molecular descriptors generated by PowerMV will be used as features. We feed these features into three different classifiers, namely a random forest, a deep neural network, and a recurrent neural network, to build separate predictive models, choosing the best-performing model based on its accuracy in predicting novel antituberculosis compounds with anti-LprG activity. Additionally, the predicted active compounds will be screened using a SMARTS filter to choose molecules with drug-like features.
Keywords: antituberculosis drug, classifier, machine learning, molecular descriptors, prediction
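To make the descriptor-to-classifier pipeline concrete, here is a hedged, self-contained sketch: the descriptor matrix and activity labels are synthetic stand-ins (the real PubChem/PowerMV data is not reproduced here), and a minimal logistic-regression trainer stands in for the random forest and neural networks named above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the descriptor table: 200 compounds x 8 descriptors,
# with "active" compounds separated along the first two descriptor axes.
# (Illustrative data only -- not the PubChem/PowerMV dataset.)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=200) > 0).astype(float)

# Normalize descriptors, as one would before feeding any classifier.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Minimal logistic-regression trainer (batch gradient descent).
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(active)
    w -= 0.5 * (X.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 0.5 * (p - y).mean()                # gradient step on bias

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = (pred == y).mean()
```

In the actual study, each classifier would be trained on the PowerMV descriptors and compared on held-out accuracy before the SMARTS drug-likeness filter is applied.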
Procedia PDF Downloads 389
6027 Different Goals and Strategies of Smart Cities: Comparative Study between European and Asian Countries
Authors: Yountaik Leem, Sang Ho Lee
Abstract:
In this paper, the different goals of smart cities, and the ways taken to reach them, shown in many countries during planning and implementation processes, are discussed. Each country embedded technologies into urban space, as ICTs (information and communication technologies) developed, for its own purposes and in its own way. For example, European countries tried to adapt technologies to reduce greenhouse gas emissions and counter global warming, while US-based global companies focused on ways of living with ICTs, such as Microsoft™'s EasyLiving and Hewlett-Packard™'s CoolTown, during the last decade of the 20th century. In the North-East Asian countries, urban spaces with ICTs were developed on a large scale from the viewpoint of capitalism. The ubiquitous city, first introduced in Korea and named after Mark Weiser's concept of ubiquitous computing, pursued new urban development with advanced technologies and high-tech infrastructure, including wired and wireless networks. Japan has developed smart cities as comprehensive, technology-intensive cities that will lead the nation's other industries in the future. Not only the goals and strategies but also the new directions toward which smart cities are oriented are suggested at the end of the paper. Like the Finnish smart community whose slogan is 'one more hour a day for citizens,' the recent trend is toward the everyday lives and cultures of human beings, not capital gains or physical urban spaces.
Keywords: smart cities, urban strategy, future direction, comparative study
Procedia PDF Downloads 261
6026 The Impact of Recurring Events in Fake News Detection
Authors: Ali Raza, Shafiq Ur Rehman Khan, Raja Sher Afgun Usmani, Asif Raza, Basit Umair
Abstract:
Detection of fake news and missing information is gaining popularity, especially after the advancement of social media and online news platforms. Social media platforms are the main and speediest source of fake news propagation, whereas online news websites contribute to fake news dissemination. In this study, we propose a framework to detect fake news using the temporal features of text, and we consider user feedback to identify whether the news is fake or not. Recent studies give valuable consideration to the temporal features in text documents from a Natural Language Processing perspective and to user feedback, but only try to classify the textual data as fake or true. This research article examines the impact of recurring and non-recurring events on fake and true news. We use two models, BERT and Bi-LSTM, to investigate; it is concluded that BERT gives better results, and that 70% of true news items are recurring while the remaining 30% are non-recurring.
Keywords: natural language processing, fake news detection, machine learning, Bi-LSTM
Procedia PDF Downloads 21
6025 Investigation of VN/TiN Multilayer Coatings on AZ91D Mg Alloys
Authors: M. Ertas, A. C. Onel, G. Ekinci, B. Toydemir, S. Durdu, M. Usta, L. Colakerol Arslan
Abstract:
To develop AZ91D magnesium alloys with improved properties, we have applied TiN and VN/TiN multilayer coatings using the DC magnetron sputtering technique. The coating structure, surface morphology, chemical bonding, and corrosion resistance of the coatings were analyzed by X-ray diffraction (XRD), scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS), and the Tafel extrapolation method, respectively. The XPS analysis reveals that the VN overlayer reacts with oxygen at the VN/TiN interface and forms a more stable TiN layer. The morphological investigations and the corrosion results show that VN/TiN multilayer thin film coatings are quite effective in optimizing the corrosion resistance of Mg alloys.
Keywords: AZ91D Mg alloys, high corrosion resistance, transition metal nitride coatings, magnetron sputter
Procedia PDF Downloads 473
6024 Comparative Effects of Convective Drying on the Qualities of Some Leafy Vegetables
Authors: Iyiola Olusola Oluwaleye, Samson A. Adeleye, Omojola Awogbemi
Abstract:
This paper reports an investigation of the comparative effects of drying on the quality of some leafy vegetables at three different temperatures, namely 50°C, 60°C, and 70°C. The vegetables investigated are spinach (Amaranthus cruentus), water leaf (Talinum triangulare), lettuce (Lactuca sativa), and fluted pumpkin (Telfairia occidentalis). These vegetables are available in abundance during the rainy season and are commonly consumed by average Nigerians. A convective dryer was used for the drying process at the stipulated temperatures, which were maintained with the aid of a thermostat. The vegetable samples, after washing, were cut into smaller pieces of 0.4 cm-0.5 cm and loaded into the drying cage of the convective dryer. The daily drying duration was six hours, from 9:00 am to 3:00 pm. The dried samples were thereafter subjected to microbial and proximate analyses. The results of the tests show that the microbial load decreases as the drying temperature increases. As temperature increases, the moisture content and carbohydrate content of all the samples decrease, while the crude fiber, ash, and protein increase. The percentage fat content decreases as the drying temperature increases, with the exception of fluted pumpkin. The shelf life of the vegetable samples increases with drying temperature; spinach has the lowest shelf life, followed by fluted pumpkin, then lettuce, while water leaf has the highest shelf life at the three drying temperatures of 50°C, 60°C, and 70°C.
Keywords: convective drying, leafy vegetables, quality, shelf life
Procedia PDF Downloads 261
6023 A Topological Approach for Motion Track Discrimination
Authors: Tegan H. Emerson, Colin C. Olson, George Stantchev, Jason A. Edelberg, Michael Wilson
Abstract:
Detecting small targets at range is difficult because there is not enough spatial information present in an image sub-region containing the target to use correlation-based methods to differentiate it from dynamic confusers present in the scene. Moreover, this lack of spatial information also disqualifies the use of most state-of-the-art deep learning image-based classifiers. Here, we use characteristics of target tracks extracted from video sequences as data from which to derive distinguishing topological features that help robustly differentiate targets of interest from confusers. In particular, we calculate persistent homology from time-delayed embeddings of dynamic statistics calculated from motion tracks extracted from a wide field-of-view video stream. In short, we use topological methods to extract features related to target motion dynamics that are useful for classification and disambiguation and show that small targets can be detected at range with high probability.
Keywords: motion tracks, persistence images, time-delay embedding, topological data analysis
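A minimal sketch of the time-delayed embedding step described above (the persistent homology computation itself would use a TDA library such as ripser and is not shown; the periodic toy signal and the specific dimension/delay values are assumptions for illustration):

```python
import numpy as np

def time_delay_embedding(x, dim, delay):
    """Map a 1-D series to points (x[t], x[t+delay], ..., x[t+(dim-1)*delay])."""
    n = len(x) - (dim - 1) * delay
    if n <= 0:
        raise ValueError("series too short for this (dim, delay)")
    return np.stack([x[i * delay:i * delay + n] for i in range(dim)], axis=1)

# A periodic track statistic embeds to a loop in delay space; persistent
# homology would report that loop as a long-lived H1 feature, separating
# it from noise-like confuser dynamics whose embeddings form no loop.
t = np.linspace(0, 4 * np.pi, 200)                         # two periods of a toy statistic
cloud = time_delay_embedding(np.sin(t), dim=2, delay=25)   # ~quarter-period delay
```

With the quarter-period delay, the sine embeds to (sin t, cos t), i.e. a near-perfect circle, which is exactly the structure persistence diagrams pick up.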
Procedia PDF Downloads 112
6022 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network
Authors: Jia Xin Low, Keng Wah Choo
Abstract:
This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. The MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted into a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to provide classification features to the supervised machine learning algorithm; instead, the features are determined automatically through training, from the time series provided. The results prove that the prediction model is able to provide reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g. in remote monitoring applications of the PCG signal.
Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification
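One plausible reading of the beat-to-square-matrix step, sketched below with NumPy (the segment length, image side, and min-max scaling are assumptions for illustration, not the authors' exact recipe):

```python
import numpy as np

def beat_to_square(segment, side=32):
    """Resample one PCG beat segment to side*side samples and reshape it into a
    square intensity matrix suitable as a single-channel CNN input."""
    n = side * side
    xs = np.linspace(0, len(segment) - 1, n)
    resampled = np.interp(xs, np.arange(len(segment)), segment)       # fixed-length grid
    lo, hi = resampled.min(), resampled.max()
    scaled = (resampled - lo) / (hi - lo) if hi > lo else np.zeros(n)  # min-max to [0, 1]
    return scaled.reshape(side, side)

beat = np.sin(np.linspace(0, 6 * np.pi, 900))   # toy stand-in for one segmented beat
img = beat_to_square(beat)
```

Each beat, regardless of its duration in samples, thus becomes a fixed-size image, so a standard 2-D CNN can be trained directly on the stacked matrices.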
Procedia PDF Downloads 346
6021 Comparative Antihyperglycemic Activity of Serpentina (Andrographis paniculata) and Papait (Mollugo oppositifolia linn) Aqueous Extracts in Alloxan-Induced Diabetic Mice
Authors: Karina Marie G. Nicolas, Kimberly M. Visaya, Emmanuel R. Cauinian
Abstract:
A comparative study of the antihyperglycemic activity of aqueous extracts of serpentina (Andrographis paniculata) and papait (Mollugo oppositifolia Linn.), administered at 400 mg/kg body weight per orem twice daily for 14 days, was conducted using 24 alloxan-induced diabetic male ICR mice, 6-8 weeks old, with metformin as the standard control. The blood glucose levels of the animals in the treatment groups were not reduced to < 200 mg/dl, the threshold for considering them non-diabetic, but papait showed a consistent blood-glucose-lowering effect from day 0 to day 14, causing a 36.07% reduction, compared with serpentina, which was observed to cause a fluctuating effect on blood glucose levels and a reduction of only 22.53%, while the metformin-treated animals exhibited the highest reduction at 45.29%. At day 14, the animals treated with papait (322.93 mg/dl) had blood glucose levels comparable (p<0.05) with the metformin-treated group (284.50 mg/dl). Also, all the animals in the three treatment groups were still hypercholesterolemic, with a consistent weight loss and a decrease in feed intake observed, except for serpentina, which recorded a slight increase. The results of the study show a superior antihyperglycemic activity of papait compared with serpentina.
Keywords: antihyperglycemic, diabetes, hypercholesterolemic, papait, serpentina
Procedia PDF Downloads 357
6020 Preparation of Papers - Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments
Authors: Skyler Kim
Abstract:
An early diagnosis of leukemia has always been a challenge to doctors and hematologists. On a worldwide basis, it was reported that there were approximately 350,000 new cases in 2012, and diagnosing leukemia was time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnosis tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods. One of these methods was the AI approach. This approach has become a major trend in recent years, and several research groups have been working on developing such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We decided to select acute lymphocytic leukemia to develop our diagnostic system, since acute lymphocytic leukemia is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia. The results from this development work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 total images; 8491 of these are images of abnormal cells, and 5398 images are normal. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger datasets. The proposed diagnostic system has the function of detecting and classifying leukemia. Unlike other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50. Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image. In our approach, the fused features from specific abstraction layers can be deemed auxiliary features and lead to further improvement of the classification accuracy. In this approach, features extracted from the lower levels are combined into higher-dimension feature maps to help improve the discriminative capability of the intermediate features and also overcome the problem of network gradients vanishing or exploding. By comparing VGG19 and ResNet50 with the proposed hybrid model, we concluded that the hybrid model had a significant advantage in accuracy. The detailed results of each model’s performance and their pros and cons will be presented at the conference.
Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning
Procedia PDF Downloads 186
6019 Evaluation of Gesture-Based Password: User Behavioral Features Using Machine Learning Algorithms
Authors: Lakshmidevi Sreeramareddy, Komalpreet Kaur, Nane Pothier
Abstract:
Graphical passwords have existed for decades. Their major advantage is that they are easier to remember than an alphanumeric password. However, their disadvantage (especially for recognition-based passwords) is the smaller password space, making them more vulnerable to brute-force attacks. Graphical passwords are also highly susceptible to shoulder surfing. The gesture-based password method that we developed is a grid-free, template-free method. In this study, we evaluated gesture-based passwords for usability and vulnerability, and the results of the study are significant. We developed a gesture-based password application for data collection. Two modes of data collection were used: creation mode and replication mode. In creation mode (Session 1), users were asked to create six different passwords and re-enter each password five times. In replication mode, users saw a password image created by some other user for a fixed duration of time. Three different durations, 5 seconds (Session 2), 10 seconds (Session 3), and 15 seconds (Session 4), were used to mimic a shoulder-surfing attack. After the timer expired, the password image was removed, and users were asked to replicate the password. There were 74, 57, 50, and 44 users participating in Sessions 1, 2, 3, and 4, respectively. In this study, machine learning algorithms have been applied to determine whether the person is a genuine user or an imposter based on the password entered. Five different machine learning algorithms were deployed to compare performance in user authentication: namely, decision trees, linear discriminant analysis, the naive Bayes classifier, support vector machines (SVMs) with a Gaussian radial basis kernel function, and k-nearest neighbors. Gesture-based password features vary from one entry to the next, which makes it difficult to distinguish between a creator and an intruder for authentication.
For each password entered by a user, four features were extracted: password score, password length, password speed, and password size. All four features were normalized before being fed to a classifier. Three classifiers were trained using data from all four sessions: Classifiers A, B, and C were trained and tested on data from the password creation session together with the password replication sessions with timers of 5, 10, and 15 seconds, respectively. The classification accuracies for Classifier A using the five ML algorithms were 72.5%, 71.3%, 71.9%, 74.4%, and 72.9%, respectively; for Classifier B, 69.7%, 67.9%, 70.2%, 73.8%, and 71.2%; and for Classifier C, 68.1%, 64.9%, 68.4%, 71.5%, and 69.8%. SVMs with the Gaussian radial basis kernel outperformed the other ML algorithms for gesture-based password authentication. The results also confirm that the shorter the duration of the shoulder-surfing attack, the higher the authentication accuracy. In conclusion, behavioral features extracted from gesture-based passwords lead to less vulnerable user authentication.
Keywords: authentication, gesture-based passwords, machine learning algorithms, shoulder-surfing attacks, usability
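The authentication step described above can be sketched in a few lines. This is an illustrative stand-in, not the study's implementation: the feature values are invented, and a dual-form kernel perceptron is used in place of a full SVM solver, since both rely on the same Gaussian radial basis kernel over the four min-max-normalized features.

```python
import math

def rbf(x, z, gamma=1.0):
    """Gaussian radial basis kernel: exp(-gamma * ||x - z||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def minmax_normalize(rows):
    """Column-wise min-max normalization, as applied to the four features."""
    lo = [min(col) for col in zip(*rows)]
    hi = [max(col) for col in zip(*rows)]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(r, lo, hi)] for r in rows]

def train_kernel_perceptron(X, y, gamma=1.0, epochs=20):
    """Dual-form perceptron with an RBF kernel; alphas count the mistakes."""
    alpha = [0] * len(X)
    for _ in range(epochs):
        for i, (xi, yi) in enumerate(zip(X, y)):
            s = sum(a * yj * rbf(xj, xi, gamma)
                    for a, xj, yj in zip(alpha, X, y))
            if yi * s <= 0:          # misclassified -> update
                alpha[i] += 1
    return alpha

def predict(x, X, y, alpha, gamma=1.0):
    s = sum(a * yj * rbf(xj, x, gamma) for a, xj, yj in zip(alpha, X, y))
    return 1 if s >= 0 else -1

# Invented rows: [password_score, password_length, password_speed, password_size]
raw = [[0.82, 14, 1.9, 320], [0.79, 15, 2.1, 310],   # genuine entries
       [0.41, 11, 3.8, 190], [0.38, 10, 4.0, 205]]   # imposter entries
labels = [1, 1, -1, -1]                              # +1 genuine, -1 imposter
X = minmax_normalize(raw)
alpha = train_kernel_perceptron(X, labels)
print(predict(X[0], X, labels, alpha))  # -> 1 (classified as genuine)
```

In practice a proper SVM (e.g., via an off-the-shelf library) would replace the perceptron loop; the normalization and kernel choice carry over unchanged.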
Procedia PDF Downloads 102
6018 Molecular Characterization of Two Thermoplastic Biopolymer-Degrading Fungi Utilizing rRNA-Based Technology
Authors: Nuha Mansour Alhazmi, Magda Mohamed Aly, Fardus M. Bokhari, Ahmed Bahieldin, Sherif Edris
Abstract:
Out of 30 fungal isolates, two new isolates were shown to degrade poly-β-hydroxybutyrate (PHB). Enzyme assays for these isolates indicated the optimal environmental conditions required for the depolymerase enzyme to induce the highest level of biopolymer degradation. The two isolates were initially characterized at the morphological level as Trichoderma asperellum (isolate S1) and Aspergillus fumigatus (isolate S2) using standard approaches. The aim of the present study was to characterize these two isolates at the molecular level based on the highly diverged rRNA gene(s). Within this gene, two domains were utilized in the analysis: the internal transcribed spacer (ITS) and the 26S region of the ribosomal large subunit (LSU). The first domain comprises the ITS1/5.8S/ITS2 regions (> 500 bp), while the second comprises the D1/D2/D3 regions (> 1200 bp). Sanger sequencing was conducted at Macrogen Inc. for the two isolates using primers ITS1/ITS4 for the first domain and primers LROR/LR7 for the second. Sizes of the first domain ranged between 594-602 bp for isolate S1 and 581-594 bp for isolate S2, while those of the second domain ranged between 1228-1238 bp for isolate S1 and 1156-1291 bp for isolate S2. BLAST analysis indicated 99% identity of the first domain of isolate S1 with T. asperellum isolates XP22 (ID: KX664456.1), CTCCSJ-G-HB40564 (ID: KY750349.1), CTCCSJ-F-ZY40590 (ID: KY750362.1) and TV (ID: KU341015.1). BLAST of the first domain of isolate S2 indicated 100% identity with A. fumigatus isolate YNCA0338 (ID: KP068684.1) and strain MEF-Cr-6 (ID: KU597198.1), and 99% identity with A. fumigatus isolate CCA101 (ID: KT877346.1) and strain CD1621 (ID: JX092088.1). Large numbers of other T. asperellum and A. fumigatus isolates and strains showed high levels of identity with isolates S1 and S2, respectively, based on the diversity of the first domain. BLAST of the second domain of isolate S1 indicated 99% and 100% identity with only two strains of T. asperellum, namely TR 3 (ID: HM466685.1) and G (ID: KF723005.1), respectively; however, other Trichoderma species (e.g., T. atroviride, T. hamatum, T. deliquescens, T. harzianum) also showed high levels of identity. BLAST of the second domain of isolate S2 indicated 100% identity with A. fumigatus isolate YNCA0338 (ID: KP068684.1) and strain MEF-Cr-6 (ID: KU597198.1), and 99% identity with A. fumigatus isolate CCA101 (ID: KT877346.1) and strain CD1621 (ID: JX092088.1). Large numbers of other A. fumigatus isolates and strains showed high levels of identity with isolate S2. Overall, the results of molecular characterization based on rRNA diversity for the two isolates matched those obtained by morphological characterization. In addition, the ITS domain proved more sensitive than the 26S domain for diversity profiling of fungi at the species level.
Keywords: Aspergillus fumigatus, Trichoderma asperellum, PHB, degradation, BLAST, ITS, 26S, rRNA
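The percent-identity figures quoted above can be illustrated with a toy calculation. This is a hedged sketch, not the BLAST algorithm: it assumes two already-aligned, equal-length sequences (invented here), whereas BLAST performs local alignment with gaps and scoring matrices.

```python
def percent_identity(a, b):
    """Share of matching positions between two pre-aligned sequences of
    equal length; gap characters ('-') never count as matches."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    matches = sum(1 for x, y in zip(a, b) if x == y and x != "-")
    return 100.0 * matches / len(a)

# Invented 20-bp fragments; the hit differs at the final position only.
its_query = "ACGTACGTACGTACGTACGT"
its_hit   = "ACGTACGTACGTACGTACGA"
print(percent_identity(its_query, its_hit))  # -> 95.0
```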
Procedia PDF Downloads 158
6017 Improving Fake News Detection Using K-means and Support Vector Machine Approaches
Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy
Abstract:
Fake news and false information are major challenges for all types of media, especially social media. As large social networks such as Facebook and Twitter have admitted, they carry a great deal of false information, fake likes and views, and duplicated accounts. Much of the information appearing on social media is doubtful and in some cases misleading; it needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity while maintaining good results, the dimensionality needs to be reduced. One of the most effective techniques for reducing data size is feature selection, which aims to choose a feature subset from the original set that improves classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, the features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information. The detection performance improved in two aspects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensions.
Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine
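The four steps of the proposed method might be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' code: absolute Pearson correlation stands in for the unspecified similarity measure, the toy data are invented, and the final SVM classification step is omitted, since any off-the-shelf classifier can consume the reduced feature matrix.

```python
import random

def pearson(a, b):
    """Pearson correlation between two equal-length feature columns."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb) if sa and sb else 0.0

def dist2(p, q):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(p, q))

def select_features(X, k, iters=25, seed=0):
    """Steps 1-3: similarity matrix, k-means clustering of features,
    and one representative feature per cluster."""
    n_feat = len(X[0])
    cols = [[row[j] for row in X] for j in range(n_feat)]
    # Step 1: pairwise similarities between features (absolute correlation)
    sim = [[abs(pearson(cols[i], cols[j])) for j in range(n_feat)]
           for i in range(n_feat)]
    # Step 2: k-means over the features' similarity vectors
    rng = random.Random(seed)
    centers = [sim[i][:] for i in rng.sample(range(n_feat), k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for i in range(n_feat):
            nearest = min(range(k), key=lambda c: dist2(sim[i], centers[c]))
            clusters[nearest].append(i)
        centers = [[sum(v) / len(cl) for v in zip(*[sim[i] for i in cl])]
                   if cl else centers[c] for c, cl in enumerate(clusters)]
    # Step 3: keep the feature closest to each cluster center
    return sorted(min(cl, key=lambda i: dist2(sim[i], centers[c]))
                  for c, cl in enumerate(clusters) if cl)

# Toy data: features 0 and 1 are near-duplicates, feature 2 is independent.
X = [[1, 1.1, 5], [2, 2.1, 1], [3, 2.9, 4], [4, 4.2, 2], [5, 5.0, 6], [6, 6.1, 3]]
selected = select_features(X, k=2)
print(selected)  # feature 2 is kept; one of the redundant pair 0/1 is dropped
```

Step 4 would then train an SVM on the columns of `X` indexed by `selected`.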
Procedia PDF Downloads 176
6016 Wind Velocity Mitigation for Conceptual Design: A Spatial Decision (Support Framework)
Authors: Mohamed Khallaf, Hossein M Rizeei
Abstract:
Simulating wind pattern behavior over proposed urban features is critical in the early, conceptual stage of both architectural and urban design. However, designers typically cannot explore the impact of wind flow profiles across new urban developments due to a lack of real data and inaccurate estimation of building parameters. Modeling the details of existing and proposed urban features and testing them against wind flows is the missing part of the conceptual design puzzle on which the architectural and urban disciplines can focus. This research aims to develop a spatial decision-support design method utilizing LiDAR, GIS, and performance-based wind simulation technology to mitigate wind-related hazards in a design by simulating alternative design scenarios at the pedestrian level prior to implementation in Sydney, Australia. The results of the experiment demonstrate the capability of the proposed framework to improve pedestrian comfort with respect to the wind profile.
Keywords: spatial decision-support design, performance-based wind simulation, LiDAR, GIS
Procedia PDF Downloads 121
6015 Combination between Intrusion Systems and Honeypots
Authors: Majed Sanan, Mohammad Rammal, Wassim Rammal
Abstract:
Today, security is a major concern. Intrusion detection and prevention systems and honeypots can be used to moderate attacks. Many researchers have proposed combining IDSs (Intrusion Detection Systems) over the years; systems that combine the features of two or more IDSs are called hybrid intrusion detection systems. Most researchers combine the features of signature-based and anomaly-based detection methodologies. For a signature-based IDS, if an attacker attacks slowly and in an organized way, the attack may go undetected, as signatures include factors based on the duration of events and the attacker's actions do not match them. Sometimes there is no signature yet for an unknown attack, or an attacker strikes while the database is being updated; thus, signature-based IDSs fail to detect unknown attacks. Anomaly-based IDSs, in turn, suffer from many false-positive readings. There is therefore a need to hybridize IDSs so that they can overcome each other's shortcomings. In this paper, we propose a new approach to intrusion detection that is more efficient than a traditional IDS. The proposed IDS is based on honeypot technology and anomaly-based detection methodology. We designed the architecture for the IDS in Packet Tracer and then implemented it in real time. We discuss the experimental results: both the honeypot and the anomaly-based IDS have some shortcomings, but if we hybridize these two technologies, the resulting Hybrid Intrusion Detection System (HIDS) is capable of overcoming these shortcomings with much-enhanced performance. In this paper, we present a modified Hybrid Intrusion Detection System (HIDS) that combines the positive features of two different detection methodologies: honeypot methodology and anomaly-based intrusion detection.
In the experiment, we first ran each intrusion detection system individually and then ran them together, recording data over time. From the data, we can conclude that the resulting IDS detects intrusions much better than the existing IDSs.
Keywords: security, intrusion detection, intrusion prevention, honeypot, anomaly-based detection, signature-based detection, cloud computing, kfsensor
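A minimal sketch of how the two detection signals might be hybridized, assuming a z-score anomaly detector over a learned traffic baseline plus a honeypot-contact rule; the addresses, rates, and thresholds are invented, and this is not the authors' Packet Tracer implementation.

```python
import statistics

HONEYPOT_ADDRS = {"10.0.0.99"}   # decoy host that receives no legitimate traffic

def train_baseline(rates):
    """Learn normal behavior (mean, stdev) from benign requests-per-minute samples."""
    return statistics.mean(rates), statistics.stdev(rates)

def is_intrusion(dst, rate, baseline, z_threshold=3.0):
    """Flag traffic that either touches a honeypot or deviates anomalously
    (z-score above threshold) from the learned baseline."""
    mean, std = baseline
    if dst in HONEYPOT_ADDRS:    # honeypot signal: any contact is suspicious
        return True
    return abs(rate - mean) / std > z_threshold   # anomaly signal

baseline = train_baseline([20, 22, 19, 21, 23, 20, 18, 22])
print(is_intrusion("10.0.0.5", 21, baseline))    # False: normal traffic
print(is_intrusion("10.0.0.99", 21, baseline))   # True: touched the honeypot
print(is_intrusion("10.0.0.5", 400, baseline))   # True: anomalous burst
```

The honeypot rule catches the slow, organized attacks that evade statistics, while the z-score catches volume anomalies the honeypot never sees.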
Procedia PDF Downloads 377
6014 Feature-Based Summarizing and Ranking from Customer Reviews
Authors: Dim En Nyaung, Thin Lai Lai Thein
Abstract:
With the rapid growth of the Internet, web opinion sources are dynamically emerging that are useful to both potential customers and product manufacturers for prediction and decision-making purposes. These are user-generated contents written in natural language in an unstructured, free-text form. Opinion mining techniques have therefore become popular for automatically processing customer reviews to extract product features and the user opinions expressed about them. Since customer reviews may contain both opinionated and factual sentences, a supervised machine learning technique is applied for subjectivity classification to improve the mining performance. In this paper, we dedicate our work to the task of opinion summarization. Product feature and opinion extraction is critical to opinion summarization, because its effectiveness significantly affects the identification of semantic relationships. The polarity and numeric score of all the features are determined using the SentiWordNet lexicon. The problem of opinion summarization concerns how to relate the opinion words to a given feature. A probabilistic supervised learning model improves the results and is more flexible and effective.
Keywords: opinion mining, opinion summarization, sentiment analysis, text mining
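The feature-opinion relation step might be sketched as follows, with a tiny invented lexicon standing in for SentiWordNet (which scores synsets, not raw tokens) and a simple token-window heuristic standing in for the paper's probabilistic model.

```python
# Toy polarity lexicon in [-1, 1]; a stand-in for SentiWordNet scores.
LEXICON = {"great": 0.8, "sharp": 0.6, "poor": -0.7, "short": -0.4, "amazing": 0.9}

def feature_polarity(reviews, feature, window=3):
    """Average polarity of opinion words found within `window` tokens of the
    feature word -- a crude way to relate opinion words to a feature."""
    scores = []
    for review in reviews:
        tokens = review.lower().split()
        for i, tok in enumerate(tokens):
            if tok == feature:
                lo, hi = max(0, i - window), i + window + 1
                scores += [LEXICON[t] for t in tokens[lo:hi] if t in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

reviews = ["The battery life is great but the screen is poor",
           "Amazing battery and a sharp screen"]
print(feature_polarity(reviews, "battery"))  # clearly positive
print(feature_polarity(reviews, "screen"))   # lower: mixed opinions
```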
Procedia PDF Downloads 330
6013 Chatbots vs. Websites: A Comparative Analysis Measuring User Experience and Emotions in Mobile Commerce
Authors: Stephan Boehm, Julia Engel, Judith Eisser
Abstract:
During the last decade, communication on the Internet transformed from a broadcast to a conversational model by supporting more interactive features, enabling user-generated content, and introducing social media networks. Another important trend with a significant impact on electronic commerce is a massive usage shift from desktop to mobile devices. However, a presentation of product- or service-related information accumulated on websites, micro pages, or portals often remains the pivot and focal point of a customer journey. A more recent change in user behavior, especially in younger user groups and in Asia, accompanies the increasing adoption of messaging applications supporting almost real-time but asynchronous communication on mobile devices. Mobile apps of this type cannot only provide an alternative to traditional one-to-one communication on mobile devices, such as voice calls or the short messaging service. Moreover, they can be used in mobile commerce as a new marketing and sales channel, e.g., for product promotions and direct marketing activities. This requires a new way of customer interaction compared to the traditional mobile commerce activities and functionalities provided on mobile websites. One option better aligned with customer interaction in messaging apps is the so-called chatbot. Chatbots are conversational programs or dialog systems simulating text- or voice-based human interaction. They can be introduced in mobile messaging and social media apps using rule-based or artificial intelligence-based implementations. In this context, a comparative analysis is conducted to examine the impact of using traditional websites or chatbots for promoting a product in an impulse purchase situation. The aim of this study is to measure the impact on the customers' user experience and emotions. The study is based on a random sample of about 60 smartphone users in the group of 20- to 30-year-olds.
Participants are randomly assigned to two groups and take part in either a traditional website or an innovative chatbot-based mobile commerce scenario. The chatbot-based scenario is implemented using a Wizard-of-Oz experimental approach for reasons of simplicity and to allow for more flexibility when simulating simple rule-based and more advanced artificial intelligence-based chatbot setups. A specific set of metrics is defined to measure and compare the user experience in both scenarios. It can be assumed that users get more emotionally involved when interacting with a system simulating human communication behavior instead of browsing a mobile commerce website. For this reason, innovative face-tracking and analysis technology is used to derive feedback on the emotional status of the study participants while they interact with the website or the chatbot. This study is a work in progress. The results will provide first insights into the effects of chatbot usage on user experiences and emotions in mobile commerce environments. Based on the study findings, basic requirements for a user-centered design and implementation of chatbot solutions for mobile commerce can be derived. Moreover, first indications of situations where chatbots might be favorable compared to traditional website-based mobile commerce can be identified.
Keywords: chatbots, emotions, mobile commerce, user experience, Wizard-of-Oz prototyping
Procedia PDF Downloads 458
6012 Evaluation of Re-mineralization Ability of Nanohydroxyapatite and Coral Calcium with Different Concentrations on Initial Enamel Carious Lesions
Authors: Ali Abdelnabi, Nermeen Hamza
Abstract:
Coral calcium is a promising natural product and dietary supplement that serves as a source of alkaline calcium carbonate. This comparative study compares the remineralization effect of the new coral calcium product with that of nano-hydroxyapatite. Methodology: A total of 35 extracted molars were collected, examined, and sectioned to obtain 70 sound enamel discs. All discs were numbered and examined by scanning electron microscopy coupled with energy-dispersive X-ray analysis (EDAX) for mineral content, subjected to artificial caries, and the mineral content was re-measured. The discs were divided into seven groups according to the remineralizing agent used: groups 1 to 3 used 10%, 20%, and 30% nanohydroxyapatite gel, respectively; groups 4 to 6 used 10%, 20%, and 30% coral calcium gel; and group 7 received no remineralizing agent (control group). All groups were re-examined by EDAX after remineralization, and the data were calculated and tabulated. Results: All groups showed a statistically significant drop in calcium level after artificial caries. All groups except the control showed a statistically significant rise in calcium content after remineralization, with groups 1 and 5 showing the highest increase in calcium level. Conclusion: Coral calcium can be considered a product comparable to nano-hydroxyapatite for the remineralization of initial enamel carious lesions.
Keywords: artificial caries, coral calcium, nanohydroxyapatite, re-mineralization
Procedia PDF Downloads 121
6011 Behave Imbalances Comparative Checking of Children with and without Fathers between the Ages of 7 to 11 in Rasht
Authors: Farnoush Haghanipour
Abstract:
Objective: Father loss, as one of the major stress factors, can cause mental imbalances in children, and the family situation of children lacking a father clearly differs from that of children who have one. The goal of this research is to compare mental imbalances, overall and in five subsidiary categories (aggression, stress and depression, social incompatibility, antisocial behavior, and attention deficit imbalances (hyperactivity)), between children without fathers and children with fathers. Method: This descriptive-analytical study examined mental imbalances in 50 children studying in one district of Rasht's education and nurture office. The instrument was the RATER behavior questionnaire (teacher form), and data analyses were performed with SPSS software. Results: The results showed a clear difference in behavioral imbalances between children with and without fathers, with more behavioral imbalance among children without fathers. There were also clear differences in aggression, stress and depression, and social incompatibility, with higher proportions among children without fathers; however, there were no clear differences in antisocial behaviors and attention deficit imbalances. Conclusion: Given the higher rate of imbalanced behavior detected in children without fathers compared with children with fathers, it is essential that society's health and treatment practitioners work toward primary and secondary prevention for the mental health of this group.
Keywords: child, behavioral imbalances, children without fathers, mental imbalances
Procedia PDF Downloads 255
6010 A Comparative Analysis of Vocabulary Learning Strategies among EFL Freshmen and Senior Medical Sciences Students across Different Fields of Study
Authors: M. Hadavi, Z. Hashemi
Abstract:
Learning strategies play an important role in the development of language skills, and vocabulary learning strategies, as the backbone of these strategies, have become a major part of English language teaching. This study is a comparative analysis of vocabulary learning strategy (VLS) use and preference among freshman and senior EFL medical sciences students with different fields of study. A total of 449 students (236 freshmen and 213 seniors) participated; 64.6% were female and 35.4% were male. The instrument utilized in this research was a questionnaire consisting of 41 items related to the students' approach to vocabulary learning, classified under eight sections: dictionary strategies, guessing strategies, study preferences, memory strategies, autonomy, note-taking strategies, selective attention, and social strategies. The participants answered each item on a 5-point Likert-style frequency scale: 1) I never or almost never do this, 2) I don't usually do this, 3) I sometimes do this, 4) I usually do this, and 5) I always or almost always do this. The results indicated that freshman students, and particularly surgical technology students, used more strategies than the seniors. Overall, guessing and dictionary strategies were the most frequently used strategies among all learners (p=0.000). The mean and standard deviation of VLS use among students with no previous history of attending private English language classes were lower than among students who had attended such classes (p=0.000). Female students tended to use social and study-preference strategies, whereas male students mostly used guessing and dictionary strategies.
It can be concluded that the senior students, through university instruction, have learned to rely on themselves and choose autonomous strategies more, while freshman students use more strategies related to study preferences.
Keywords: vocabulary learning strategies, medical sciences, students, linguistics
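The 5-point frequency scoring described above can be sketched as a simple mean over a section's items; the responses below are invented for illustration and are not the study's data.

```python
def section_score(responses):
    """Mean of 5-point Likert responses (1 = never ... 5 = always) for one
    learner across one strategy section's items."""
    return sum(responses) / len(responses)

# Two invented freshmen and two invented seniors, three guessing-strategy items each.
freshman_guessing = [[4, 5, 4], [5, 4, 4]]
senior_guessing   = [[3, 3, 4], [2, 3, 3]]

fresh_mean = sum(section_score(r) for r in freshman_guessing) / len(freshman_guessing)
senior_mean = sum(section_score(r) for r in senior_guessing) / len(senior_guessing)
print(round(fresh_mean, 2), round(senior_mean, 2))  # freshmen report heavier use
```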
Procedia PDF Downloads 450
6009 Analyzing the Commentator Network Within the French YouTube Environment
Authors: Kurt Maxwell Kusterer, Sylvain Mignot, Annick Vignes
Abstract:
To the best of our knowledge, YouTube is the largest video hosting platform in the world. A high number of creators, viewers, subscribers, and commentators act in this specific ecosystem, which generates huge sums of money. Views, subscribers, and comments help increase the popularity of content creators. The most popular creators are sponsored by brands and participate in marketing campaigns; for a few of them, this becomes a financially rewarding profession. This is made possible through the YouTube Partner Program, which shares revenue among creators based on their popularity. We believe the role of comments in increasing popularity deserves emphasis. In what follows, YouTube is considered as a bipartite network between videos and commentators. Analyzing a detailed data set focused on French YouTubers, we consider each comment as a link between a commentator and a video. Our research question asks which features of a video give it the highest probability of being commented on. Following on from this question, how can we use these features to predict an agent's choice to comment on one video instead of another, considering the characteristics of the commentators, videos, topics, channels, and recommendations? We expect to see that videos from more popular channels generate higher viewer engagement and are thus more frequently commented on. The interest lies in discovering features which have not classically been considered as markers of popularity on the platform. A quick view of our data set shows that 96% of the commentators comment only once on a given video. Thus, we study a non-weighted bipartite network between commentators and videos built on the sub-sample of the 96% of unique comments; a link exists between two nodes when a commentator makes a comment on a video. We run an Exponential Random Graph Model (ERGM) approach to evaluate which characteristics influence the probability of commenting on a video.
The creation of a link will be explained in terms of common video features, such as duration, quality, number of likes, number of views, etc. Our data cover the period 2020-2021 and focus on the French YouTube environment. From an initial set of 391,588 videos, we extract the channels that can be monetized according to YouTube regulations (channels with at least 1,000 subscribers and more than 4,000 hours of viewing time during the last twelve months). In the end, we have a data set of 128,462 videos across 4,093 channels. Based on these videos, we have a data set of 1,032,771 unique commentators, with a mean of 2 comments per commentator, a minimum of 1 comment, and a maximum of 584 comments.
Keywords: YouTube, social networks, economics, consumer behaviour
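The non-weighted bipartite construction described above might be sketched as follows, with invented (commentator, video) pairs; fitting the actual ERGM would require a dedicated statistical package and is outside this sketch.

```python
from collections import defaultdict

# Invented (commentator, video) comment pairs.
comments = [
    ("u1", "v1"), ("u1", "v2"), ("u2", "v1"),
    ("u3", "v3"), ("u3", "v1"), ("u3", "v2"), ("u4", "v2"),
]

edges = set(comments)                 # non-weighted: one link per unique pair
by_commentator = defaultdict(set)
for user, video in edges:
    by_commentator[user].add(video)

# Degree statistics of the commentator side of the bipartite graph,
# mirroring the mean/min/max comments-per-commentator figures above.
degrees = [len(vids) for vids in by_commentator.values()]
mean_deg = sum(degrees) / len(degrees)
print(len(edges), min(degrees), max(degrees), round(mean_deg, 2))  # 7 1 3 1.75
```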
Procedia PDF Downloads 67