Search results for: Databases and Information Retrieval
2710 Process-Oriented Learning Requirements for Employees and for Organizations
Authors: Richard Pircher, Lukas Zenk, Hanna Risku
Abstract:
Using activity theory, organisational theory and didactics as theoretical foundations, a comprehensive model of the organisational dimensions relevant for learning and knowledge transfer will be developed. In a second step, a Learning Assessment Guideline will be elaborated. This guideline will be designed to permit a targeted analysis of organisations to identify the status quo in those areas crucial to the implementation of learning and knowledge transfer. In addition, this self-analysis tool will enable learning managers to select adequate didactic models for e-learning and blended learning. As part of the European Integrated Project "Process-oriented Learning and Information Exchange" (PROLIX), this model of organisational prerequisites for learning and knowledge transfer will be empirically tested in four profit and non-profit organisations in Great Britain, Germany and France (to be finalized in autumn 2006). The findings concern not only the capability of the model of organisational dimensions, but also the predominant perceptions of and obstacles to learning in organisations.
Keywords: Activity theory, knowledge management, organisational theory, "Process-oriented Learning and Information Exchange" (PROLIX).
2709 EU Families and Adolescents Quit Tobacco Focus Group Analysis in Hungary
Authors: Szilvia Gergely Seuss, Mihaela Nistor, Lilla Csáky, Péter Molnár
Abstract:
Within the framework of the European Union project EU-Families and Adolescents Quit Tobacco (www.eufaqt.eu), a focus group analysis was carried out in Hungary to acquire qualitative information on attitudes towards smoking among adolescents, parents and educators. Its purpose was to identify methods for smoking prevention and intervention with a family approach. The results clarified the role of the family in smoking behaviour: teachers do not feel responsible for the prevention or cessation of smoking, and adolescents are not aware of the addictive effect of cigarettes. Water pipe smoking is popular among adolescents, so more information on its harmful effects needs to be disseminated. We outlined the need for professionals to provide interventions. The EU-FAQT project partnership has developed antismoking interventions for adolescents and their families, conducted by psychologists, to build the skills needed to prevent and quit tobacco use.
Keywords: Smoking of adolescents, family approach, focus group analysis, water pipe.
2708 An Improved Greedy Routing Algorithm for Grid using Pheromone-Based Landmarks
Authors: Lada-On Lertsuwanakul, Herwig Unger
Abstract:
This paper aims to extend Jon Kleinberg's research. He introduced the small-world structure in a grid and showed, with a greedy algorithm using only local information, that a route between source and target can be found in delivery time O(log² n). His fundamental model for distributed systems uses a two-dimensional grid with long-range random links added between any two nodes u and v with a probability proportional to the distance d(u,v)⁻². We propose that, with additional information about nearby long-range links, a shorter path can be found. We apply an ant colony system as messengers that distribute their pheromone, carrying the details of the long-range links, in the surrounding area. Each subsequent forwarding decision then has more options: move to a local neighbor, or forward to a node whose long-range link is closer to the target. Our experimental results support this approach: the average routing time with Color Pheromone is faster than with the plain greedy method.
Keywords: Routing algorithm, Small-World network, Ant Colony Optimization, and Peer-to-peer System.
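A minimal sketch of the baseline this work extends may help readers place the contribution: Kleinberg-style greedy routing on an n×n grid in which every node owns one long-range link sampled with probability proportional to d(u,v)⁻². The grid size, the Manhattan metric and the one-link-per-node choice are illustrative assumptions, and the pheromone-based landmark extension proposed in the paper is not reproduced here.

```python
import random

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def sample_long_link(u, n):
    """Pick a long-range contact for u with P(v) proportional to d(u,v)^-2."""
    nodes = [(x, y) for x in range(n) for y in range(n) if (x, y) != u]
    weights = [manhattan(u, v) ** -2 for v in nodes]
    return random.choices(nodes, weights=weights, k=1)[0]

def greedy_route(src, dst, long_link, n, max_hops=10_000):
    """Greedy forwarding: among the lattice neighbors and the node's own
    long-range link, move to the candidate closest to the destination."""
    cur, hops = src, 0
    while cur != dst and hops < max_hops:
        x, y = cur
        candidates = [(x + dx, y + dy)
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= x + dx < n and 0 <= y + dy < n]
        candidates.append(long_link[cur])
        cur = min(candidates, key=lambda v: manhattan(v, dst))
        hops += 1
    return hops

n = 30
links = {(x, y): sample_long_link((x, y), n) for x in range(n) for y in range(n)}
print(greedy_route((0, 0), (n - 1, n - 1), links, n))
```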
2707 Concept Indexing using Ontology and Supervised Machine Learning
Authors: Rossitza M. Setchi, Qiao Tang
Abstract:
Nowadays, ontologies are the only widely accepted paradigm for the management of sharable and reusable knowledge in a way that allows its automatic interpretation. They are collaboratively created across the Web and used to index, search and annotate documents. The vast majority of the ontology based approaches, however, focus on indexing texts at document level. Recently, with the advances in ontological engineering, it became clear that information indexing can largely benefit from the use of general purpose ontologies which aid the indexing of documents at word level. This paper presents a concept indexing algorithm, which adds ontology information to words and phrases and allows full text to be searched, browsed and analyzed at different levels of abstraction. This algorithm uses a general purpose ontology, OntoRo, and an ontologically tagged corpus, OntoCorp, both developed for the purpose of this research. OntoRo and OntoCorp are used in a two-stage supervised machine learning process aimed at generating ontology tagging rules. The first experimental tests show a tagging accuracy of 78.91%, which is encouraging in terms of the further improvement of the algorithm.
Keywords: Concepts, indexing, machine learning, ontology, tagging.
2706 Assessment of the Situation and the Cause of Junk Food Consumption in Iranians: A Qualitative Study
Authors: A. Rezazadeh, B. Damari, S. Riazi-Esfahani, M. Hajian
Abstract:
The consumption of junk food in Iran is increasing at an alarming rate. This study aimed to investigate the factors influencing junk food consumption and the amendable interventions critiqued and approved by stakeholders, in order to present them to health policy makers. Articles and documents related to the study were collected using appropriate keywords such as junk food, carbonated beverage, chocolate, candy, sweets, industrial fruit juices, potato chips, French fries, puffed corn, cakes, biscuits, sandwiches, prepared foods, popsicles, ice cream, bars, chewing gum, pastilles and snacks in the scholar.google.com, pubmed.com, eric.ed.gov, cochrane.org, magiran.com, medlib.ir, irandoc.ac.ir, who.int, iranmedex.com, sid.ir, pubmed.org and sciencedirect.com databases. The main points were extracted, included in a checklist and analyzed qualitatively. A summarized abstract was then prepared in the form of a questionnaire to be presented to stakeholders. The study design was qualitative (Delphi). Following this method, a questionnaire based on the reviewed articles and documents was emailed to stakeholders, who were asked to prioritize and choose the main problems and the most effective interventions. Consensus was reached after three rounds. The studies revealed a high consumption of junk foods in the Iranian population, especially among children and adolescents. The most important contributing factors include availability, low price, media advertisements, preference for the taste of fast food, the variety and attractiveness of packaging, low awareness and lifestyle change. The main interventions recommended by stakeholders include developing a protective environment, educational interventions, increasing access to healthy food, controlling media advertisements, and pressure from the Industry and Mining Ministry on producers to produce healthy snacks. Based on these findings, the results of this study may be proposed to public health policymakers as an advocacy paper and integrated into the interventional programs of the Health and Education ministries and the media. Implementation of supportive meetings with the producers of alternative healthy products is also suggested.
Keywords: Junk foods, situation, qualitative study, Iran.
2705 Medical Image Segmentation and Detection of MR Images Based on Spatial Multiple-Kernel Fuzzy C-Means Algorithm
Authors: J. Mehena, M. C. Adhikary
Abstract:
In this paper, a spatial multiple-kernel fuzzy C-means (SMKFCM) algorithm is introduced for the segmentation problem. A linear combination of multiple kernels with spatial information is used in the kernel FCM (KFCM), and the updating rules for the linear coefficients of the composite kernels are derived as well. Fuzzy C-means (FCM) based techniques have been widely used in medical image segmentation due to their simplicity and fast convergence. The proposed SMKFCM algorithm provides a new, flexible vehicle to fuse different pixel information in the segmentation and detection of MR images. To evaluate the robustness of the proposed segmentation algorithm in a noisy environment, we added noise to medical brain tumor MR images and calculated the success rate and segmentation accuracy. The experimental results show that the proposed algorithm performs better than other FCM based techniques on noisy medical MR images.
Keywords: Clustering, fuzzy C-means, image segmentation, MR images, multiple kernels.
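For readers unfamiliar with the baseline, the sketch below shows plain fuzzy C-means clustering of pixel intensities, the algorithm that KFCM and the proposed SMKFCM generalize with kernels and spatial information. The fuzzifier value, the number of clusters and the random initialization are assumptions, and the spatial multiple-kernel terms of SMKFCM are not included.

```python
import numpy as np

def fcm(pixels, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    """Plain fuzzy C-means on a 1-D array of pixel intensities."""
    rng = np.random.default_rng(seed)
    x = pixels.reshape(-1, 1).astype(float)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)            # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
        d = np.abs(x - centers.T) + 1e-12        # distances to each cluster center
        new_u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
        if np.abs(new_u - u).max() < tol:
            u = new_u
            break
        u = new_u
    return centers.ravel(), u

# Segment a synthetic noisy "image" into three intensity classes.
img = np.concatenate([np.full(100, 30.0), np.full(100, 120.0), np.full(100, 220.0)])
img += np.random.default_rng(1).normal(0, 5, img.size)
centers, memberships = fcm(img, c=3)
labels = memberships.argmax(axis=1)              # hard segmentation from fuzzy memberships
print(centers)
```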
2704 Supply Chain Management and E-Commerce Technology Adoption among Logistics Service Providers in Malaysia
Authors: Mohd Iskandar bin Illyas Tan, Iziati Saadah bt Ibrahim
Abstract:
Logistics is part of the supply chain processes that plans, implements, and controls the efficient and effective forward and reverse flow and storage of goods, services, and related information between the point of origin and the point of consumption in order to meet customer requirements. This research aims to investigate the current status and future direction of the use of Information Technology (IT) for logistics, focusing on Supply Chain Management (SCM) and E-Commerce adoption in Malaysia. Therefore, this research focuses on the type of technology being adopted, and the factors, benefits and barriers affecting the innovation in SCM and E-Commerce technology adoption among Logistics Service Providers (LSP). A mailed questionnaire survey was conducted to collect data from 265 logistics companies in Johor. The research revealed a high level of SCM technology adoption among LSP, as they had adopted SCM technology in various business processes and perceived a high level of benefits from SCM adoption.
Keywords: E-Commerce, Logistics Service Providers, Malaysia, Supply Chain Management.
2703 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents
Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei
Abstract:
With the recent advances in deep neural networks, we observe new applications of NLP (natural language processing) and CV (computer vision) powered by deep neural networks for processing business documents. However, creating a real-world document processing system requires integrating several NLP and CV tasks, rather than treating them separately. There is a need for a unified approach to processing documents containing textual and graphical elements with rich formats, diverse layout arrangements, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by various tasks and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build a system for diverse business scenarios, such as contract monitoring and reviewing.
Keywords: Document processing, framework, formal definition, machine learning.
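As an illustration of what such a unified representation model might look like (the class names and fields below are hypothetical and are not the schema defined by DocPro), a document can be held as a tree of typed elements that carry both layout coordinates from the CV tasks and semantic labels from the NLP tasks:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Tuple

@dataclass
class Element:
    kind: str                                   # "paragraph", "table", "figure", "clause", ...
    bbox: Tuple[float, float, float, float]     # layout position produced by the CV tasks
    text: str = ""                              # text recovered by OCR / parsing
    semantics: Dict[str, Any] = field(default_factory=dict)  # labels produced by the NLP tasks

@dataclass
class Page:
    number: int
    elements: List[Element] = field(default_factory=list)

@dataclass
class Document:
    pages: List[Page] = field(default_factory=list)

    def find(self, kind: str) -> List[Element]:
        """Collect all elements of one type, e.g. every contract clause for review."""
        return [e for page in self.pages for e in page.elements if e.kind == kind]

doc = Document([Page(1, [Element("clause", (50, 100, 550, 160),
                                 "Either party may terminate...",
                                 {"clause_type": "termination"})])])
print(len(doc.find("clause")))
```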
2702 Main Cause of Children's Deaths in Indigenous Wayuu Community from Department of La Guajira: A Research Developed through Data Mining Use
Authors: Isaura Esther Solano Núñez, David Suarez
Abstract:
The main purpose of this research is to discover what causes death in children of the Wayuu community and to analyze those results in depth in order to take corrective measures to properly control infant mortality. We consider it important to determine the reasons that are producing early death in this specific population, since it is the most vulnerable to high-risk environmental conditions. In this way, the government, through the competent authorities, may develop prevention policies and take the right measures to avoid an increase of this tragic fact. The methodology used to develop this investigation is data mining, which consists of processing and examining large amounts of data to produce new and valuable information. Through this technique it has been possible to determine that the child population is dying mostly from malnutrition. In short, this technique has been very useful for developing this study; it has allowed us to transform large amounts of information into a conclusive and important statement, which has made it easier to take appropriate steps to resolve a particular situation.
Keywords: Malnutrition, data mining, analytical, descriptive, population, Wayuu, indigenous.
2701 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models
Authors: Danielle Shackley, Yetunde Folajimi
Abstract:
As more people turn to the internet seeking health-related information, there is a greater risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines that were supplemented with health information collected about COVID-19 from social media sources. We started with data preprocessing and tested various vectorization methods such as count and TF-IDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis; those same models were then reproduced with the feature added. We evaluated the models using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and stayed constant at 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, we obtained a 1.9% improvement in the precision score with the Complement model. Future expansion of this work could include replicating the experiment process and substituting a deep learning neural network model for Naive Bayes.
Keywords: Sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model.
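A minimal sketch of the experiment described above: a Complement Naive Bayes model over TF-IDF features, evaluated with and without an appended sentiment column. The toy headlines, the use of scikit-learn and the three-valued sentiment encoding are assumptions for illustration; the paper's dataset and exact feature pipeline are not reproduced.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import ComplementNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score

headlines = ["new vaccine shows strong protection in trial",
             "miracle cure eliminates virus overnight, doctors stunned",
             "study links regular exercise to lower blood pressure",
             "secret government report proves masks cause illness"] * 25
labels    = [0, 1, 0, 1] * 25                 # 0 = reliable, 1 = fake (toy labels)
sentiment = [2, 2, 1, 0] * 25                 # 0 = negative, 1 = neutral, 2 = positive (toy values)

vec = TfidfVectorizer()
X_text = vec.fit_transform(headlines)

def evaluate(X):
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)
    model = ComplementNB().fit(X_tr, y_tr)
    pred = model.predict(X_te)
    return precision_score(y_te, pred), accuracy_score(y_te, pred)

print("without sentiment:", evaluate(X_text))
# Append the sentiment label as one extra (non-negative) feature column.
X_with_sent = hstack([X_text, csr_matrix(np.array(sentiment).reshape(-1, 1))]).tocsr()
print("with sentiment:   ", evaluate(X_with_sent))
```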
2700 Activation Parameters of the Low Temperature Creep Controlling Mechanism in Martensitic Steels
Abstract:
Martensitic steels with an ultimate tensile strength beyond 2000 MPa are applied in the powertrain of vehicles due to their excellent fatigue strength and high creep resistance. However, the creep controlling mechanism in martensitic steels at ambient temperatures up to 423 K is not evident. The purpose of this study is to review the low temperature creep (LTC) behavior of martensitic steels at temperatures from 363 K to 523 K. Thus, the validity of a logarithmic creep law is reviewed, and the stress and temperature dependence of the creep parameters α and β are revealed. Furthermore, creep tests are carried out which include stepped changes in temperature or stress. On the one hand, the change of the creep rate due to a temperature step provides information on the magnitude of the activation energy of the LTC controlling mechanism; on the other hand, the stress step approach provides information on the magnitude of the activation volume. The magnitude, the temperature dependency, and the stress dependency of both material-specific activation parameters may deliver a significant contribution to the disclosure of the nature of the LTC rate controlling mechanism.
Keywords: Activation parameters, creep mechanisms, high strength steels, low temperature creep.
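For orientation, a commonly used form of the logarithmic creep law and the standard relations for extracting activation parameters from stepped tests are sketched below; the exact functional form and evaluation procedure used in the study may differ.

```latex
% One common form of the logarithmic creep law: creep strain and creep rate vs. time
\varepsilon_c(t) = \alpha \,\ln\!\left(1 + \beta t\right),
\qquad
\dot{\varepsilon}_c(t) = \frac{\alpha\beta}{1 + \beta t}

% Apparent activation energy from a temperature step at constant stress
Q = -k \,\left.\frac{\partial \ln \dot{\varepsilon}_c}{\partial (1/T)}\right|_{\sigma}

% Apparent activation volume from a stress step at constant temperature
V^{*} = k T \,\left.\frac{\partial \ln \dot{\varepsilon}_c}{\partial \sigma}\right|_{T}
```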
2699 Thermodynamic Analyses of Information Dissipation along the Passive Dendritic Trees and Active Action Potential
Authors: Bahar Hazal Yalçınkaya, Bayram Yılmaz, Mustafa Özilgen
Abstract:
Brain information transmission in the neuronal network occurs in the form of electrical signals. The neural network transmits information between neurons, or between neurons and target cells, by moving charged particles in a voltage field; a fraction of the energy utilized in this process is dissipated via entropy generation. Exergy loss and entropy generation models demonstrate the inefficiencies of the communication along the dendritic trees. In this study, neurons of four different animals were analyzed with a one-dimensional cable model with N=6 identical dendritic trees and M=3 orders of symmetrical branching. Each branch bifurcates symmetrically in accordance with the 3/2 power law in an infinitely long cylinder with the usual core conductor assumptions, where membrane potential is conserved in the core conductor at all branching points. In the model, exergy loss and entropy generation rates are calculated for each branch of equivalent cylinders of electrotonic length (L) ranging from 0.1 to 1.5 for four different dendritic branches: the input branch (BI), the sister branch (BS) and two cousin branches (BC-1 and BC-2). Thermodynamic analysis with the data coming from two different cat motoneuron studies shows that in both experiments nearly the same amount of exergy is lost while nearly the same amount of entropy is generated. The guinea pig vagal motoneuron loses twofold more exergy compared to the cat models, and the squid exergy loss and entropy generation were nearly tenfold those of the guinea pig vagal motoneuron model. The thermodynamic analysis shows that the energy dissipated in the dendritic trees is directly proportional to the electrotonic length, exergy loss and entropy generation. Entropy generation and exergy loss show variability not only between vertebrates and invertebrates but also within the same class. In addition, the Na+ ion load of a single action potential, the metabolic energy utilization and its thermodynamic aspects were evaluated for the squid giant axon and the mammalian motoneuron models. Energy demand is supplied to the neurons in the form of adenosine triphosphate (ATP). Exergy destruction and entropy generation upon ATP hydrolysis are calculated. ATP utilization, exergy destruction and entropy generation showed differences in each model depending on the variations in the ion transport along the channels.
Keywords: ATP utilization, entropy generation, exergy loss, neuronal information transmittance.
2698 The Public Law Studies: Relationship between Accountability, Environmental Education and Smart Cities
Authors: Aline Alves Bandeira, Luís Pedro Lima, Maria Cecília de Paula Silva, Paulo Henrique de Viveiros Tavares
Abstract:
Nowadays, the study of public policies regarding management efficiency is essential. Public policies are about what governments do or do not do; the field has grown worldwide, contributing knowledge of technologies and methodologies that monitor and evaluate the performance of public administrators. The information published on official government websites needs to provide for the transparency and responsiveness of managers. Transparency is thus a primordial factor for the exercise of accountability, providing services to citizens through the expansion of transparent, efficient and democratic information that values administrative eco-efficiency. The ecologically balanced management of a Smart City must optimize environmental education, building a fairer society that brings about equality in the use of quality environmental resources. Smart Cities add value to the construction of public management, enabling interaction between people, enhancing environmental education and the practical applicability of administrative eco-efficiency, and fostering economic development and an improved quality of life.
Keywords: Accountability, environmental education, new public administration, smart cities.
2697 Integrated Models of Reading Comprehension: Understanding to Impact Teaching: The Teacher’s Central Role
Authors: Sally A. Brown
Abstract:
Over the last 30 years, researchers have developed models or frameworks to provide a more structured understanding of the reading comprehension process. Cognitive information processing models and social cognitive theories both provide frameworks to inform reading comprehension instruction. The purpose of this paper is to (a) provide an overview of the historical development of reading comprehension theory, (b) review the literature framed by cognitive information processing, social cognitive, and integrated reading comprehension theories, and (c) demonstrate how these frameworks inform instruction. As integrated models of reading can guide the interpretation of various factors related to student learning, an integrated framework designed by the researcher will be presented. Results indicated that features of cognitive processing and social cognitivism theory, represented in the integrated framework, highlight the importance of the role of the teacher. This model can aid teachers not only in improving reading comprehension instruction but also in identifying areas of challenge for students.
Keywords: Explicit instruction, integrated models of reading comprehension, reading comprehension, teacher’s role.
2696 Design and Implementation of Reed Solomon Encoder on FPGA
Authors: Amandeep Singh, Mandeep Kaur
Abstract:
Error correcting codes are used for the detection and correction of errors in digital communication systems. Error correcting coding is based on appending redundancy to the information message according to a prescribed algorithm. Reed Solomon codes are part of channel coding and withstand the effects of noise, interference and fading. Galois field arithmetic is used for encoding and decoding Reed Solomon codes. Galois field multipliers and linear feedback shift registers (LFSR) are used for encoding the information data block. The design of a Reed Solomon encoder is complex because of the use of the LFSR and Galois field arithmetic. The purpose of this paper is to design and implement a Reed Solomon (255, 239) encoder with an optimized, smaller number of Galois field multipliers. A symmetric generator polynomial is used to reduce the number of GF multipliers. To increase the error correction capability, convolutional interleaving will be used with the RS encoder. The design will be implemented on a Xilinx Spartan II FPGA.
Keywords: Galois Field, Generator polynomial, LFSR, Reed Solomon.
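As a rough software illustration of the encoding step (a polynomial-division equivalent of the LFSR hardware described above), the sketch below builds a generator polynomial with 16 roots and appends the remainder as parity. The field polynomial 0x11D, the primitive element 2 and the first root α⁰ are assumptions; they do not reflect the specific polynomial choices or the symmetric-generator optimization of the paper.

```python
PRIM = 0x11D  # assumed field polynomial for GF(2^8): x^8 + x^4 + x^3 + x^2 + 1

def gf_mul(a, b):
    """Multiply two GF(2^8) elements, reducing modulo PRIM."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= PRIM
        b >>= 1
    return r

def poly_mul(p, q):
    """Multiply polynomials with GF(2^8) coefficients (addition is XOR)."""
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] ^= gf_mul(a, b)
    return r

def rs_generator_poly(nsym=16):
    """g(x) = (x - 1)(x - alpha)...(x - alpha^(nsym-1)), with alpha = 2 assumed primitive."""
    g, root = [1], 1
    for _ in range(nsym):
        g = poly_mul(g, [1, root])
        root = gf_mul(root, 2)
    return g

def rs_encode(msg, nsym=16):
    """Systematic RS(255, 239) encoding: append the remainder of msg(x)*x^nsym / g(x)."""
    gen = rs_generator_poly(nsym)
    buf = list(msg) + [0] * nsym
    for i in range(len(msg)):
        coef = buf[i]
        if coef:
            for j in range(1, len(gen)):   # gen is monic, so gen[0] needs no multiply
                buf[i + j] ^= gf_mul(gen[j], coef)
    return list(msg) + buf[len(msg):]

codeword = rs_encode(list(range(239)))
print(len(codeword))  # 255 symbols: 239 data + 16 parity
```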
2695 Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm
Authors: Ladan Darougaran, Hossein Shahinzadeh, Hajar Ghotb, Leila Ramezanpour
Abstract:
In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. In fact, protocols that minimize sensor power consumption receive the most attention in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, the technique of combining related data and preventing the transmission of redundant packets, can effectively reduce the number of transmitted packets. Since information processing consumes less power than information transmission, data aggregation is of great importance, and for this reason the technique is used in many protocols [5]. One data aggregation technique is to use a data aggregation tree. However, finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined at intermediate nodes to form one packet, so the number of packets transmitted in the network is reduced; consequently less energy is consumed, which ultimately improves the longevity of the network. Heuristic methods are used to solve this NP-hard problem; one such optimization method is simulated annealing. In this article, we propose a new method for building the data collection tree in wireless sensor networks using the simulated annealing algorithm, and we evaluate its efficiency against a genetic algorithm.
Keywords: Data aggregation, wireless sensor networks, energy efficiency, simulated annealing algorithm, genetic algorithm.
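The sketch below illustrates the general simulated annealing loop applied to an aggregation tree: the tree toward the sink is encoded as a parent array, a neighbor solution re-parents one random node without creating a cycle, and the move is accepted with the usual Metropolis criterion. The energy function (sum of squared link distances as a proxy for transmission energy), the radio range and the cooling schedule are illustrative assumptions, not the parameters of the paper.

```python
import math
import random

random.seed(0)
N, RANGE = 40, 0.35                                  # node count and radio range (assumed)
pos = {i: (random.random(), random.random()) for i in range(N)}
pos[0] = (0.5, 0.5)                                  # node 0 is the sink

def dist2(a, b):
    return (pos[a][0] - pos[b][0]) ** 2 + (pos[a][1] - pos[b][1]) ** 2

neighbors = {i: [j for j in range(N) if j != i and dist2(i, j) <= RANGE ** 2] for i in range(N)}

def initial_tree():
    """Greedy start: each node attaches to an already-attached in-range node nearest the sink."""
    parent = {0: None}
    for i in sorted(range(1, N), key=lambda k: dist2(k, 0)):
        options = [j for j in neighbors[i] if j in parent] or [0]
        parent[i] = min(options, key=lambda j: dist2(j, 0))
    return parent

def energy(parent):
    return sum(dist2(i, p) for i, p in parent.items() if p is not None)

def creates_cycle(parent, node, new_parent):
    cur = new_parent
    while cur is not None:
        if cur == node:
            return True
        cur = parent[cur]
    return False

def anneal(parent, T=1.0, cooling=0.995, steps=20_000):
    e = energy(parent)
    for _ in range(steps):
        node = random.randrange(1, N)
        cand = random.choice(neighbors[node] or [0])
        if cand == parent[node] or creates_cycle(parent, node, cand):
            continue
        old = parent[node]
        parent[node] = cand
        de = energy(parent) - e
        if de <= 0 or random.random() < math.exp(-de / T):
            e += de                                   # accept the move
        else:
            parent[node] = old                        # reject and restore
        T *= cooling
    return parent, e

tree, cost = anneal(initial_tree())
print(cost)
```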
2694 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System
Authors: Vuk M. Popovic, Dunja D. Popovic
Abstract:
The software maintenance phase starts once a software project has been developed and delivered; after that, any modification to it corresponds to maintenance. Software maintenance involves modifications to keep a software project usable in a changed or changing environment, to correct discovered faults, and to improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as two of the most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance on risk and working-time evaluation and uses them as data sets for working with neural networks. The aim of this paper is to provide support to project maintenance managers. They will be able to pass the issues planned for the next software-service-patch to the experts for risk and working-time evaluation, and afterwards feed all data to neural networks in order to obtain a software maintenance prediction. This process will lead to a more accurate prediction of the working hours needed for the software-service-patch, which will eventually lead to better planning of the budget for software maintenance projects.
Keywords: Laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs.
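A toy sketch of the final prediction step, assuming the issues of a planned software-service-patch have already been scored by experts: a small neural network regressor maps (risk score, estimated effort) pairs to actual working hours. The synthetic data, the two-feature set and the scikit-learn MLPRegressor are illustrative assumptions, not the network or data of the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n_issues = 300
risk = rng.uniform(0, 1, n_issues)               # expert risk evaluation per issue
estimate = rng.uniform(1, 40, n_issues)          # expert working-time estimate (hours)
# Synthetic "ground truth": risky issues tend to overrun their estimates.
actual_hours = estimate * (1.0 + 0.8 * risk) + rng.normal(0, 2, n_issues)

X = np.column_stack([risk, estimate])
X_tr, X_te, y_tr, y_te = train_test_split(X, actual_hours, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
print("MAE on held-out issues:", mean_absolute_error(y_te, model.predict(X_te)))
```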
2693 Lexical Based Method for Opinion Detection on Tripadvisor Collection
Authors: Faiza Belbachir, Thibault Schienhinski
Abstract:
The massive development of online social networks allows users to post and share their opinions on various topics. Given this huge volume of opinions, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommendation systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data; the difficulty of the task is to determine an approach that returns opinionated documents. Generally, two kinds of approaches are used for opinion detection, i.e., lexical based approaches and machine learning based approaches. In lexical based approaches, a dictionary of sentimental words is used and words are associated with weights; the opinion score of a document is derived from the occurrence of words from this dictionary. In machine learning approaches, a classifier is usually trained using a set of annotated documents containing sentiment, and features such as n-grams of words, part-of-speech tags, and logical forms. The majority of these works rely on document text to determine the opinion score but do not take into account whether these texts are really trustworthy. Thus, it is interesting to exploit other information to improve opinion detection. In our work, we develop a new way to consider the opinion score by introducing the notion of a trust score. We determine not only whether documents are opinionated but also whether these opinions are really trustworthy information in relation to the topics. For that, we use the lexical resource SentiWordNet to calculate opinion scores, and we compute different features about users (number of comments, number of useful comments, average usefulness of reviews) to obtain trust scores. After that, we combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the TripAdvisor collection. Our experimental results show that the combination of opinion score and trust score improves opinion detection.
Keywords: TripAdvisor, opinion detection, SentiWordNet, trust score.
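A rough sketch of the scoring idea, assuming NLTK with the SentiWordNet corpus is available: the opinion score averages the positive-minus-negative scores of a document's words, the trust score is a normalized combination of user features, and the final score is a weighted sum. The 0.5/0.5 weighting and the specific user features are illustrative assumptions, not the exact formulas of the paper.

```python
# Requires: nltk.download("sentiwordnet"); nltk.download("wordnet")
from nltk.corpus import sentiwordnet as swn

def opinion_score(text):
    """Average (positive - negative) SentiWordNet score over words that have synsets."""
    scores = []
    for word in text.lower().split():
        synsets = list(swn.senti_synsets(word))
        if synsets:
            s = synsets[0]                       # first sense only, for simplicity
            scores.append(s.pos_score() - s.neg_score())
    return sum(scores) / len(scores) if scores else 0.0

def trust_score(user):
    """Normalized user features: comment count and share of useful comments."""
    activity = min(user["n_comments"] / 100.0, 1.0)
    usefulness = user["n_useful"] / max(user["n_comments"], 1)
    return 0.5 * activity + 0.5 * usefulness

def final_score(text, user, w_opinion=0.5, w_trust=0.5):
    return w_opinion * abs(opinion_score(text)) + w_trust * trust_score(user)

review = "the hotel was wonderful and the staff friendly but the room was dirty"
user = {"n_comments": 42, "n_useful": 30}
print(final_score(review, user))
```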
2692 Data Mining Classification Methods Applied in Drug Design
Authors: Mária Stachová, Lukáš Sobíšek
Abstract:
Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential. They can help people to understand the patterns in a certain chunk of information, so data mining tools have a wide range of applications. For example, in theoretical chemistry, data mining tools can be used to predict molecule properties or improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of this contribution is to create a classification model that would be able to deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression and random forest models were built using the R software. The Bayesian logistic regression in the Latent GOLD software was created as well. These classification methods belong to supervised learning methods. It was necessary to reduce the dimension of the data matrix before constructing the models, and thus factor analysis (FA) was used. Those models were applied to predict the biological activity of molecules, potential new drug candidates.
Keywords: Data mining, classification, drug design, QSAR.
2691 Validation of Reverse Engineered Web Application Models
Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini
Abstract:
Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult, due to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
Keywords: Validation, dynamic analysis, mutation analysis, reverse engineering, Web applications.
2690 Low-Latency and Low-Overhead Path Planning for In-band Network-Wide Telemetry
Authors: Penghui Zhang, Hua Zhang, Jun-Bo Wang, Cheng Zeng, Zijian Cao
Abstract:
With the development of software-defined networks and programmable data planes, in-band network telemetry (INT) has become an emerging technology in communications because it can obtain accurate and real-time network information. However, due to the expansion of the network scale, existing telemetry systems, to the best of the authors’ knowledge, have difficulty in meeting the common requirements of low overhead, low latency and full coverage for traffic measurement. This paper proposes a network-wide telemetry system with low-latency, low-overhead path planning (INT-LLPP). We build a mathematical model to analyze the telemetry overhead and latency of INT systems, and we adopt a greedy-based path planning algorithm to reduce the overhead and latency of network telemetry while retaining full network coverage. The simulation results show that network-wide telemetry is achieved and that the telemetry overhead can be reduced significantly compared with existing INT systems. INT-LLPP controls the system latency so that real-time network information is obtained.
Keywords: Network telemetry, network monitoring, path planning, low latency.
2689 A Procedure to Assess Streamflow Rating Curves and Streamflow Sequences
Authors: Elena Carcano, Mirzi Betasolo
Abstract:
This study aims to provide sub-hourly streamflow predictions and associated rating curves for small catchments with an intermittent and torrential flow regime characterized by flash floods, which occur especially during April and November. The methodology entails two lumped conceptual hydrological models which work in series. The total model is based upon eleven parameters and shows good flexibility in handling different input sets. The runoff coefficient has contributed to improving the model’s performance and has been treated as an additional parameter, while sensitivity analysis has highlighted how slight changes in the model’s input can lead to changes in its output. The adopted procedure is robust and provides very practical engineering information while making only a parsimonious demand both on input data and on the number of adopted parameters. Based on the obtained results, the authors encourage testing this combined procedure on different hydrological scenarios in order to provide information for poorly monitored catchments and sites that are not kept up to date.
Keywords: Streamflow rating curve, chronological data, streamflow sequences, conceptual models.
2688 Accuracy of Displacement Estimation and Selection of Capacitors for a Four Degrees of Freedom Capacitive Force Sensor
Authors: Chisato Murakami, Makoto Takahashi
Abstract:
Force sensors are required for obtaining information on the magnitude and directions of forces on the skin surface. We have developed a four-degrees-of-freedom capacitive force sensor (approximately 20×20×5 mm³) that has a flexible structure and sixteen parallel plate capacitors. An iterative algorithm was developed for estimating the four displacements from the sixteen capacitances, using a fourth-order polynomial approximation of the characteristics between capacitance and displacement. The estimates obtained from measured capacitances had large errors caused by deterioration of the characteristics. In this study, the effective capacitors, those carrying the major information, were selected on the basis of the capacitance change range and the characteristic shape. The maximum errors at calibration and non-calibration points were 25% and 6.8%, respectively. Although the maximum error was larger than the desired value, the small averaged error indicated that only a few points had large errors. Moreover, the error at the non-calibration points was within the desired value.
Keywords: Force sensors, capacitive sensors, estimation, iterative algorithms.
2687 Attacks Classification in Adaptive Intrusion Detection using Decision Tree
Authors: Dewan Md. Farid, Nouria Harbi, Emna Bahri, Mohammad Zahidur Rahman, Chowdhury Mofizur Rahman
Abstract:
Recently, information security has become a key issue in information technology as computer systems are exposed to an increasing number of security threats and breaches. A variety of intrusion detection systems (IDS) have been employed for protecting computers and networks from malicious network-based or host-based attacks, using approaches ranging from traditional statistical methods to newer data mining approaches over the last decades. However, today's commercially available intrusion detection systems are signature-based and are not capable of detecting unknown attacks. In this paper, we present a new learning algorithm for an anomaly-based network intrusion detection system using a decision tree algorithm that distinguishes attacks from normal behaviors and identifies different types of intrusions. Experimental results on the KDD99 benchmark network intrusion detection dataset demonstrate that the proposed learning algorithm achieved a 98% detection rate (DR) in comparison with other existing methods.
Keywords: Detection rate, decision tree, intrusion detection system, network security.
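A minimal, purely illustrative baseline (a plain scikit-learn decision tree on KDD-style numeric features, not the adaptive learning algorithm proposed in the paper) showing how a decision tree separates normal traffic from attacks and how the detection rate is read off:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
# Toy stand-in for KDD99 numeric features: [duration, src_bytes, dst_bytes, count]
normal = rng.normal([10, 500, 2000, 5], [5, 200, 800, 2], size=(500, 4))
attack = rng.normal([1, 5000, 50, 200], [1, 1500, 30, 60], size=(500, 4))
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 500)            # 0 = normal, 1 = intrusion

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)

# Detection rate (DR) corresponds to recall on the attack class.
print("detection rate:", recall_score(y_te, tree.predict(X_te)))
```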
2686 A New Approach to Image Segmentation via Fuzzification of Rènyi Entropy of Generalized Distributions
Authors: Samy Sadek, Ayoub Al-Hamadi, Axel Panning, Bernd Michaelis, Usama Sayed
Abstract:
In this paper, we propose a novel approach to image segmentation via fuzzification of the Rènyi entropy of generalized distributions (REGD). The fuzzy REGD is used to precisely measure the structural information of the image and to locate the optimal threshold desired by segmentation. The proposed approach draws upon the postulation that the optimal threshold concurs with the maximum information content of the distribution. The contributions of the paper are as follows: initially, the fuzzy REGD is introduced as a measure of the spatial structure of the image. Then, we propose an efficient entropic segmentation approach using fuzzy REGD. Although the proposed approach belongs to the family of entropic segmentation approaches, which are commonly applied to grayscale images, it is adapted to be viable for segmenting color images. Lastly, diverse experiments on real images that show the superior performance of the proposed method are carried out.
Keywords: Entropy of generalized distributions, entropy fuzzification, entropic image segmentation.
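For orientation, the sketch below implements the classical (crisp) Rényi-entropy thresholding that this work fuzzifies and generalizes: for every candidate threshold, the Rényi entropies of the background and foreground distributions are summed and the maximizing threshold is chosen. The entropy order α = 0.7 and the grayscale-only treatment are illustrative assumptions.

```python
import numpy as np

def renyi_threshold(image, alpha=0.7):
    """Crisp Rényi-entropy thresholding of a grayscale image (values 0..255)."""
    hist = np.bincount(image.ravel().astype(np.uint8), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p1, p2 = p[:t].sum(), p[t:].sum()
        if p1 == 0 or p2 == 0:
            continue
        q1, q2 = p[:t] / p1, p[t:] / p2                 # class-conditional distributions
        h1 = np.log(np.sum(q1[q1 > 0] ** alpha)) / (1 - alpha)
        h2 = np.log(np.sum(q2[q2 > 0] ** alpha)) / (1 - alpha)
        if h1 + h2 > best_h:                            # maximum total information content
            best_t, best_h = t, h1 + h2
    return best_t

rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)]).clip(0, 255)
print(renyi_threshold(img.reshape(100, 100)))
```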
2685 Examination of Readiness of Teachers in the Use of Information-Communication Technologies in the Classroom
Authors: Nikolina Ribarić
Abstract:
This paper compares the readiness of chemistry teachers to use information and communication technologies in chemistry in 2018 and 2021. A survey conducted in 2018 on a sample of teachers showed that most teachers occasionally use visualization and digitization tools in chemistry teaching (65%), but feel that they are not educated enough to use them (56%). Also, most teachers do not have adequate equipment in their schools and are not able to use ICT in teaching or digital tools for visualization and digitization of content (44%). None of the teachers find the use of digitization and visualization tools useless. Furthermore, a survey conducted in 2021 shows that most teachers occasionally use visualization and digitization tools in chemistry teaching (83%). Also, the research shows that some teachers still do not have adequate equipment in their schools and are not able to use ICT in chemistry teaching or digital tools for visualization and digitization of content (14%). Advances in the use of ICT in chemistry teaching are linked to pandemic conditions and the obligation to conduct online teaching. The share of 14% of teachers who still do not have adequate equipment to use digital tools in teaching is worrying.
Keywords: Chemistry, digital content, e-learning, ICT, visualization.
2684 Online Topic Model for Broadcasting Contents Using Semantic Correlation Information
Authors: Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park, Sang-Jo Lee
Abstract:
This paper proposes a method of learning topics for broadcasting contents. There are two kinds of texts related to broadcasting contents. One is the broadcasting script, which is a series of texts including directions and dialogues. The other is blog posts, which contain relatively abstract content, stories, and diverse information about the broadcast. Although the two kinds of text cover similar broadcasting contents, the words in blog posts and broadcasting scripts are different. When unseen words appear, a method is needed to reflect them in the existing topics. In this paper, we introduce a semantic vocabulary expansion method to reflect unseen words. We expand the topics of the broadcasting script by incorporating the words in blog posts. Each word in the blog posts is added to the most semantically correlated topics. We use word2vec to obtain the semantic correlation between words in blog posts and the topics of scripts. The vocabularies of the topics are updated and then posterior inference is performed to rearrange the topics. In experiments, we verified that the proposed method can discover more salient topics for broadcasting contents.
Keywords: Broadcasting script analysis, topic expansion, semantic correlation analysis, word2vec.
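A simplified sketch of the vocabulary-expansion step: each unseen blog-post word is compared, by cosine similarity of word vectors, to the centroid of every script topic's vocabulary and appended to the closest one. The tiny hand-made embedding table stands in for a trained word2vec model and is purely an assumption for illustration.

```python
import numpy as np

# Stand-in for word2vec vectors (in practice these come from a trained model).
emb = {
    "detective": np.array([0.9, 0.1, 0.0]), "murder":  np.array([0.8, 0.2, 0.1]),
    "alibi":     np.array([0.7, 0.3, 0.0]), "recipe":  np.array([0.1, 0.9, 0.1]),
    "chef":      np.array([0.0, 0.8, 0.2]), "kitchen": np.array([0.1, 0.7, 0.3]),
    "suspect":   np.array([0.85, 0.15, 0.05]),
}

topics = {                       # topics learned from the broadcasting script
    "crime":   ["detective", "murder", "alibi"],
    "cooking": ["recipe", "chef", "kitchen"],
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand_topics(blog_words, topics, emb):
    """Add each unseen blog word to the most semantically correlated topic."""
    centroids = {t: np.mean([emb[w] for w in ws], axis=0) for t, ws in topics.items()}
    for word in blog_words:
        if word in emb and all(word not in ws for ws in topics.values()):
            best = max(centroids, key=lambda t: cosine(emb[word], centroids[t]))
            topics[best].append(word)
    return topics

print(expand_topics(["suspect", "kitchen"], topics, emb))
```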
2683 A Survey on Supply Chain Management and E Commerce Technology Adoption among Logistics Service Providers in Johor
Authors: Mohd Iskandar bin Illyas Tan, Iziati Saadah bt Ibrahim
Abstract:
Logistics is part of the supply chain processes that plans, implements, and controls the efficient and effective forward and reverse flow and storage of goods, services, and related information between the point of origin and the point of consumption in order to meet customer requirements. This research aims to investigate the current status and future direction of the use of Information Technology (IT) for logistics, focusing on Supply Chain Management (SCM) and E-Commerce adoption in Johor. Therefore, this research focuses on the type of technology being adopted, and the factors, benefits and barriers affecting the innovation in SCM and E-Commerce technology adoption among Logistics Service Providers (LSP). A mailed questionnaire survey was conducted to collect data from 265 logistics companies in Johor. The research revealed that SCM technology adoption among LSP was high, as they had adopted SCM technology in various business processes and perceived a high level of benefits from SCM adoption. In contrast, E-Commerce technology adoption among LSP is relatively low.
Keywords: E-Commerce, Johor, Logistics Service Providers, Supply Chain Management.
2682 Dynamic Reroute Modeling for Emergency Evacuation: Case Study of Brunswick City, Germany
Authors: Yun-Pang Flötteröd, Jakob Erdmann
Abstract:
Human behavior during evacuations is quite complex. One of the critical behaviors affecting the efficiency of an evacuation is route choice; therefore, the respective simulation modeling needs to function properly. In this paper, the current dynamic route modeling during evacuation in Simulation of Urban Mobility (SUMO), i.e. the rerouting functions, is examined with a real case study, and the consistency of the simulation results with reality is checked as well. Four influence factors, (1) time to get information, (2) probability of cancelling a trip, (3) probability of using navigation equipment, and (4) rerouting and information updating period, are considered to analyze possible traffic impacts during the evacuation and to examine the rerouting functions in SUMO. Furthermore, some behavioral characteristics of the case study are analyzed with the corresponding detector data and applied in the simulation. The experiment results show that the dynamic route modeling in SUMO can deal with the proposed scenarios properly. Some issues and function needs related to route choice are discussed and further improvements are suggested.
Keywords: Evacuation, microscopic traffic simulation, rerouting, SUMO.
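A minimal TraCI control loop of the kind used to drive such experiments: at a fixed updating period, a given share of vehicles (those assumed to have navigation equipment) is rerouted on current travel times. The configuration file name, the 60 s period and the 40% equipment share are placeholders, not the settings of the Brunswick case study.

```python
import random
import traci

REROUTE_PERIOD = 60      # s, assumed rerouting/information updating period
P_NAVIGATION = 0.4       # assumed share of vehicles with navigation equipment

traci.start(["sumo", "-c", "evacuation.sumocfg"])    # placeholder config name
equipped = set()
step = 0
while traci.simulation.getMinExpectedNumber() > 0:
    traci.simulationStep()
    for veh in traci.simulation.getDepartedIDList():
        if random.random() < P_NAVIGATION:
            equipped.add(veh)                        # decide once per vehicle at departure
    if step % REROUTE_PERIOD == 0:
        for veh in traci.vehicle.getIDList():
            if veh in equipped:
                traci.vehicle.rerouteTraveltime(veh) # reroute on current travel times
    step += 1
traci.close()
```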
2681 Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images
Authors: Ali Zifan, Panos Liatsis, Panagiotis Kantartzis, Manolis Gavaises, Nicos Karcanias, Demosthenes Katritsis
Abstract:
We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method, based on the information in the Hessian matrix, is introduced for the enhancement of the vascular structure. Hysteresis thresholding using different image quantiles is used to threshold the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines, combined with disparity map information, allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
Keywords: Vessel enhancement, centerline extraction, symbiotic reconstruction.
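A compact 2D sketch of the enhancement, thresholding and thinning stages using scikit-image building blocks: Gaussian smoothing stands in for the curve-evolution step, the Frangi vesselness filter plays the role of the Hessian-based enhancement, and quantile-based hysteresis thresholding is followed by skeletonization. The filter scales, quantiles and pruning size are assumed values, and the stereo correspondence and 3D reconstruction stages are not shown.

```python
import numpy as np
from skimage import data, filters, morphology

# Any grayscale angiogram-like image; a sample image stands in here.
image = data.camera().astype(float) / 255.0

smoothed = filters.gaussian(image, sigma=1.0)          # stand-in for curve-evolution smoothing
vesselness = filters.frangi(smoothed)                  # Hessian-based multiscale enhancement

# Hysteresis thresholding with quantile-based low/high thresholds (assumed quantiles).
low, high = np.quantile(vesselness, [0.90, 0.98])
vessels = filters.apply_hysteresis_threshold(vesselness, low, high)

# Thinning to centerlines, then pruning of small non-vessel-like structures.
centerlines = morphology.skeletonize(vessels)
centerlines = morphology.remove_small_objects(centerlines, min_size=30)

print("centerline pixels:", int(centerlines.sum()))
```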