Search results for: Network and Information Security
3051 Mining Association Rules from Unstructured Documents
Authors: Hany Mahgoub
Abstract:
This paper presents EART (Extract Association Rules from Text), a system for discovering association rules from collections of unstructured documents. The EART system handles text only, not images or figures. EART discovers association rules among the keywords labeling the collection of textual documents. The main characteristic of EART is that the system integrates XML technology (to transform unstructured documents into structured documents) with an Information Retrieval scheme (TF-IDF) and a Data Mining technique for association rule extraction. EART relies on word features to extract association rules. It consists of four phases: a structure phase, an index phase, a text mining phase and a visualization phase. Our work rests on the analysis of the keywords in the extracted association rules, distinguishing keywords that co-occur in one sentence of the original text from keywords that appear in the text without such co-occurrence. Experiments were applied to a collection of scientific documents selected from MEDLINE that are related to the outbreak of the H5N1 avian influenza virus.
Keywords: Association rules, information retrieval, knowledge discovery in text, text mining.
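For orientation on the indexing and mining steps described above, here is a minimal Python sketch that computes TF-IDF weights for document keywords and derives one-to-one association rules from keyword co-occurrence using support and confidence thresholds. The toy corpus, the binary term frequency, and the thresholds are assumptions for illustration; this is not the EART implementation.

```python
import math
from itertools import combinations

# Hypothetical toy corpus: each document already reduced to a set of keywords.
docs = [
    {"h5n1", "influenza", "outbreak", "poultry"},
    {"h5n1", "vaccine", "influenza"},
    {"outbreak", "poultry", "surveillance"},
    {"h5n1", "influenza", "surveillance"},
]

def tf_idf(docs):
    """TF-IDF weight of each keyword in each document (binary term frequency)."""
    n = len(docs)
    df = {}
    for d in docs:
        for w in d:
            df[w] = df.get(w, 0) + 1
    return [{w: math.log(n / df[w]) for w in d} for d in docs]

def association_rules(docs, min_support=0.5, min_confidence=0.7):
    """One-to-one rules w1 -> w2 from keyword co-occurrence across documents."""
    n = len(docs)
    count = {}
    for d in docs:
        for w in d:
            count[(w,)] = count.get((w,), 0) + 1
        for pair in combinations(sorted(d), 2):
            count[pair] = count.get(pair, 0) + 1
    rules = []
    for (w1, w2), c in ((k, v) for k, v in count.items() if len(k) == 2):
        support = c / n
        if support < min_support:
            continue
        for a, b in ((w1, w2), (w2, w1)):
            confidence = c / count[(a,)]
            if confidence >= min_confidence:
                rules.append((a, b, support, confidence))
    return rules

print(tf_idf(docs)[0])
print(association_rules(docs))
```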
3050 Main Cause of Children's Deaths in Indigenous Wayuu Community from Department of La Guajira: A Research Developed through Data Mining Use
Authors: Isaura Esther Solano Núñez, David Suarez
Abstract:
The main purpose of this research is to discover what causes death in children of the Wayuu community and to analyze those results in depth in order to take corrective measures to properly control infant mortality. We consider it important to determine the reasons that are producing early death in this specific population, since they are the most vulnerable to high-risk environmental conditions. In this way, the government, through the competent authorities, may develop prevention policies and the right measures to avoid an increase in these tragic deaths. The methodology used to develop this investigation is data mining, which consists of gathering and examining large amounts of data to produce new and valuable information. Through this technique it has been possible to determine that the child population is dying mostly from malnutrition. In short, this technique has been very useful for developing this study; it has allowed us to transform large amounts of information into a conclusive and important statement, which has made it easier to take appropriate steps to resolve a particular situation.
Keywords: Malnutrition, data mining, analytical, descriptive, population, Wayuu, indigenous.
3049 Activation Parameters of the Low Temperature Creep Controlling Mechanism in Martensitic Steels
Abstract:
Martensitic steels with an ultimate tensile strength beyond 2000 MPa are applied in the powertrain of vehicles due to their excellent fatigue strength and high creep resistance. However, the creep controlling mechanism in martensitic steels at ambient temperatures up to 423 K is not evident. The purpose of this study is to review the low temperature creep (LTC) behavior of martensitic steels at temperatures from 363 K to 523 K. Thus, the validity of a logarithmic creep law is reviewed and the stress and temperature dependence of the creep parameters α and β are revealed. Furthermore, creep tests are carried out, which include stepped changes in temperature or stress, respectively. On one hand, the change of the creep rate due to a temperature step provides information on the magnitude of the activation energy of the LTC controlling mechanism and on the other hand, the stress step approach provides information on the magnitude of the activation volume. The magnitude, the temperature dependency, and the stress dependency of both material specific activation parameters may deliver a significant contribution to the disclosure of the nature of the LTC rate controlling mechanism.
Keywords: Activation parameters, creep mechanisms, high strength steels, low temperature creep.
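As a rough illustration of how the stepped tests yield the two activation parameters, the sketch below computes an apparent activation energy from the creep-rate change across a temperature step (assuming an Arrhenius-type rate) and an apparent activation volume from the rate change across a stress step. The numerical rates, temperatures and stresses are placeholders, not data from the study.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def activation_energy(rate1, rate2, T1, T2):
    """Apparent activation energy Q (J) from a temperature step T1 -> T2,
    assuming rate ~ exp(-Q / (k_B * T)) at constant stress and structure."""
    return K_B * math.log(rate2 / rate1) / (1.0 / T1 - 1.0 / T2)

def activation_volume(rate1, rate2, sigma1, sigma2, T):
    """Apparent activation volume V (m^3) from a stress step sigma1 -> sigma2,
    assuming rate ~ exp(V * sigma / (k_B * T)) at constant temperature."""
    return K_B * T * math.log(rate2 / rate1) / (sigma2 - sigma1)

# Placeholder creep rates (1/s) before/after the steps -- illustrative only.
Q = activation_energy(rate1=1.0e-9, rate2=4.0e-9, T1=423.0, T2=443.0)
V = activation_volume(rate1=1.0e-9, rate2=3.0e-9, sigma1=1500e6, sigma2=1550e6, T=423.0)
print(f"Q = {Q / 1.602e-19:.2f} eV")
print(f"V = {V:.3e} m^3")
```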
3048 The Public Law Studies: Relationship between Accountability, Environmental Education and Smart Cities
Authors: Aline Alves Bandeira, Luís Pedro Lima, Maria Cecília de Paula Silva, Paulo Henrique de Viveiros Tavares
Abstract:
Nowadays, the study of public policies regarding management efficiency is essential. Public policies concern what governments do or do not do; the field has grown worldwide, contributing knowledge of technologies and methodologies that monitor and evaluate the performance of public administrators. The information published on official government websites needs to ensure the transparency and responsiveness of managers. Thus, transparency is a primordial factor for the execution of accountability, providing services to the citizen together with the expansion of transparent, efficient and democratic information that values administrative eco-efficiency. The ecologically balanced management of a Smart City must optimize environmental education, building a fairer society that brings about equality in the use of quality environmental resources. Smart Cities add value to the construction of public management, enabling interaction between people, enhancing environmental education and the practical applicability of administrative eco-efficiency, fostering economic development and improving the quality of life.
Keywords: Accountability, environmental education, new public administration, smart cities.
3047 Integrated Models of Reading Comprehension: Understanding to Impact Teaching: The Teacher’s Central Role
Authors: Sally A. Brown
Abstract:
Over the last 30 years, researchers have developed models or frameworks to provide a more structured understanding of the reading comprehension process. Cognitive information processing models and social cognitive theories both provide frameworks to inform reading comprehension instruction. The purpose of this paper is to (a) provide an overview of the historical development of reading comprehension theory, (b) review the literature framed by cognitive information processing, social cognitive, and integrated reading comprehension theories, and (c) demonstrate how these frameworks inform instruction. As integrated models of reading can guide the interpretation of various factors related to student learning, an integrated framework designed by the researcher will be presented. Results indicated that the features of cognitive processing and social cognitive theory represented in the integrated framework highlight the importance of the role of the teacher. This model can aid teachers not only in improving reading comprehension instruction but also in identifying areas of challenge for students.
Keywords: Explicit instruction, integrated models of reading comprehension, reading comprehension, teacher’s role.
3046 Design and Implementation of Reed Solomon Encoder on FPGA
Authors: Amandeep Singh, Mandeep Kaur
Abstract:
Error correcting codes are used for the detection and correction of errors in digital communication systems. Error correcting coding is based on appending redundancy to the information message according to a prescribed algorithm. Reed Solomon codes are part of channel coding and withstand the effects of noise, interference and fading. Galois field arithmetic is used for encoding and decoding Reed Solomon codes. Galois field multipliers and linear feedback shift registers are used for encoding the information data block. The design of a Reed Solomon encoder is complex because of the use of an LFSR and Galois field arithmetic. The purpose of this paper is to design and implement a Reed Solomon (255, 239) encoder with an optimized, smaller number of Galois field multipliers. A symmetric generator polynomial is used to reduce the number of GF multipliers. To increase the error correction capability, convolutional interleaving will be used with the RS encoder. The design will be implemented on a Xilinx Spartan II FPGA.
Keywords: Galois Field, Generator polynomial, LFSR, Reed Solomon.
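To make the encoding structure concrete, here is a minimal software sketch of an RS(255, 239) encoder built from GF(2^8) multiplication and the LFSR-style polynomial division mentioned in the abstract. The field polynomial 0x11D and the generator roots α^0 ... α^15 are assumptions; the paper's hardware design, and its symmetric-generator optimization, may use different parameters.

```python
# Minimal RS(255, 239) encoder sketch over GF(2^8).
# Assumed field polynomial 0x11D and generator roots alpha^0..alpha^15.

def gf_mul(a, b, poly=0x11D):
    """Multiply two GF(2^8) elements (carry-less multiply with reduction)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= poly
    return r

def poly_mul(p, q):
    """Multiply two polynomials with GF(2^8) coefficients."""
    r = [0] * (len(p) + len(q) - 1)
    for i, pc in enumerate(p):
        for j, qc in enumerate(q):
            r[i + j] ^= gf_mul(pc, qc)
    return r

def rs_generator_poly(nsym=16):
    """g(x) = (x - alpha^0)(x - alpha^1)...(x - alpha^(nsym-1)), alpha = 2."""
    g, root = [1], 1
    for _ in range(nsym):
        g = poly_mul(g, [1, root])   # subtraction equals XOR in GF(2^8)
        root = gf_mul(root, 2)
    return g

def rs_encode(msg, nsym=16):
    """Systematic encoding: append the remainder of msg(x)*x^nsym / g(x)."""
    gen = rs_generator_poly(nsym)
    rem = [0] * nsym
    for byte in msg:
        feedback = byte ^ rem[0]              # LFSR feedback term
        rem = rem[1:] + [0]                   # shift the register
        for i in range(nsym):
            rem[i] ^= gf_mul(gen[i + 1], feedback)
    return list(msg) + rem

codeword = rs_encode(list(range(239)))        # 239 data bytes -> 255-byte codeword
print(len(codeword), codeword[-16:])          # 16 parity symbols at the end
```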
3045 Computer Proven Correctness of the Rabin Public-Key Scheme
Authors: Johannes Buchmann, Markus Kaiser
Abstract:
We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we get reliable proofs with a minimal error rate that augment the used database. This provides a formal basis for more computer proof constructions in this area.
Keywords: Public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.
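As plain background on the scheme being verified, here is a small Python sketch of Rabin encryption and decryption: the ciphertext is the square of the message modulo n = pq, and the four square roots are recovered via the Chinese Remainder Theorem when p ≡ q ≡ 3 (mod 4). The tiny primes are purely illustrative and say nothing about the Isabelle/HOL formalization itself.

```python
def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y == g."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def rabin_encrypt(m, n):
    return (m * m) % n

def rabin_decrypt(c, p, q):
    """Return the four square roots of c mod n (p, q must be congruent to 3 mod 4)."""
    n = p * q
    mp = pow(c, (p + 1) // 4, p)        # square root of c mod p
    mq = pow(c, (q + 1) // 4, q)        # square root of c mod q
    _, yp, yq = egcd(p, q)              # yp*p + yq*q == 1
    r = (yp * p * mq + yq * q * mp) % n
    s = (yp * p * mq - yq * q * mp) % n
    return {r, n - r, s, n - s}

# Toy parameters (never use sizes like this in practice).
p, q = 7, 11                            # both congruent to 3 mod 4
n = p * q
m = 20
c = rabin_encrypt(m, n)
print(c, rabin_decrypt(c, p, q))        # the plaintext 20 is among the four roots
```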
3044 Benchmarking of Pentesting Tools
Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
The benchmarking of tools for dynamic analysis of vulnerabilities in web applications is something that is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or on non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific standpoint use the same methodology as the empirical authors. This paper is motivated by the interest in finding answers to questions that users of this type of tool have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. Such questions have also motivated previous work, but without real answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in those previous works.
Keywords: Cybersecurity, IDS, security, web scanners, web vulnerabilities.
3043 Lexical Based Method for Opinion Detection on Tripadvisor Collection
Authors: Faiza Belbachir, Thibault Schienhinski
Abstract:
The massive development of online social networks allows users to post and share their opinions on various topics. With this huge volume of opinions, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommendation systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data. The difficulty of this task is to determine an approach that returns opinionated documents. Generally, two approaches are used for opinion detection, i.e., lexicon-based approaches and machine-learning-based approaches. In lexicon-based approaches, a dictionary of sentiment words is used, where words are associated with weights. The opinion score of a document is derived from the occurrence of words from this dictionary. In machine learning approaches, a classifier is usually trained on a set of annotated documents containing sentiment, using features such as word n-grams, part-of-speech tags, and logical forms. The majority of these works determine the opinion score from the document text but do not take into account whether these texts are really trustworthy. Thus, it is interesting to exploit other information to improve opinion detection. In our work, we develop a new way to consider the opinion score. We introduce the notion of a trust score: we determine opinionated documents, but also whether these opinions are really trustworthy information in relation to the topics. For that, we use the SentiWordNet lexicon to calculate the opinion score, and we compute different features about users (number of comments, number of useful comments, average useful reviews) for the trust score. After that, we combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the Tripadvisor collection. Our experimental results report that the combination of opinion score and trust score improves opinion detection.
Keywords: Tripadvisor, opinion detection, SentiWordNet, trust score.
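As an illustration of the score combination described above, the sketch below computes an opinion strength from a tiny sentiment lexicon (standing in for SentiWordNet), a trust score from simple user-profile features, and a weighted final score. The lexicon, the user features and the weight w are hypothetical.

```python
# Illustrative sketch of the opinion/trust score combination.
# The tiny lexicon stands in for SentiWordNet; the weight w is an assumption.

LEXICON = {"great": 0.8, "good": 0.6, "bad": -0.7, "terrible": -0.9, "clean": 0.5}

def opinion_score(text):
    """Average absolute sentiment weight of lexicon words found in the review."""
    hits = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(abs(h) for h in hits) / len(hits) if hits else 0.0

def trust_score(user):
    """Normalised user reliability from simple profile features (hypothetical)."""
    useful_ratio = user["useful_comments"] / max(user["comments"], 1)
    activity = min(user["comments"] / 100.0, 1.0)      # cap the activity signal
    return 0.7 * useful_ratio + 0.3 * activity

def final_score(text, user, w=0.6):
    """Linear combination of opinion strength and user trust."""
    return w * opinion_score(text) + (1 - w) * trust_score(user)

review = "The room was clean and the staff were great"
user = {"comments": 42, "useful_comments": 30}
print(round(final_score(review, user), 3))
```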
3042 Measuring Government’s Performance (Services) Oman Service Maturity Model (OSMM)
Authors: Khalid Al Siyabi, Angie Al Habib
Abstract:
To measure or assess any government's efficiency, we need to measure its performance with regard to the quality of the services it provides. Using a technological platform in service provision has become a trend and a public demand. It is also a public need to make sure these services are aligned with values and with the government's overall strategy, vision and goals. Providing services using technology tools and channels can enhance internal business processes and also help establish essential values in government services, such as transparency and excellence, since in order to establish e-services many standards and policies must be put in place to enable the handing over of decision making to a mature, system-oriented mechanism. There is no doubt that the Sultanate of Oman wants to enhance its services, move them towards automation, establish a smart government and link its services to life events. Measuring government efficiency is essential in achieving social security and economic growth, since it can provide a clear dashboard of all projects and improvements. Based on these data, we can improve the strategies and align the country's goals with them.
Keywords: Government, Maturity, Oman, Performance, Service.
3041 Data Mining Classification Methods Applied in Drug Design
Authors: Mária Stachová, Lukáš Sobíšek
Abstract:
Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential. They can help people to understand the patterns in a certain chunk of information, so it is obvious that data mining tools have a wide area of application. For example, in theoretical chemistry, data mining tools can be used to predict molecule properties or to improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of the contribution is to create a classification model that would be able to deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression and random forest models were built using the R software. The Bayesian logistic regression model in the Latent GOLD software was created as well. These classification methods belong to supervised learning methods. It was necessary to reduce the dimension of the data matrix before constructing the models, and thus factor analysis (FA) was used. Those models were applied to predict the biological activity of molecules, potential new drug candidates.
Keywords: Data mining, classification, drug design, QSAR.
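The study itself works in R and Latent GOLD; purely as an illustration of the same pipeline (factor-analysis reduction followed by supervised classifiers with cross-validation), here is a Python/scikit-learn sketch on synthetic data. The feature counts, the number of factors and the model settings are placeholders.

```python
# Illustrative FA + classification pipeline (the study itself used R / Latent GOLD).
from sklearn.datasets import make_classification
from sklearn.decomposition import FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for a molecular descriptor matrix (activity as the label).
X, y = make_classification(n_samples=500, n_features=60, n_informative=15, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, clf in models.items():
    pipe = make_pipeline(FactorAnalysis(n_components=10), clf)   # FA reduces dimension first
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```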
3040 The Visualizer for Real-Time Analysis of Internet Trends
Authors: Radek Malinský, Ivan Jelínek
Abstract:
The current web has become a modern encyclopedia, where people share their thoughts and ideas on various topics around them. This kind of encyclopedia is very useful for other people who are looking for answers to their questions. However, with the growing popularity of social networking and blogging and ever-expanding network services, there has also been a growing diversity of technologies along with differing structures of individual web sites. It is therefore difficult for a common Internet user to find a relevant answer directly. This paper presents a web application for the real-time end-to-end analysis of selected Internet trends, where a trend can be whatever people post online. The application integrates fully configurable tools for data collection and analysis using selected webometric algorithms, and for their chronological visualization to the user. It can be assumed that the application helps users evaluate the quality of various products that are mentioned online.
Keywords: Trend, visualizer, web analysis, web 2.0.
3039 Validation of Reverse Engineered Web Application Models
Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini
Abstract:
Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult because of the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutation techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
Keywords: Validation, dynamic analysis, mutation analysis, reverse engineering, Web applications.
3038 An Exact Algorithm for Location–Transportation Problems in Humanitarian Relief
Authors: Chansiri Singhtaun
Abstract:
This paper proposes a mathematical model and examines the performance of an exact algorithm for a location–transportation problem in humanitarian relief. The model determines the number and location of distribution centers in a relief network, the amount of relief supplies to be stocked at each distribution center, and the vehicles to take the supplies to meet the needs of disaster victims, under capacity restrictions and transportation and budgetary constraints. The computational experiments are conducted on generated problems of various sizes. A branch and bound algorithm is applied to these problems. The results show that this algorithm can solve problem sizes of up to three candidate locations with five demand points, and one candidate location with up to twenty demand points, without premature termination.
Keywords: Disaster response, facility location, humanitarian relief, transportation.
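For intuition about the decision structure (which centers to open and how much to ship), the sketch below solves a toy location–transportation instance by exhaustively enumerating subsets of open centers with a greedy shipment assignment. It is a simplification for illustration, not the branch and bound algorithm evaluated in the paper, and all data are hypothetical.

```python
from itertools import combinations

# Hypothetical instance: candidate distribution centers and demand points.
fixed_cost = [50, 60, 55]          # cost of opening each candidate center
capacity   = [100, 120, 90]        # relief supply that each center can stock
demand     = [40, 30, 50, 20, 35]  # demand at each affected point
ship_cost  = [                     # unit transportation cost center -> demand point
    [4, 6, 9, 5, 7],
    [6, 3, 4, 8, 5],
    [9, 7, 3, 4, 6],
]

def assign(open_centers):
    """Greedy transportation: serve each demand from the cheapest open center
    that still has capacity left (a stand-in for the exact transportation step)."""
    cap = {c: capacity[c] for c in open_centers}
    total = 0
    for j, d in enumerate(demand):
        for c in sorted(open_centers, key=lambda k: ship_cost[k][j]):
            take = min(d, cap[c])
            total += take * ship_cost[c][j]
            cap[c] -= take
            d -= take
            if d == 0:
                break
        if d > 0:
            return None                      # infeasible: not enough capacity
    return total

best = None
for r in range(1, len(fixed_cost) + 1):
    for subset in combinations(range(len(fixed_cost)), r):
        cost = assign(subset)
        if cost is not None:
            cost += sum(fixed_cost[c] for c in subset)
            if best is None or cost < best[0]:
                best = (cost, subset)
print(best)   # (minimum total cost, which centers to open)
```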
3037 A Procedure to Assess Streamflow Rating Curves and Streamflow Sequences
Authors: Elena Carcano, Mirzi Betasolo
Abstract:
This study aims to provide sub-hourly streamflow predictions and associated rating curves for small catchments with an intermittent and torrential flow regime characterized by flash floods occurring especially during April and November. The methodology entails two lumped conceptual hydrological models which work in series. The total model is based upon eleven parameters and shows good flexibility in handling different input sets. The runoff coefficient has contributed to improving the model's performance and has been treated as an additional parameter, while sensitivity analysis has highlighted how slight changes in the model's input can lead to changes in the model's output. The adopted procedure is stable and useful for providing very practical engineering information while making only a parsimonious request both in input data and in the number of adopted parameters. According to the obtained results, the authors encourage testing this combined procedure on different hydrological scenarios in order to provide information for poorly monitored catchments and sites with outdated records.
Keywords: Streamflow rating curve, chronological data, streamflow sequences, conceptual models.
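Since the study delivers rating curves alongside the simulated flows, the sketch below fits the common power-law form of a rating curve, Q = a(h − h0)^b, to stage–discharge pairs with scipy. The functional form and the sample data are assumptions for illustration, not the conceptual models or data used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(h, a, h0, b):
    """Power-law rating curve Q = a * (h - h0)**b (Q discharge, h stage)."""
    return a * np.clip(h - h0, 1e-6, None) ** b

# Hypothetical gauged stage (m) / discharge (m^3/s) pairs.
stage = np.array([0.4, 0.6, 0.9, 1.3, 1.8, 2.4])
flow  = np.array([0.8, 2.1, 5.6, 13.0, 27.0, 52.0])

params, _ = curve_fit(rating_curve, stage, flow, p0=(10.0, 0.2, 2.0))
a, h0, b = params
print(f"Q = {a:.2f} * (h - {h0:.2f})^{b:.2f}")
print("Q at h = 1.5 m:", round(rating_curve(1.5, *params), 1), "m^3/s")
```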
3036 Representing Uncertainty in Computer-Generated Forces
Authors: Ruibiao J. Guo, Brad Cain, Pierre Meunier
Abstract:
The Integrated Performance Modelling Environment (IPME) is a powerful simulation engine for task simulation and performance analysis. However, it has no high-level cognition, such as memory and reasoning, for complex simulations. This article introduces a knowledge representation and reasoning scheme that can accommodate uncertainty in simulations of military personnel with IPME. This approach demonstrates how advanced reasoning models that support similarity-based associative processes, rule-based abstract processes, multiple reasoning methods and real-time interaction can be integrated with conventional task network modelling to provide greater functionality and flexibility when modelling operator performance.
Keywords: Computer-generated forces, human behaviour representation, IPME, modelling and simulation, uncertainty reasoning.
3035 Accuracy of Displacement Estimation and Selection of Capacitors for a Four Degrees of Freedom Capacitive Force Sensor
Authors: Chisato Murakami, Makoto Takahashi
Abstract:
Force sensors have been used as a prerequisite for obtaining information on the magnitude and directions of forces on the skin surface. We have developed a four-degrees-of-freedom capacitive force sensor (approximately 20×20×5 mm³) that has a flexible structure and sixteen parallel plate capacitors. An iterative algorithm was developed for estimating the four displacements from the sixteen capacitances, using fourth-order polynomial approximations of the characteristics between capacitance and displacement. The estimates obtained from measured capacitances had large errors caused by deterioration of the characteristics. In this study, the effective capacitors carrying the major information were selected on the basis of the capacitance change range and the characteristic shape. The maximum errors at calibration and non-calibration points were 25% and 6.8%, respectively. Although the maximum error was larger than the desired value, the small averaged value indicated that only a few points had large errors. On the other hand, the error at the non-calibration points was within the desired value.
Keywords: Force sensors, capacitive sensors, estimation, iterative algorithms.
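As a generic illustration of the iterative estimation step, the sketch below recovers four displacements from sixteen capacitances with a Gauss–Newton style update and a numerical Jacobian. The polynomial forward model and its coefficients are stand-ins, since the sensor's actual characteristics are not given in the abstract.

```python
import numpy as np

# Stand-in forward model: capacitance of each of the 16 capacitors as a
# (here quadratic) polynomial of the 4 displacements; coefficients are hypothetical.
rng = np.random.default_rng(0)
A = rng.normal(size=(16, 4))          # linear sensitivities
B = 0.05 * rng.normal(size=(16, 4))   # weak higher-order terms

def capacitances(d):
    return A @ d + B @ (d ** 2)

def estimate_displacements(c_meas, iters=20):
    """Iterative (Gauss-Newton style) estimate of the 4 displacements
    from 16 measured capacitances, using a numerical Jacobian."""
    d = np.zeros(4)
    for _ in range(iters):
        r = c_meas - capacitances(d)              # residual
        J = np.empty((16, 4))
        for k in range(4):                        # finite-difference Jacobian
            step = np.zeros(4)
            step[k] = 1e-6
            J[:, k] = (capacitances(d + step) - capacitances(d)) / 1e-6
        d = d + np.linalg.pinv(J) @ r             # least-squares update
    return d

d_true = np.array([0.12, -0.05, 0.30, 0.08])
c_meas = capacitances(d_true)
print(np.round(estimate_displacements(c_meas), 4))  # should recover d_true closely
```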
3034 Developing of Fragility Curve for Two-Span Simply Supported Concrete Bridge in Near-Fault Area
Authors: S. Shirazian, M.R. Ghayamghamian, G.R. Nouri
Abstract:
Bridges are one of the main components of transportation networks. They should be functional before and after an earthquake for emergency services. Therefore, we need to assess the seismic performance of bridges under different seismic loadings. The fragility curve is one of the popular tools in seismic evaluation. Fragility curves are conditional probability statements which give the probability of a bridge reaching or exceeding a particular damage level for a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, the analytical fragility curve was developed from the results of dynamic analyses of the bridge subjected to different time histories in a near-fault area.
Keywords: Fragility curve, seismic behavior, time history analysis, transportation network.
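Analytical fragility curves are commonly expressed as a lognormal conditional probability of exceeding a damage state. As an illustration, the sketch below estimates the lognormal median and dispersion from a set of intensity measures at which a damage state was reached in time-history analyses and then evaluates the curve; the sample values are placeholders, not results from this study.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical intensity measures (e.g. PGA in g) at which the analyses
# indicated that a given damage state was reached.
im_at_damage = np.array([0.21, 0.27, 0.33, 0.38, 0.45, 0.52, 0.61])

# Lognormal fragility parameters: median theta and log-standard deviation beta.
theta = np.exp(np.mean(np.log(im_at_damage)))
beta = np.std(np.log(im_at_damage), ddof=1)

def fragility(im):
    """P(damage state reached or exceeded | intensity measure im)."""
    return norm.cdf(np.log(im / theta) / beta)

for im in (0.2, 0.3, 0.4, 0.6):
    print(f"IM = {im:.1f} g -> P(exceedance) = {fragility(im):.2f}")
```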
3033 A New Approach to Image Segmentation via Fuzzification of Rènyi Entropy of Generalized Distributions
Authors: Samy Sadek, Ayoub Al-Hamadi, Axel Panning, Bernd Michaelis, Usama Sayed
Abstract:
In this paper, we propose a novel approach for image segmentation via fuzzification of the Rènyi Entropy of Generalized Distributions (REGD). The fuzzy REGD is used to precisely measure the structural information of the image and to locate the optimal threshold desired by segmentation. The proposed approach draws upon the postulation that the optimal threshold concurs with the maximum information content of the distribution. The contributions of the paper are as follows: initially, the fuzzy REGD is introduced as a measure of the spatial structure of an image. Then, we propose an efficient entropic segmentation approach using the fuzzy REGD. Although the proposed approach belongs to the family of entropic segmentation approaches (which are commonly applied to grayscale images), it is adapted to be viable for segmenting color images. Lastly, diverse experiments on real images that show the superior performance of the proposed method are carried out.
Keywords: Entropy of generalized distributions, entropy fuzzification, entropic image segmentation.
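For reference, the sketch below implements plain (non-fuzzy) Rènyi-entropy thresholding on a grayscale histogram: for each candidate threshold it sums the Rènyi entropies of the background and object distributions and keeps the maximizer. The fuzzification and the generalized-distribution aspects of the paper are not reproduced, and the entropy order α and the synthetic test image are assumptions.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Renyi entropy H_alpha(p) = ln(sum p_i^alpha) / (1 - alpha)."""
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_threshold(image, alpha=2.0):
    """Pick the gray level maximizing H_alpha(background) + H_alpha(object)."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    hist /= hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        w0, w1 = hist[:t].sum(), hist[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        h = renyi_entropy(hist[:t] / w0, alpha) + renyi_entropy(hist[t:] / w1, alpha)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Synthetic bimodal test image: dark background plus a brighter object.
rng = np.random.default_rng(1)
img = np.clip(rng.normal(60, 10, (128, 128)), 0, 255).astype(np.uint8)
img[32:96, 32:96] = np.clip(rng.normal(180, 12, (64, 64)), 0, 255).astype(np.uint8)
t = renyi_threshold(img)
print("threshold:", t, "object fraction:", float((img >= t).mean()))
```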
3032 Determining the Best Method of Landslide Stabilization by Using DSS (Case Study: Landslide in Hasan Salaran, Kurdistan Province, Iran)
Authors: S. Kamyabi, M. Salari, H. Shahabi
Abstract:
Landslides are slope processes that occur every year in Iran and some other parts of the world and cause considerable human and financial losses. There are many methods for stabilizing landslides in soil and rock slides. Using the best method with the least cost and in the shortest time is important for researchers. In this research, determining the best stabilization method is investigated by using a Decision Support System (DSS). A DSS was built for this purpose and applied to the Hasan Salaran area in Kurdistan. Field study data on topography, slope, geology, landslide geometry and related features were used. The related data were entered into the decision-making management program ALES. Analysis of mass stability indicated a present potential for instability. The research results show that surface and subsurface drainage is the best stabilization method. The stability analysis shows that an acceptable increase in the safety factor is a consequence of drainage.
Keywords: Landslide, Decision Support systems, stability, Hasan Salaran landslide, Kurdistan province, Iran.
3031 Examination of Readiness of Teachers in the Use of Information-Communication Technologies in the Classroom
Authors: Nikolina Ribarić
Abstract:
This paper compares the readiness of chemistry teachers to use information and communication technologies in chemistry in 2018 and 2021. A survey conducted in 2018 on a sample of teachers showed that most teachers occasionally use visualization and digitization tools in chemistry teaching (65%), but feel that they are not educated enough to use them (56%). Also, most teachers do not have adequate equipment in their schools and are not able to use ICT in teaching or digital tools for visualization and digitization of content (44%). None of the teachers find the use of digitization and visualization tools useless. Furthermore, a survey conducted in 2021 shows that most teachers occasionally use visualization and digitization tools in chemistry teaching (83%). Also, the research shows that some teachers still do not have adequate equipment in their schools and are not able to use ICT in chemistry teaching or digital tools for visualization and digitization of content (14%). Advances in the use of ICT in chemistry teaching are linked to pandemic conditions and the obligation to conduct online teaching. The share of 14% of teachers who still do not have adequate equipment to use digital tools in teaching is worrying.
Keywords: Chemistry, digital content, e-learning, ICT, visualization.
3030 Online Topic Model for Broadcasting Contents Using Semantic Correlation Information
Authors: Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park, Sang-Jo Lee
Abstract:
This paper proposes a method of learning topics for broadcasting contents. There are two kinds of texts related to broadcasting contents. One is the broadcasting script, which is a series of texts including directions and dialogues. The other is blogposts, which possess relatively abstract content, stories, and diverse information about the broadcasting contents. Although the two kinds of text cover similar broadcasting contents, the words in blogposts and in the broadcasting script are different. When unseen words appear, a method is needed to reflect them in the existing topics. In this paper, we introduce a semantic vocabulary expansion method to reflect unseen words. We expand the topics of the broadcasting script by incorporating the words in blogposts. Each word in the blogposts is added to the most semantically correlated topics. We use word2vec to get the semantic correlation between words in blogposts and topics of scripts. The vocabularies of the topics are updated and then posterior inference is performed to rearrange the topics. In experiments, we verified that the proposed method can discover more salient topics for broadcasting contents.
Keywords: Broadcasting script analysis, topic expansion, semantic correlation analysis, word2vec.
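As a sketch of the expansion step, the code below assigns each unseen blogpost word to the topic whose top words it is most similar to, using cosine similarity over word vectors. The embeddings are random placeholders standing in for trained word2vec vectors and the topic lists are hypothetical, so the point is the mechanism rather than the particular assignment.

```python
import numpy as np

# Placeholder embeddings standing in for trained word2vec vectors.
rng = np.random.default_rng(0)
VOCAB = ["detective", "crime", "clue", "recipe", "chef", "kitchen", "suspect", "dessert"]
EMB = {w: rng.normal(size=50) for w in VOCAB}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Topics learned from the broadcasting script (hypothetical top words).
topics = {
    "investigation": ["detective", "crime", "clue"],
    "cooking": ["recipe", "chef", "kitchen"],
}

def expand_topics(topics, new_words):
    """Add each unseen blogpost word to the most semantically correlated topic."""
    for w in new_words:
        scores = {
            name: np.mean([cosine(EMB[w], EMB[t]) for t in words])
            for name, words in topics.items()
        }
        best = max(scores, key=scores.get)
        topics[best].append(w)
    return topics

print(expand_topics(topics, ["suspect", "dessert"]))
```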
3029 Dimensioning of Subsynchronous Cascade for Speed Regulation of Two-Motor 6 kV Conveyer Drives
Authors: M. Kasumović, A. Hodžić, M. Tešanović
Abstract:
One way to optimally load overdimensioned conveyers is to decrease their speed (capacity), with attention to production capabilities and demands. For conveyers driven by three-phase slip-ring induction motors, a technically reasonable solution for regulating the conveyer (driving motor) speed is a constant-torque subsynchronous cascade with a static semiconductor converter and a transformer for returning energy to the power network. The paper describes a mathematical model for the parameter calculation of a two-motor 6 kV subsynchronous cascade. It is also demonstrated that applying this cascade gives several benefits, foremost in electrical energy savings but also in improving other energy indexes, which finally results in a cost reduction for the complete electric motor drive.
Keywords: Conveyer with rubber belt, electrical motor drive, subsynchronous cascade.
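As a rough orientation to the dimensioning problem, the sketch below estimates the slip power that the cascade's converter and recovery transformer must handle for a given speed-reduction range, using the standard induction-machine relations (air-gap power split into mechanical power and slip power, losses neglected). The power ratings and slip range are illustrative, not the parameters of the 6 kV drive in the paper.

```python
# Illustrative slip-power estimate for a subsynchronous cascade, using the
# standard relations P_airgap = P_mech / (1 - s) and P_slip = s * P_airgap
# (rotor losses neglected). All numbers are placeholders.

def slip_power(p_mech_kw, slip):
    """Slip power (kW) fed back through the cascade at a given slip."""
    p_airgap = p_mech_kw / (1.0 - slip)
    return slip * p_airgap

rated_shaft_kw = 2 * 630.0            # two drive motors, hypothetical 630 kW each
max_slip = 0.25                       # speed reduced to 75% of synchronous speed

print("Converter/transformer rating >=",
      round(slip_power(rated_shaft_kw, max_slip), 1), "kW")
```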
3028 Wireless Sensor Networks: Delay Guarantee and Energy Efficient MAC Protocols
Authors: Marwan Ihsan Shukur, Lee Sheng Chyan, Vooi Voon Yap
Abstract:
Wireless sensor networks are an emerging technology that serves as an environment monitor in many applications. Yet these miniature devices suffer from constrained resources in terms of computation capability and energy. The limited energy resource in these nodes demands efficient consumption of that resource, either by improving the modules themselves or by providing efficient communication protocols. This paper presents a comprehensive summary and a comparative study of the MAC protocols proposed for wireless sensor networks, showing their capabilities and efficiency in terms of energy consumption and delay guarantee.
Keywords: MAC (Medium Access Control), SEA (Simple Energy Aware), WSNs (Wireless Sensor Nodes or Networks), RTS (Request To Send), CTS (Clear To Send), SYNCH (Synchronize), NS2 (Network Simulator 2).
3027 What Managers Think of Informal Networks and Knowledge Sharing by Means of Personal Networking?
Authors: Mahmood Q.K. Ghaznavi, Martin Perry, Paul Toulson, Keri Logan
Abstract:
The importance of nurturing, accumulating, and efficiently deploying knowledge resources through formal structures and organisational mechanisms is well understood. Recent trends in knowledge management (KM) highlight that the effective creation and transfer of knowledge can also rely upon extra-organisational channels, such as informal networks. The perception exists that the role of informal networks in knowledge creation and performance has been underestimated in the organisational context. The literature indicates that many managers fail to comprehend and successfully exploit the potential role of informal networks to create value for their organisations. This paper investigates: 1) whether managers share work-specific knowledge with informal contacts within and outside organisational boundaries; and 2) what they think is the importance of this knowledge collaboration for their learning and work outcomes.
Keywords: Informal network, knowledge management, knowledge sharing, performance.
3026 Security over OFDM Fading Channels with Friendly Jammer
Authors: Munnujahan Ara
Abstract:
In this paper, we investigate the effect of friendly jamming power allocation strategies on the achievable average secrecy rate over a bank of parallel fading wiretap channels. We investigate the achievable average secrecy rate in parallel fading wiretap channels subject to Rayleigh and Rician fading. The achievable average secrecy rate in the presence of a line-of-sight component in the jammer channel is also evaluated. Moreover, we study the detrimental effect of correlation across the parallel sub-channels and evaluate the corresponding decrease in the achievable average secrecy rate for the various fading configurations. We also investigate the tradeoff between the transmission power and the jamming power for a fixed total power budget. Our results, which are applicable to current orthogonal frequency division multiplexing (OFDM) communication systems, shed further light on the achievable average secrecy rates over a bank of parallel fading channels in the presence of friendly jammers.
Keywords: Fading parallel channels, Wire-tap channel, OFDM, Secrecy capacity, Power allocation.
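To make the studied quantity concrete, the sketch below estimates the average secrecy rate over parallel Rayleigh-fading subcarriers by Monte Carlo, with a friendly jammer that degrades only the eavesdropper: the per-subcarrier secrecy rate is [log2(1 + SNR_main) − log2(1 + SINR_eve)]^+. The equal power split, unit-mean channel gains and the assumption that the jammer does not affect the legitimate receiver are simplifications for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_secrecy_rate(n_sub=64, n_trials=5000, p_total=1.0, jam_fraction=0.3, noise=0.05):
    """Monte Carlo average secrecy rate (bits/s/Hz per subcarrier) over Rayleigh fading."""
    p_tx = (1 - jam_fraction) * p_total / n_sub        # equal power per subcarrier
    p_jam = jam_fraction * p_total / n_sub
    # Exponentially distributed channel power gains (Rayleigh fading).
    g_main = rng.exponential(1.0, (n_trials, n_sub))   # transmitter -> legitimate receiver
    g_eve = rng.exponential(1.0, (n_trials, n_sub))    # transmitter -> eavesdropper
    g_jam = rng.exponential(1.0, (n_trials, n_sub))    # jammer -> eavesdropper
    snr_main = p_tx * g_main / noise
    sinr_eve = p_tx * g_eve / (noise + p_jam * g_jam)  # jammer assumed not to hit the receiver
    rate = np.maximum(np.log2(1 + snr_main) - np.log2(1 + sinr_eve), 0.0)
    return rate.mean()

for frac in (0.0, 0.2, 0.4, 0.6):
    print(f"jamming fraction {frac:.1f}: {avg_secrecy_rate(jam_fraction=frac):.3f} bit/s/Hz")
```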
3025 Feature Extraction of Dorsal Hand Vein Pattern Using a Fast Modified PCA Algorithm Based On Cholesky Decomposition and Lanczos Technique
Authors: Maleika Heenaye-Mamode Khan, Naushad Mamode Khan, Raja K. Subramanian
Abstract:
The dorsal hand vein pattern is an emerging biometric which has been attracting the attention of researchers of late. Research is being carried out on existing techniques in the hope of improving them or finding more efficient ones. In this work, Principal Component Analysis (PCA), a successful method originally applied to face biometrics, is modified using Cholesky decomposition and the Lanczos algorithm to extract the dorsal hand vein features. This modified technique decreases the number of computations and hence decreases the processing time. The eigenveins were successfully computed and projected onto the vein space. The system was tested on a database of 200 images using a threshold value of 0.9 to obtain the False Acceptance Rate (FAR) and False Rejection Rate (FRR). This modified algorithm is desirable when developing a biometric security system since it significantly decreases the matching time.
Keywords: Dorsal hand vein pattern, PCA, Cholesky decomposition, Lanczos algorithm.
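The paper's exact Cholesky/Lanczos formulation is not reproduced here; as a general sketch of the eigenvein idea, the code below centers the training vein images, forms the small Gram matrix, extracts its leading eigenpairs with a Lanczos-based solver (scipy's eigsh), and projects the images onto the resulting vein space. The image size, training-set size and subspace dimension are placeholders.

```python
import numpy as np
from scipy.sparse.linalg import eigsh   # Lanczos-based symmetric eigensolver

rng = np.random.default_rng(0)
n_images, n_pixels, k = 40, 64 * 64, 8   # placeholder training set and subspace size

X = rng.random((n_images, n_pixels))     # rows = vectorized vein images (stand-in data)
mean = X.mean(axis=0)
A = X - mean                             # centered data

# Small Gram matrix trick: eigenvectors of A A^T give those of A^T A cheaply.
gram = A @ A.T                           # n_images x n_images symmetric matrix
vals, vecs = eigsh(gram, k=k, which="LM")  # k leading eigenpairs via Lanczos iterations

eigenveins = (A.T @ vecs) / np.sqrt(np.maximum(vals, 1e-12))  # back to pixel space, normalized
weights = A @ eigenveins                 # projection of each training image onto the vein space

print(eigenveins.shape, weights.shape)   # (4096, 8) feature basis, (40, 8) features
```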
3024 A Survey on Supply Chain Management and E-Commerce Technology Adoption among Logistics Service Providers in Johor
Authors: Mohd Iskandar bin Illyas Tan, Iziati Saadah bt Ibrahim
Abstract:
Logistics is part of the supply chain processes that plans, implements, and controls the efficient and effective forward and reverse flow and storage of goods, services, and related information between the point of origin and the point of consumption in order to meet customer requirements. This research aims to investigate the current status and future direction of the use of Information Technology (IT) for logistics, focusing on Supply Chain Management (SCM) and E-Commerce adoption in Johor. Therefore, this research focuses on the type of technology being adopted and the factors, benefits and barriers affecting innovation in SCM and E-Commerce technology adoption among Logistics Service Providers (LSP). A mailed questionnaire survey was conducted to collect data from 265 logistics companies in Johor. The research revealed that SCM technology adoption among LSP was higher, as they had adopted SCM technology in various business processes and perceived a high level of benefits from SCM adoption. In contrast, E-Commerce technology adoption among LSP is relatively low.
Keywords: E-Commerce, Johor, Logistics Service Providers, Supply Chain Management.
3023 Multi-Level Meta-Modeling for Enabling Dynamic Subtyping for Industrial Automation
Authors: Zoltan Theisz, Gergely Mezei
Abstract:
Modern industrial automation relies on service-oriented concepts of Internet of Things (IoT) device modeling in order to provide a flexible and extendable environment for a service meta-repository. However, state-of-the-art meta-modeling techniques prefer design-time modeling, which results in heavy usage of classes and sometimes unnecessary static subtyping. Although this approach benefits from clear-cut object-oriented design principles, it also seals the model repository against further dynamic extensions. In this paper, a dynamic multi-level modeling approach is introduced that enables dynamic subtyping through a more relaxed partial instantiation mechanism. The approach is demonstrated on a simple sensor network example.
Keywords: Meta-modeling, dynamic subtyping, DMLA, industrial automation, Arrowhead.
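As a small illustration of the general idea of partial instantiation (not the DMLA formalism itself), the sketch below lets an entity leave some slots unbound at instantiation time and bind or refine them later at a lower level, which is what makes dynamic subtyping possible without declaring new static subtypes. All names are hypothetical.

```python
# Minimal illustration of multi-level partial instantiation (not the DMLA formalism).

class Entity:
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, dict(slots)

    def instantiate(self, name, **bindings):
        """Create a lower-level entity; slots not bound here stay open (partial)."""
        return Entity(name, parent=self, **bindings)

    def lookup(self, slot):
        """Resolve a slot through the instantiation chain."""
        if slot in self.slots and self.slots[slot] is not None:
            return self.slots[slot]
        return self.parent.lookup(slot) if self.parent else None

    def bind(self, slot, value):
        """Dynamic refinement: bind (or override) a slot after creation."""
        self.slots[slot] = value

# Level 0: generic sensor concept with open slots.
Sensor = Entity("Sensor", unit=None, sampling_rate_hz=None)
# Level 1: partial instantiation -- unit fixed, sampling rate still open.
TempSensor = Sensor.instantiate("TemperatureSensor", unit="Celsius")
# Level 2: concrete device; the remaining slot is bound dynamically later.
device = TempSensor.instantiate("boiler_probe_01")
device.bind("sampling_rate_hz", 10)

print(device.lookup("unit"), device.lookup("sampling_rate_hz"))  # Celsius 10
```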
3022 Dynamic Reroute Modeling for Emergency Evacuation: Case Study of Brunswick City, Germany
Authors: Yun-Pang Flötteröd, Jakob Erdmann
Abstract:
Human behavior during evacuations is quite complex. One of the critical behaviors that affect the efficiency of an evacuation is route choice. Therefore, the respective simulation modeling work needs to function properly. In this paper, the current dynamic route modeling during evacuation in Simulation of Urban Mobility (SUMO), i.e. the rerouting functions, is examined with a real case study. The consistency between the simulation results and reality is checked as well. Four influencing factors, namely (1) the time to get information, (2) the probability of cancelling a trip, (3) the probability of using navigation equipment, and (4) the rerouting and information updating period, are considered to analyze possible traffic impacts during the evacuation and to examine the rerouting functions in SUMO. Furthermore, some behavioral characteristics of the case study are analyzed with the use of the corresponding detector data and applied in the simulation. The experiment results show that the dynamic route modeling in SUMO can deal with the proposed scenarios properly. Some issues and functional needs related to route choice are discussed and further improvements are suggested.
Keywords: Evacuation, microscopic traffic simulation, rerouting, SUMO.
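The core of periodic rerouting is recomputing a shortest path whenever the estimated edge travel times are updated. The sketch below shows that mechanism on a toy road graph in plain Python; it illustrates the principle only and is neither SUMO's rerouting device nor the case-study network.

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra on a dict graph: node -> {neighbor: travel_time}."""
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

# Toy network: edge weights are estimated travel times (minutes).
graph = {
    "A": {"B": 4, "C": 2},
    "B": {"D": 5},
    "C": {"B": 1, "D": 8},
    "D": {},
}
print("initial route:", shortest_path(graph, "A", "D"))

# Rerouting period elapses: congestion on C -> B is detected, travel times updated.
graph["C"]["B"] = 9
print("updated route:", shortest_path(graph, "A", "D"))
```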