Search results for: Computational techniques
2533 Recognizing an Individual, Their Topic of Conversation, and Cultural Background from 3D Body Movement
Authors: Gheida J. Shahrour, Martin J. Russell
Abstract:
The 3D body movement signals captured during human-human conversation include clues not only to the content of people’s communication but also to their culture and personality. This paper is concerned with the automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects and arranged them into groups according to their culture. We arranged each group into pairs, and each pair communicated about different topics. A state-of-the-art recognition system is applied to the problems of person, culture, and topic recognition. We borrowed modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining 77.78%, 55.47%, and 39.06% accuracy from the person, culture, and topic recognition systems respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition respectively. Although direct comparison among these three recognition systems is difficult, our person recognition system performs best for both GMM and GMM-SVM, suggesting that intersubject differences (i.e. subjects' personality traits) are a major source of variation. When removing these traits from the culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and the Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from the culture and topic recognition systems respectively.
Keywords: Person Recognition, Topic Recognition, Culture Recognition, 3D Body Movement Signals, Variability Compensation.
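As an illustration of the GMM classification scheme the abstract describes, the sketch below fits one Gaussian mixture per class and assigns a test sequence to the class with the highest average log-likelihood. It is a minimal sketch using scikit-learn, with synthetic feature vectors standing in for real 3D body-movement features; the dimensions and mixture sizes are assumptions, not the paper's settings.

```python
# Minimal GMM-based classification sketch: one GaussianMixture per class,
# test frames assigned to the class with the highest mean log-likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic training data: 3 "subjects", each with 200 frames of 6-D features.
train = {s: rng.normal(loc=s, scale=1.0, size=(200, 6)) for s in range(3)}

# Fit one GMM per subject.
models = {s: GaussianMixture(n_components=4, random_state=0).fit(x)
          for s, x in train.items()}

def classify(frames):
    """Return the subject whose GMM best explains the frame sequence."""
    scores = {s: m.score(frames) for s, m in models.items()}  # mean log-likelihood
    return max(scores, key=scores.get)

test = rng.normal(loc=1, scale=1.0, size=(50, 6))  # frames from "subject 1"
print(classify(test))  # expected: 1
```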
2532 Transforming Construction: Integrating Off-Site Techniques and Advanced Technologies
Authors: Layla Mujahed, Gang Feng, Jianghua Wang
Abstract:
An increasing number of construction projects are adopting off-site construction techniques over traditional methods to address longstanding challenges. This research paper explores the integration of design for manufacture and assembly (DfMA), modern methods of construction (MMC), and building information modeling (BIM) within the construction industry. This study employs a mixed-methods approach, using case studies and a review of the existing literature, to examine the role and combined application of each methodology in building projects of varying scales and durations. The study focuses on application mechanisms, stakeholder engagement, knowledge sharing, feedback, and performance metrics to explore the benefits, challenges, and transformative potential of integrating these methodologies. The findings indicate that the synergy among DfMA, MMC, and BIM significantly improves project efficiency, cost reduction, and overall quality. Standardization, increased collaboration among stakeholders, and the adoption of advanced technologies are also highlighted as necessary considerations to fully realize the benefits of this integration. The paper concludes with practical recommendations for industry practitioners seeking to efficiently implement these integrated approaches.
Keywords: BIM, building information modeling, case study, DfMA, design for manufacture and assembly, MMC, modern methods of construction, prefabrication.
2531 Extracting Therapeutic Grade Essential Oils from the Lamiaceae Plant Family in the United Arab Emirates (UAE): Highlights on Great Possibilities and Severe Difficulties
Authors: Suzan M. Shahin, Mohammed A. Salem
Abstract:
Essential oils are expensive phytochemicals produced and extracted from specific species belonging to particular families in the plant kingdom. In the United Arab Emirates (UAE), which is located in the arid region of the world, nine species from the Lamiaceae family have the capability to produce therapeutic grade essential oils. These species include: Mentha spicata, Ocimum forskolei, Salvia macrosiphon, Salvia aegyptiaca, Salvia macilenta, Salvia spinosa, Teucrium polium, Teucrium stocksianum and Zataria multiflora. Although such potential species are indigenous to the UAE, there are almost no studies available that investigate the chemical composition and the quality of the extracted essential oils under the UAE climatological conditions. Therefore, great attention has to be given to such valuable natural resources through highly supported research projects, tailored to the UAE conditions, that investigate different extraction techniques, including the application of the latest available technologies, such as supercritical fluid CO2. This is crucially needed in order to realize the greatest possibilities in the medicinal field, specifically in the discovery of new therapeutic chemotypes, as well as to achieve the sustainability of this natural resource in the country.
Keywords: Essential oils, extraction techniques, Lamiaceae, traditional medicine, United Arab Emirates (UAE).
2530 Technique for Online Condition Monitoring of Surge Arrestors
Authors: Anil S. Khopkar, Kartik S. Pandya
Abstract:
Lightning overvoltage phenomena in power systems cannot be avoided; however, they can be controlled to a certain extent. To prevent system failure, power system equipment must be protected against overvoltage. Metal Oxide Surge Arrestors (MOSA) are connected in the system to provide protection against overvoltages. Under normal working conditions, MOSA function as insulators, offering a conductive path only during overvoltage events. A MOSA consists of zinc oxide elements (ZnO blocks), which have non-linear V-I characteristics. The ZnO blocks are connected in series and fitted in a ceramic or polymer housing. Over time, these components degrade due to continuous operation. The degradation of the zinc oxide elements increases the leakage current flowing through the surge arrester. This increased leakage current results in elevated temperatures within the surge arrester, further decreasing the resistance of the zinc oxide elements. Consequently, the leakage current increases, leading to higher temperatures within the MOSA. This cycle creates thermal runaway conditions for the MOSA. Once a surge arrester reaches the thermal runaway condition, it cannot return to normal working conditions. This condition is a primary cause of premature failure of surge arrestors. Given that MOSA constitute a core protective device for electrical power systems against transients, they contribute significantly to the reliable operation of power system networks. Therefore, periodic condition monitoring of surge arrestors is essential. Both online and offline condition monitoring techniques are available for surge arrestors. Offline condition monitoring techniques are not as popular because they require the removal of surge arrestors from the system, which requires a system shutdown. Therefore, online condition monitoring techniques are more commonly used. This paper presents an evaluation technique for the surge arrester condition based on leakage current analysis. The maximum amplitudes of the total leakage current (IT), the fundamental resistive leakage current (IR), and the third harmonic resistive leakage current (I3rd) are analyzed as indicators for surge arrester condition monitoring.
Keywords: Metal Oxide Surge Arrester, MOSA, overvoltage, total leakage current, resistive leakage current, third harmonic resistive leakage current, capacitive leakage current.
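The indicators named in the abstract can be illustrated with a short signal-processing sketch: given a sampled leakage current waveform, the total amplitude IT and its third harmonic component are read off an FFT. This is a minimal sketch on a synthetic waveform; extracting the true resistive components IR and I3rd from the capacitive-dominated total current requires the voltage reference, which is omitted here, so the harmonic amplitude below is only a proxy for I3rd.

```python
# Extract IT (max amplitude) and the third harmonic of a sampled leakage
# current via an FFT. The waveform is synthetic.
import numpy as np

fs, f0 = 10_000, 50                      # sampling rate (Hz), power frequency (Hz)
t = np.arange(0, 0.2, 1 / fs)            # 10 cycles of the 50 Hz fundamental
# Synthetic leakage current: dominant capacitive fundamental plus a small
# third harmonic that, in a real MOSA, grows as the ZnO blocks degrade.
i_t = 1.0e-3 * np.sin(2 * np.pi * f0 * t) + 50e-6 * np.sin(2 * np.pi * 3 * f0 * t)

IT_max = np.max(np.abs(i_t))             # total leakage current amplitude
spec = np.fft.rfft(i_t) / len(i_t) * 2   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(i_t), 1 / fs)
I3rd = np.abs(spec[np.argmin(np.abs(freqs - 3 * f0))])  # 150 Hz component

print(f"IT = {IT_max * 1e3:.3f} mA, third harmonic = {I3rd * 1e6:.1f} uA")
```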
2529 ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region
Authors: M. Cuevas-González, O. Monserrat, A. Barra, C. Reyes-Carmona, R. M. Mateos, J. P. Galve, R. Sarro, M. Cantalejo, E. Peña, M. Martínez-Corbella, J. A. Luque, J. M. Azañón, A. Millares, M. Béjar, J. A. Navarro, L. Solari
Abstract:
Geohazard prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions and prevent disasters. Satellite interferometry (InSAR) has become a trustworthy technique for ground movement detection and monitoring in the last few years. InSAR based techniques allow large areas to be processed, providing a high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy for non-experienced users to interpret, hampering their use by decision makers. This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada Coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use and interpretation of InSAR based results. It provides a semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will offer reliable and systematic information on natural and anthropogenic ground motion phenomena across Europe.
Keywords: Ground displacements, InSAR, natural hazards, satellite imagery.
2528 Computing Visibility Subsets in an Orthogonal Polyhedron
Authors: Jefri Marzal, Hong Xie, Chun Che Fung
Abstract:
Visibility problems are central to many computational geometry applications. One of the typical visibility problems is computing the view from a given point. In this paper, a linear time procedure is proposed to compute the visibility subsets from a corner of a rectangular prism in an orthogonal polyhedron. The proposed algorithm could be useful to solve classic 3D problems.
Keywords: Visibility, rectangular prism, orthogonal polyhedron.
2527 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference
Authors: Hussein Alahmer, Amr Ahmed
Abstract:
Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing the incidence of death. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
Keywords: CAD system, feature difference, fuzzy c-means, liver segmentation.
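The contrasting feature-difference idea lends itself to a compact sketch: the same statistics are computed for the lesion and its surrounding tissue, and their difference becomes the descriptor. The features and classifier below are illustrative assumptions (simple intensity statistics and a random forest), not the paper's exact feature set.

```python
# Feature-difference descriptor: statistics of the lesion region minus the
# same statistics of the surrounding normal tissue. Regions are synthetic
# arrays; real use would take segmented CT/MRI patches.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def region_features(pixels):
    # Simple intensity statistics as stand-ins for the paper's features.
    return np.array([pixels.mean(), pixels.std(),
                     np.percentile(pixels, 90) - np.percentile(pixels, 10)])

def lesion_descriptor(lesion, surround):
    # Differencing cancels patient/scanner-specific intensity offsets.
    return region_features(lesion) - region_features(surround)

rng = np.random.default_rng(1)
# Synthetic "benign" (mean 120) and "malignant" (mean 160) lesions on a
# common background (mean 100).
X = [lesion_descriptor(rng.normal(m, s, 400), rng.normal(100, 10, 400))
     for m, s in [(120, 20)] * 30 + [(160, 35)] * 30]
y = [0] * 30 + [1] * 30          # 0 = benign, 1 = malignant (synthetic labels)

clf = RandomForestClassifier(random_state=0).fit(X, y)
probe = lesion_descriptor(rng.normal(155, 30, 400), rng.normal(100, 10, 400))
print(clf.predict([probe]))      # likely [1]
```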
2526 A Bibliometric Assessment on Sustainability and Clustering
Authors: Fernanda M. Assef, Maria Teresinha A. Steiner, David Gabriel F. de Barros
Abstract:
Review studies are useful for the analysis of research problems. Among the types of review documents, we commonly find bibliometric studies. This type of application often helps the global visualization of a research problem and helps academics worldwide to better understand the context of a research area. In this document, a bibliometric view of clustering techniques and sustainability problems is presented. The authors aimed to identify which issues most often use clustering techniques and which sustainability issue is the most impactful in current research. During the bibliometric analysis, we found 10 different groups of research in clustering applications for sustainability issues: Energy; Environmental; Non-urban Planning; Sustainable Development; Sustainable Supply Chain; Transport; Urban Planning; Water; Waste Disposal; and Others. Moreover, by analyzing the citations of each group, it was discovered that the Environmental group can be classified as the most impactful research cluster in the area mentioned. After a content analysis of each paper classified in the Environmental group, it was found that the k-means technique is preferred for solving sustainability problems with clustering methods, since it appeared most often among the documents. The authors conclude that a bibliometric assessment can help indicate a research gap in waste disposal – which was the group with the least amount of publications – and the most impactful research on environmental problems.
Keywords: Bibliometric assessment, clustering, sustainability, territorial partitioning.
2525 Defining a Semantic Web-based Framework for Enabling Automatic Reasoning on CIM-based Management Platforms
Authors: Fernando Alonso, Rafael Fernandez, Sonia Frutos, Javier Soriano
Abstract:
CIM is the standard formalism for modeling management information developed by the Distributed Management Task Force (DMTF) in the context of its WBEM proposal, designed to provide a conceptual view of the managed environment. In this paper, we propose the inclusion of formal knowledge representation techniques, based on Description Logics (DLs) and the Web Ontology Language (OWL), in CIM-based conceptual modeling, and then we examine the benefits of such a decision. The proposal is specified as a CIM metamodel level mapping to a highly expressive subset of DLs capable of capturing all the semantics of the models. The paper shows how the proposed mapping provides CIM diagrams with precise semantics and can be used for automatic reasoning about the management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic reasoning systems that support the proposed logic and use algorithms that are sound and complete with respect to the semantics. Such a CASE tool framework has been developed by the authors and its architecture is also introduced. The proposed formalization is not only useful at design time, but also at run time through the use of rational autonomous agents, in response to a need recently recognized by the DMTF.
Keywords: CIM, Knowledge-based Information Models, Ontology Languages, OWL, Description Logics, Integrated Network Management, Intelligent Agents, Automatic Reasoning Techniques.
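A minimal sketch of what such a metamodel mapping can look like in practice, using rdflib: a CIM-style class hierarchy and an association are expressed as OWL axioms that a DL reasoner could then process. The namespace and class names are illustrative, not the paper's actual mapping rules.

```python
# Express a CIM-style class hierarchy and association as OWL axioms.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

CIM = Namespace("http://example.org/cim#")   # illustrative namespace
g = Graph()
g.bind("cim", CIM)
g.bind("owl", OWL)

# CIM_ManagedElement and a subclass, rendered as OWL classes.
g.add((CIM.ManagedElement, RDF.type, OWL.Class))
g.add((CIM.System, RDF.type, OWL.Class))
g.add((CIM.LogicalDevice, RDF.type, OWL.Class))
g.add((CIM.LogicalDevice, RDFS.subClassOf, CIM.ManagedElement))

# A CIM association rendered as an OWL object property with domain/range.
g.add((CIM.systemDevice, RDF.type, OWL.ObjectProperty))
g.add((CIM.systemDevice, RDFS.domain, CIM.System))
g.add((CIM.systemDevice, RDFS.range, CIM.LogicalDevice))

print(g.serialize(format="turtle"))
```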
2524 GC and GCxGC-MS Composition of Volatile Compounds from Carum carvi by Using Techniques Assisted by Microwaves
Authors: F. Benkaci-Ali, R. Mékaoui, G. Scholl, G. Eppe
Abstract:
The new method of accelerated steam distillation assisted by microwave (ASDAM) is a combination of microwave heating and steam distillation, performed at atmospheric pressure with a very short extraction time. Isolation and concentration of volatile compounds are performed in a single stage. ASDAM has been compared with ASDAM with cryogrinding of seeds (CG) and with conventional techniques, hydro-distillation assisted by microwave (HDAM) and hydro-distillation (HD), for the extraction of essential oil from aromatic herbs such as caraway and cumin seeds. The essential oils extracted by ASDAM for 1 min were quantitatively (yield) and qualitatively (aromatic profile) not similar to those obtained by ASDAM-CG (1 min) and HD (for 3 h). The accelerated microwave extraction with cryogrinding inhibits numerous enzymatic reactions, such as hydrolysis of oils. Microwave radiation constitutes an adequate means for extraction operations from the point of view of yield and high content of major components, and allows energy consumption to be minimized considerably, but especially heating time, which is one of the essential parameters of artifact formation. ASDAM and ASDAM-CG are green techniques that yield an essential oil with higher amounts of the more valuable oxygenated compounds, comparable to the biosynthesis compounds, and allow substantial cost savings in terms of time, energy and plant material.
Keywords: Microwave, steam distillation, caraway, cumin, cryogrinding, GC-MS, GCxGC-MS.
2523 3D Rendering of American Sign Language Finger-Spelling: A Comparative Study of Two Animation Techniques
Authors: Nicoletta Adamo-Villani
Abstract:
In this paper we report a study aimed at determining the most effective animation technique for representing ASL (American Sign Language) finger-spelling. Specifically, in the study we compare two commonly used 3D computer animation methods (keyframe animation and motion capture) in order to ascertain which technique produces the most 'accurate', 'readable', and 'close to actual signing' (i.e. realistic) rendering of ASL finger-spelling. To accomplish this goal we developed 20 animated clips of finger-spelled words and designed an experiment consisting of a web survey with rating questions. 71 subjects aged 19-45 participated in the study. Results showed that recognition of the words was correlated with the method used to animate the signs. In particular, the keyframe technique produced the most accurate representation of the signs (i.e., participants were more likely to identify the words correctly in keyframed sequences than in motion captured ones). Further, findings showed that the animation method had an effect on the reported scores for readability and closeness to actual signing; the estimated marginal mean readability and closeness were greater for keyframed signs than for motion captured signs. To our knowledge, this is the first study aimed at measuring and comparing the accuracy, readability and realism of ASL animations produced with different techniques.
Keywords: 3D Animation, American Sign Language, Deaf Education, Motion Capture.
2522 Performance Comparison and Analysis of Different Schemes and Limiters
Authors: Wang Wen-long, Li Hua, Pan Sha
Abstract:
Eight difference schemes and five limiters are applied to the numerical computation of a Riemann problem. The resolution of discontinuities produced by each scheme is compared. Numerical dissipation and its estimation are discussed. The results show that the numerical dissipation of each scheme is vital to improving its accuracy and stability. The MUSCL methodology is an effective approach to increasing computational efficiency and resolution. Limiters should be selected appropriately by balancing compressive and diffusive performance.
Keywords: Scheme, limiter, numerical simulation, Riemann problem.
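One building block of such comparisons can be sketched concretely: a minmod-limited MUSCL reconstruction, which trades compressive against diffusive behavior as the abstract describes. This is a minimal sketch of the reconstruction step only, under assumed uniform grids, not any one of the eight schemes compared in the paper.

```python
# Minmod-limited MUSCL reconstruction of interface states from cell averages.
import numpy as np

def minmod(a, b):
    """Classic minmod limiter: zero at extrema, smallest slope otherwise."""
    return np.where(a * b > 0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def muscl_interface_states(u):
    """Second-order reconstruction at interior cells of a uniform grid."""
    du_left = u[1:-1] - u[:-2]          # backward differences
    du_right = u[2:] - u[1:-1]          # forward differences
    slope = minmod(du_left, du_right)   # limited slope per interior cell
    u_right_face = u[1:-1] + 0.5 * slope
    u_left_face = u[1:-1] - 0.5 * slope
    return u_left_face, u_right_face

# A step (Riemann-type) initial condition stays free of spurious oscillations:
# the limiter returns zero slope at the discontinuity.
u = np.where(np.linspace(0.0, 1.0, 20) < 0.5, 1.0, 0.0)
uL, uR = muscl_interface_states(u)
print(uR.round(2))
```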
2521 Methodology: A Review in Modelling and Predictability of Embankment in Soft Ground
Authors: Bhim Kumar Dahal
Abstract:
Transportation network development in developing countries is proceeding at a rapid pace. The majority of such networks consist of railways and expressways, which pass through diverse topography, landforms and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper focuses on the various advanced ground improvement techniques used to improve soft soil, the modelling approaches and their predictability for embankment construction. The ground improvement techniques can be broadly classified into three groups, i.e. the densification group, the drainage and consolidation group, and the reinforcement group, which are discussed with some case studies. Various methods have been used in the modelling of embankments, from simple 1-dimensional to complex 3-dimensional models, using a variety of constitutive models. However, the reliability of the predictions is not found to improve systematically with the level of sophistication, and sometimes the predictions deviate by more than 60% from the monitored values despite using the same level of sophistication. This deviation is mainly due to the selection of the constitutive model, assumptions made during different stages, deviations in the selection of model parameters and simplifications during the physical modelling of the ground condition. This deviation can be reduced by using optimization processes, optimization tools and sensitivity analysis of the model parameters, which guide the selection of appropriate model parameters.
Keywords: Embankment, ground improvement, modelling, model prediction.
2520 Design of an Ensemble Learning Behavior Anomaly Detection Framework
Authors: Abdoulaye Diop, Nahid Emad, Thierry Winter, Mohamed Hilia
Abstract:
Data asset protection is a crucial issue in the cybersecurity field. Companies use logical access control tools to vault their information assets and protect them against external threats, but they lack solutions to counter insider threats. Nowadays, insider threats are the most significant concern of security analysts. They are mainly individuals with legitimate access to companies' information systems who use their rights with malicious intent. In several fields, behavior anomaly detection is the method used by cyber specialists to counter the threats of malicious user activities effectively. In this paper, we present a step toward the construction of a user and entity behavior analysis framework by proposing a behavior anomaly detection model. This model combines machine learning classification techniques and graph-based methods, relying on linear algebra and parallel computing techniques. We show the utility of an ensemble learning approach in this context. We present test results for some detection methods on a representative access control dataset. The use of some of the explored classifiers gives results of up to 99% accuracy.
Keywords: Cybersecurity, data protection, access control, insider threat, user behavior analysis, ensemble learning, high performance computing.
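The ensemble learning component can be sketched with scikit-learn's majority-vote combinator. This is a minimal sketch on synthetic features standing in for the access control dataset; the member classifiers are assumptions, not the paper's exact configuration.

```python
# Majority-vote ensemble over three heterogeneous classifiers.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                    # e.g. per-user activity features
y = (X[:, 0] + 0.5 * X[:, 3] > 1.2).astype(int)   # 1 = anomalous (synthetic rule)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
    ],
    voting="hard",  # majority vote over member predictions
)
ensemble.fit(Xtr, ytr)
print(f"accuracy: {ensemble.score(Xte, yte):.2f}")
```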
2519 Development of Material Analyzing Software Using X-Ray Diffraction
Authors: Le Chi Cuong
Abstract:
X-ray diffraction is an effective means of analyzing material properties. This paper describes newly developed computational software for determining the properties of crystalline materials such as elastic constants, residual stresses, surface hardness, phase components, etc. The results computed by the X-ray diffraction method were compared to those from traditional methods, and they agree within the 95% confidence limits, showing that the newly developed software has high reproducibility and opening up the possibility of its commercialization.
Keywords: X-ray diffraction, Nondestructive evaluation, Hardness, Residual stress, Phase determination.
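A worked example of the kind of relation such software is built on is Bragg's law, nλ = 2d sin θ, which recovers the interplanar spacing d from a measured peak position. The values below are illustrative, not taken from the paper.

```python
# Bragg's law: recover interplanar spacing d from a diffraction peak.
import math

wavelength = 1.5406   # Cu K-alpha wavelength in angstroms (common lab source)
two_theta = 44.5      # measured peak position in degrees (illustrative)
theta = math.radians(two_theta / 2)

d = wavelength / (2 * math.sin(theta))   # first-order (n = 1) spacing
print(f"d = {d:.4f} angstrom")
```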
2518 Detecting Earnings Management via Statistical and Neural Network Techniques
Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie
Abstract:
Predicting earnings management is vital for capital market participants, financial analysts and managers. The aim of this research is to answer the following question: Is there a significant difference between the regression model and neural network models in predicting earnings management, and which one leads to a superior prediction? In approaching this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 listed companies in the Tehran Stock Exchange (TSE) market from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. In general, the summary of statistical results showed that the precision of the GRNN did not exhibit a significant difference in comparison with the MLP. In addition, the mean square errors of the MLP and GRNN showed a significant difference from the multivariable LR model. These findings support the notion of nonlinear behavior of earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management based upon neural network techniques, and not to adopt linear regression models.
Keywords: Earnings management, generalized regression neural networks, linear regression, multi-layer perceptron, Tehran stock exchange.
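The model comparison can be sketched as follows: a linear regression and an MLP fitted to the same data and compared by mean squared error. This is a minimal sketch on synthetic data; the GRNN is omitted because scikit-learn has no direct implementation, and the variables are stand-ins for the paper's earnings-management measures.

```python
# Compare LR vs MLP by out-of-sample mean squared error.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                     # synthetic firm-level predictors
# Mildly nonlinear target: an MLP should beat a purely linear fit here.
y = X[:, 0] + np.tanh(2 * X[:, 1]) + 0.1 * rng.normal(size=500)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

lr = LinearRegression().fit(Xtr, ytr)
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(Xtr, ytr)

for name, model in [("LR", lr), ("MLP", mlp)]:
    print(name, round(mean_squared_error(yte, model.predict(Xte)), 4))
```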
2517 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences
Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng
Abstract:
Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, the motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved a classification accuracy of 80% for interaction events and 95% for non-interaction events. In summary, we have explored a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated in an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatch, fighting, etc.).
Keywords: Motion detection, motion tracking, trajectory analysis, video surveillance.
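The occlusion-bridging step can be made concrete with a constant-velocity Kalman filter that keeps predicting a person's 2D position when detections drop out. This is a minimal sketch; the state model and noise levels are assumptions, not the paper's tuning.

```python
# Constant-velocity Kalman filter that coasts through missing detections.
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1]], float)   # state: [x, y, vx, vy]
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # we observe position only
Q, R = np.eye(4) * 1e-2, np.eye(2) * 1.0            # process/measurement noise

x, P = np.zeros(4), np.eye(4) * 10.0                # initial state and covariance

def kalman_step(x, P, z=None):
    x, P = F @ x, F @ P @ F.T + Q                   # predict
    if z is not None:                               # update only when detected
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
    return x, P

# Detections with a 3-frame occlusion gap (None); the filter bridges it.
track = [np.array([i, 0.5 * i]) if i not in (4, 5, 6) else None
         for i in range(10)]
for z in track:
    x, P = kalman_step(x, P, z)
    print(x[:2].round(2))   # estimated position, including occluded frames
```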
2516 Customer Segmentation Model in E-commerce Using Clustering Techniques and LRFM Model: The Case of Online Stores in Morocco
Authors: Rachid Ait daoud, Abdellah Amine, Belaid Bouikhalene, Rachid Lbibb
Abstract:
Given the increase in the number of e-commerce sites, the number of competitors has become very large. This means that companies have to take appropriate decisions in order to meet the expectations of their customers and satisfy their needs. In this paper, we present a case study of applying the LRFM (length, recency, frequency and monetary) model and clustering techniques in the electronic commerce sector, with a view to evaluating the customer value of Moroccan e-commerce websites and then developing effective marketing strategies. To achieve these objectives, we adopt the LRFM model by applying a two-stage clustering method. In the first stage, the self-organizing maps method is used to determine the best number of clusters and the initial centroids. In the second stage, the k-means method is applied to segment 730 customers into nine clusters according to their L, R, F and M values. The results show that cluster 6 is the most important cluster because the average values of L, R, F and M are higher than the overall average value. In addition, this study considered another variable, describing the mode of payment used by customers, to improve and strengthen the cluster analysis. The cluster analysis demonstrates that the payment method is one of the key indicators of a new index which allows the level of customers' confidence in the company's website to be assessed.
Keywords: Customer value, LRFM model, Cluster analysis, Self-Organizing Maps method (SOM), K-means algorithm, loyalty.
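The second clustering stage can be sketched directly: standardized L, R, F and M values partitioned by k-means into nine clusters. This is a minimal sketch on synthetic customers; the SOM stage that selects the number of clusters and initial centroids is omitted, and k = 9 and n = 730 are taken from the abstract.

```python
# k-means segmentation of synthetic LRFM customer data into nine clusters.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns: L, R, F, M for 730 synthetic customers.
lrfm = np.column_stack([
    rng.integers(1, 1500, 730),        # L: days since first purchase
    rng.integers(0, 365, 730),         # R: days since last purchase
    rng.poisson(5, 730),               # F: number of orders
    rng.gamma(2.0, 150.0, 730),        # M: total spend
]).astype(float)

X = StandardScaler().fit_transform(lrfm)       # put L, R, F, M on one scale
labels = KMeans(n_clusters=9, n_init=10, random_state=0).fit_predict(X)

# Per-cluster mean L, R, F, M; clusters above the overall mean on all four
# are candidates for the "most important" segment described in the paper.
for k in range(9):
    print(k, lrfm[labels == k].mean(axis=0).round(1))
```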
2515 In silico Simulations for DNA Shuffling Experiments
Authors: Luciana Montera
Abstract:
DNA shuffling is a powerful method used to evolve molecules with specific functions in vitro and has applications in areas such as pharmaceutical, medical and agricultural research. The success of such experiments depends on a variety of parameters and conditions that sometimes cannot be properly pre-established. Here, two computational models predicting DNA shuffling results are presented, and their use and results are evaluated against an empirical experiment. The in silico and in vitro results show agreement, indicating the value of these two models and motivating the study and development of new models.
Keywords: Computer simulation, DNA shuffling, in silico and in vitro comparison.
2514 Computational Assistance of the Research, Using Dynamic Vector Logistics of Processes for Critical Infrastructure Subjects Continuity
Authors: J. Urbánek Jiří, Krahulec Josef, Johanidesová Jitka, F. Urbánek Jiří
Abstract:
This paper deals with the use of the prevailing MS Office environment (SmartArt, etc.) for mathematical models based on the DYVELOP (Dynamic Vector Logistics of Processes) method. It serves for the investigation and modelling of crisis situations within critical infrastructure organizations. The first part of the paper introduces the entities, operators, and actors of the DYVELOP method. It uses just three operators of Boolean algebra and four types of entities: the Environments, the Process Systems, the Cases, and the Controlling. The Process Systems (PrS) have five “brothers”: Management PrS, Transformation PrS, Logistic PrS, Event PrS and Operation PrS. The Cases have three “sisters”: Process Cell Case, Use Case, and Activity Case. All of them need special Ctrl actors to control their functions, except ENV, which can do without Ctrl. The model's maps are named Blazons, and they can express mathematically and graphically the relationships among entities, actors and processes. In the second part of this paper, the rich blazons of the DYVELOP method are used for the discovery and modelling of cycling cases and their phases. The blazons need a live PowerPoint presentation for better comprehension of this paper's mission. The crisis management of an energy critical infrastructure organization is obliged to use these cycles for successful coping with crisis situations. Cycling through these cases several times is a necessary condition for encompassing both emergency events and the mitigation of the organization's damages. An uninterrupted and continuous cycling process brings fruitfulness to crisis management, and it is a good indicator and controlling actor of organizational continuity and its advanced sustainable development possibilities. Reliable rules are derived for the safe and reliable continuity of an energy critical infrastructure organization in crisis situations.
Keywords: Blazons, computational assistance, DYVELOP method, critical infrastructure.
2513 The Relevance of Data Warehousing and Data Mining in the Field of Evidence-based Medicine to Support Healthcare Decision Making
Authors: Nevena Stolba, A Min Tjoa
Abstract:
Evidence-based medicine is a new direction in modern healthcare. Its task is to prevent, diagnose and treat diseases using medical evidence. Medical data about a large patient population are analyzed to perform healthcare management and medical research. In order to obtain the best evidence for a given disease, external clinical expertise as well as internal clinical experience must be available to healthcare practitioners at the right time and in the right manner. External evidence-based knowledge cannot be applied directly to the patient without adjusting it to the patient's health condition. We propose a data warehouse based approach as a suitable solution for the integration of external evidence-based data sources into the existing clinical information system, and data mining techniques for finding an appropriate therapy for a given patient and a given disease. Through the integration of data warehousing, OLAP and data mining techniques in the healthcare area, an easy-to-use decision support platform, which supports the decision-making process of caregivers and clinical managers, is built. We present three case studies, which show that a clinical data warehouse that facilitates evidence-based medicine is a reliable, powerful and user-friendly platform for strategic decision making, and has great relevance for the practice and acceptance of evidence-based medicine.
Keywords: data mining, data warehousing, decision-support systems, evidence-based medicine.
2512 Extraction of Data from Web Pages: A Vision Based Approach
Authors: P. S. Hiremath, Siddu P. Algur
Abstract:
With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify the relevant pieces of information, since web pages are often cluttered with irrelevant content like advertisements, navigation panels, copyright notices, etc., surrounding the main content of the web page. Hence, tools for mining data regions, data records and data items need to be developed in order to provide value-added services. Currently available automatic techniques for mining data regions from web pages are still unsatisfactory because of their poor performance and tag-dependence. In this paper a novel method to extract data items from web pages automatically is proposed. It comprises two steps: (1) identification and extraction of the data regions based on visual clue information; (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed which finds the data regions formed by all types of tags using visual clues. For step 2, a more effective method, namely Extraction of Data Items from web Pages (EDIP), is adopted to mine data items. The EDIP technique is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions, irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than the existing techniques.
Keywords: Web data records, web data regions, web mining.
2511 Distribution Sampling of Vector Variance without Duplications
Authors: Erna T. Herdiani, Maman A. Djauhari
Abstract:
In recent years, the use of vector variance as a measure of multivariate variability has received much attention in a wide range of statistics. This paper deals with a more economical measure of multivariate variability, defined as vector variance minus all duplicated elements. For high dimensional data, this will increase the computational efficiency by almost 50% compared to the original vector variance. Its sampling distribution will be investigated to make its applications possible.
Keywords: Asymptotic distribution, covariance matrix, likelihood ratio test, vector variance.
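Under the common definition of vector variance as the sum of squares of all elements of the covariance matrix, the measure discussed here can be sketched in a few lines: dropping the duplicated off-diagonal elements keeps one triangle only, roughly halving the work for high-dimensional data. The definition used below is an assumption based on the standard literature, not taken verbatim from the paper.

```python
# Vector variance with and without the duplicated off-diagonal elements.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))          # 200 observations, 50 variables
S = np.cov(X, rowvar=False)             # sample covariance matrix

vv_full = np.sum(S ** 2)                # classical vector variance: all elements
iu = np.triu_indices_from(S)            # diagonal + upper triangle only
vv_no_dup = np.sum(S[iu] ** 2)          # duplications removed (~half the terms)

print(round(float(vv_full), 3), round(float(vv_no_dup), 3))
```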
2510 Solar Thermal Aquaculture System Controller Based on Artificial Neural Network
Authors: A. Doaa M. Atia, Faten H. Fahmy, Ninet M. Ahmed, Hassen T. Dorrah
Abstract:
Temperature is one of the principal factors affecting an aquaculture system. It can cause stress and mortality, or provide a superior environment for growth and reproduction. This paper presents the control of pond water temperature using an artificial intelligence technique. Water temperature is a very important parameter for shrimp growth. The required temperature for optimal growth is 34°C; if the temperature increases to 38°C, it causes death of the shrimp, so it is important to control the water temperature. A solar thermal water heating system is designed to supply an aquaculture pond with the required hot water in Mersa Matruh in Egypt. Neural networks are massively parallel processors that have the ability to learn patterns through a training experience. Because of this feature, they are often well suited for modeling complex and non-linear processes such as those commonly found in the heating system. An artificial neural network is proposed to control the water temperature, as artificial intelligence (AI) techniques are becoming useful alternatives to conventional techniques and have been used to solve complicated practical problems. Moreover, this paper introduces complete mathematical modeling and a MATLAB SIMULINK model for the aquaculture system. The simulation results indicate that the control unit succeeds in keeping the water temperature constant at the desired temperature by controlling the hot water flow rate.
Keywords: Artificial neural networks, aquaculture, forced circulation hot water system.
2509 On the Optimality Assessment of Nanoparticle Size Spectrometry and Its Association to the Entropy Concept
Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani
Abstract:
Particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nanoparticles under the influence of an electric field in an Electrical Mobility Spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by flow conditions, geometry, electric field and the particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with a non-uniform charge distribution, was introduced to the EMS. The flow pattern and electric field in the EMS were simulated using Computational Fluid Dynamics (CFD) to obtain particle trajectories in the device and therefore to calculate the signal reported by each electrometer. According to the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to transfer information about the size distribution of the injected particles, we proposed a benchmark for the assessment of the optimality of the design. This method applies the concept of Von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality. Entropy, according to the Shannon entropy, is the "average amount of information contained in an event, sample or character extracted from a data stream". Evaluating the responses (signals) obtained via various configurations of detecting rings, the best configuration, which gave the best predictions of the size distributions of the injected particles, was the modified configuration. It was also the one with the maximum amount of entropy. A reasonable consistency was also observed between the accuracy of the predictions and the entropy content of each configuration. In this method, entropy is extracted from the transfer matrix of the instrument for each configuration. Ultimately, various clouds of particles were introduced to the simulations and the predicted size distributions were compared to the exact size distributions.
Keywords: Aerosol Nano-Particle, CFD, Electrical Mobility Spectrometer, Von Neumann entropy.
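The entropy benchmark can be illustrated as follows: a candidate configuration's transfer matrix is normalized to a density-like matrix and scored by its Von Neumann entropy, S = -Σ λ ln λ over the eigenvalues. This is a heavily simplified sketch; the symmetrization, the normalization and the toy matrices are assumptions, not the paper's actual transfer matrices.

```python
# Von Neumann entropy of a transfer-matrix-like object as an information score.
import numpy as np

def von_neumann_entropy(T):
    """Entropy of T after symmetrizing and normalizing it to unit trace."""
    A = T @ T.T                         # symmetric positive semidefinite
    rho = A / np.trace(A)               # density-like normalization
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]              # drop numerical zeros
    return float(-np.sum(lam * np.log(lam)))

rng = np.random.default_rng(0)
sharp = np.eye(6) + 0.05 * rng.random((6, 6))         # rings respond distinctly
blurry = np.ones((6, 6)) + 0.05 * rng.random((6, 6))  # rings nearly identical

# The configuration that separates sizes better carries more information
# and scores higher entropy.
print(von_neumann_entropy(sharp), von_neumann_entropy(blurry))
```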
2508 GridNtru: High Performance PKCS
Authors: Narasimham Challa, Jayaram Pradhan
Abstract:
Cryptographic algorithms play a crucial role in the information society by providing protection from unauthorized access to sensitive data. It is clear that information technology will become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form a part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size and bandwidth. In particular, the RSA algorithm is used in many applications for providing security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most chips on smart cards cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative; it is a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials. This allows NTRU to achieve high speeds with the use of minimal computing power. NTRU (Nth degree Truncated Polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problems. This means that an adversary, even with considerable computational resources and time, should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization have created a present-day demand for applications that need security enforcement techniques and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as the use of high-end computing hardware. Peer-to-peer (P2P) or enterprise grids are proven as one of the approaches for developing high-end computing systems. By utilizing them, one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware called Alchemi. An analysis and comparison of its performance for various text files is presented.
Keywords: Alchemi, GridNtru, Ntru, PKCS.
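The core NTRU operation alluded to above, manipulating lists of small integers as polynomials, is multiplication in the truncated ring Z[x]/(x^N - 1), i.e. a cyclic convolution of coefficient lists. The sketch below uses toy parameters, not a secure configuration.

```python
# Cyclic convolution in the NTRU ring Z[x]/(x^N - 1), coefficients mod q.
import numpy as np

def ring_multiply(f, g, N, q):
    """(f * g) mod (x^N - 1), coefficients reduced mod q."""
    full = np.convolve(f, g)            # ordinary polynomial product
    out = np.zeros(N, dtype=np.int64)
    for i, c in enumerate(full):        # fold x^(N+k) back onto x^k
        out[i % N] += c
    return out % q

N, q = 7, 32                             # toy parameters, not secure
f = np.array([1, 0, -1, 1, 0, 0, 1])     # small ternary coefficients
g = np.array([0, 1, 1, 0, -1, 0, 1])
print(ring_multiply(f, g, N, q))
```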
2507 Relevance Feedback within CBIR Systems
Authors: Mawloud Mosbah, Bachir Boucheham
Abstract:
We present here the results of a comparative study of some techniques, available in the literature, related to the relevance feedback mechanism in the case of short-term learning. Only one of the methods considered here, the K-nearest neighbors algorithm (KNN), belongs to the data mining field, while the rest of the methods are related purely to the information retrieval field and fall under the purview of the following three major axes: query shifting, feature weighting and optimization of the parameters of the similarity metric. As a contribution, and in addition to the comparative purpose, we propose a new version of the KNN algorithm, referred to as incremental KNN, which is distinct from the original version in the sense that, besides the influence of the seeds, the rating of the actual target image is also influenced by the images already rated. The results presented here were obtained from experiments conducted on the Wang database for one iteration, utilizing color moments in the RGB space. This compact descriptor, color moments, is adequate for the efficiency needed in the case of interactive systems. The results obtained allow us to claim that the proposed algorithm produces good results; it even outperforms a wide range of techniques available in the literature.
Keywords: CBIR, Category Search, Relevance Feedback (RFB), Query Point Movement, Standard Rocchio’s Formula, Adaptive Shifting Query, Feature Weighting, Optimization of the Parameters of Similarity Metric, Original KNN, Incremental KNN.
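The query point movement axis has a classical concrete form, the standard Rocchio formula, where the query vector moves toward the centroid of images rated relevant and away from the centroid of non-relevant ones. This is a minimal sketch; the vectors stand in for the color-moment descriptors, and the alpha, beta, gamma weights are conventional defaults, not the paper's values.

```python
# Standard Rocchio query shifting for relevance feedback.
import numpy as np

def rocchio(query, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.15):
    """q' = alpha*q + beta*mean(relevant) - gamma*mean(non_relevant)."""
    q = alpha * query
    if len(relevant):
        q += beta * np.mean(relevant, axis=0)
    if len(non_relevant):
        q -= gamma * np.mean(non_relevant, axis=0)
    return q

query = np.array([0.2, 0.4, 0.1])                    # initial image descriptor
rel = np.array([[0.3, 0.5, 0.2], [0.4, 0.4, 0.3]])   # user-rated relevant
non = np.array([[0.9, 0.1, 0.8]])                    # user-rated non-relevant
print(rocchio(query, rel, non).round(3))             # shifted query vector
```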
2506 Development of Genetic-based Machine Learning for Network Intrusion Detection (GBML-NID)
Authors: Wafa' S.Al-Sharafat, Reyadh Naoum
Abstract:
Society has grown to rely on Internet services, and the number of Internet users increases every day. As more and more users become connected to the network, the window of opportunity for malicious users to do damage becomes very great and lucrative. The objective of this paper is to incorporate different techniques into a classifier system to detect and classify intrusions from normal network packets. Among several techniques, the Steady State Genetic-based Machine Learning Algorithm (SSGBML) is used to detect intrusions. The Steady State Genetic Algorithm (SSGA), the Simple Genetic Algorithm (SGA), a Modified Genetic Algorithm and the Zeroth Level Classifier System are investigated in this research. SSGA is used as a discovery mechanism instead of SGA, since SGA replaces all old rules with newly produced rules, preventing good old rules from participating in the next rule generation. The Zeroth Level Classifier System plays the role of the detector by matching incoming environment messages with classifiers to determine whether the current message is normal or an intrusion, and by receiving feedback from the environment. Finally, in order to attain the best results, the Modified SSGA enhances our discovery engine by using fuzzy logic to optimize the crossover and mutation probabilities. The experiments and evaluations of the proposed method were performed with the KDD 99 intrusion detection dataset.
Keywords: MSSGBML, Network Intrusion Detection, SGA, SSGA.
2505 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs
Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao
Abstract:
A compositional reservoir simulation model (CMG-GEM) was used for a cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production. The performance of cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs is mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process, to distinguish the parameters with the maximum effect on oil recovery and to comprehend the behavior of cyclic CO2 injection in a tight reservoir. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry, because plugging of the porous media reduces oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and most permanent trapping techniques when considering CO2 storage mechanisms in geological formations. However, the effects of the above uncertain parameters on the process of CO2 enhanced oil recovery have not been understood systematically. Hence, it is absolutely necessary to study the most significant parameters which dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while considering the effects of asphaltene deposition and the solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.
Keywords: Tight reservoirs, cyclic CO2 injection, asphaltene, solubility, reservoir simulation.
2504 Performance Evaluation and Economic Analysis of Minimum Quantity Lubrication with Pressurized/Non-Pressurized Air and Nanofluid Mixture
Authors: M. Amrita, R. R. Srikant, A. V. Sita Rama Raju
Abstract:
Water miscible cutting fluids are conventionally used to lubricate and cool the machining zone, but issues related to health hazards and maintenance and disposal costs have limited their usage, leading to the application of Minimum Quantity Lubrication (MQL). To increase the effectiveness of MQL, nano-cutting fluids are proposed. In the present work, water miscible nanographite cutting fluids of varying concentration are applied to the cutting zone by two systems, A and B. System A utilizes high pressure air and supplies cutting fluid at a flow rate of 1 ml/min. System B uses low pressure air and supplies cutting fluid at a flow rate of 5 ml/min. Their performance in machining is evaluated by measuring cutting temperatures, tool wear, cutting forces and surface roughness, and compared with dry machining and flood machining. Application of the nano-cutting fluid using both systems showed better performance than dry machining. Cutting temperatures and cutting forces obtained by both techniques are higher than in flood machining, but tool wear and surface roughness showed improvement compared to flood machining. An economic analysis has been carried out in all cases to decide the applicability of the techniques.
Keywords: Economic analysis, Machining, Minimum Quantity lubrication, nanofluid.