Search results for: fuzzy modeling and rule extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7078

6088 Chitosan Magnetic Nanoparticles and Its Analytical Applications

Authors: Eman Alzahrani

Abstract:

Efficient extraction of proteins by removing interfering materials is necessary in proteomics, since most instruments cannot handle contaminated sample matrices directly. In this study, chitosan-coated magnetic nanoparticles (CS-MNPs) for the purification of myoglobin were successfully fabricated. First, chitosan (CS) was prepared by a deacetylation reaction during its extraction from shrimp-shell waste. Second, magnetic nanoparticles (MNPs) were synthesised by the coprecipitation method from aqueous Fe2+ and Fe3+ salt solutions through the addition of a base under an inert atmosphere, followed by modification of the MNP surface with chitosan. The morphology of the formed nanoparticles, which were about 23 nm in average diameter, was observed by transmission electron microscopy (TEM). In addition, the nanoparticles were characterised using X-ray diffraction (XRD) patterns, which showed that the naked magnetic nanoparticles have a spinel structure and that the surface modification did not change the Fe3O4 phase. The coating of the MNPs was also demonstrated by scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDAX), and Fourier transform infrared (FT-IR) spectroscopy. The adsorption behaviour of MNPs and CS-MNPs towards myoglobin was investigated, and the adsorption capacity was found to be markedly larger for CS-MNPs than for the bare MNPs. This result makes CS-MNPs good adsorbents, attractive for use in protein extraction from biological samples.

Keywords: chitosan, magnetic nanoparticles, coprecipitation, adsorption

Procedia PDF Downloads 414
6087 Understanding Tourism Innovation through Fuzzy Measures

Authors: Marcella De Filippo, Delio Colangelo, Luca Farnia

Abstract:

In recent decades, the hyper-competition of the tourism scenario has pushed many businesses towards maturity, giving a central role to innovative processes and their dissemination in company management. At the same time, it has created the need to monitor the application of innovations in order to govern and improve the performance of companies and destinations. The study aims to analyze and define innovation in the tourism sector. The research involved, on the one hand, in-depth interviews with experts, identifying innovation in terms of process and product, digitalization, and sustainability policies, and, on the other hand, an evaluation of the interaction between these factors, in terms of substitutability and complementarity in management scenarios, in order to identify which one is essential to be competitive in the global scenario. Fuzzy measures and the Choquet integral were used to elicit the experts' preferences. This method allows not only the relative importance of each pillar to be evaluated but also, more interestingly, the level of interaction, ranging from complementarity to substitutability, between pairs of factors. The results of the survey are the following: in terms of Shapley values, the experts assert that innovation is the most important factor (32.32), followed by digitalization (31.86), network (20.57) and sustainability (15.25). In terms of interaction indices, given the low degree of consensus among experts, the interaction between pairs of criteria could on average be ignored; however, it is worth noting that innovation and digitalization are the factors for which the experts express the highest degree of interaction. For some of them, these factors show a moderate level of complementarity (with a peak of 57.14), while others consider them moderately substitutable (with a peak of -39.58). Another example, although an outlier, is the interaction between network and digitalization, which one expert considered markedly substitutable (-77.08).
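
As a rough illustration of the aggregation machinery this abstract refers to, the sketch below computes a discrete Choquet integral and Shapley values for a small hypothetical fuzzy measure over the four pillars; all measure values and scores are invented for the example and are not the experts' elicited ones.

```python
from itertools import combinations
from math import factorial

criteria = ["innovation", "digitalization", "network", "sustainability"]

# Hypothetical fuzzy measure mu: maps frozensets of criteria to [0, 1],
# with mu(empty) = 0 and mu(all) = 1; values below are invented for illustration.
mu = {frozenset(): 0.0, frozenset(criteria): 1.0}
base = {"innovation": 0.35, "digitalization": 0.33, "network": 0.20, "sustainability": 0.12}
for r in range(1, len(criteria)):
    for subset in combinations(criteria, r):
        mu[frozenset(subset)] = min(1.0, sum(base[c] for c in subset))
mu[frozenset({"innovation", "digitalization"})] = 0.78  # illustrative positive interaction

def choquet(scores, mu):
    """Discrete Choquet integral of per-criterion scores w.r.t. fuzzy measure mu."""
    order = sorted(scores, key=scores.get)          # criteria sorted by ascending score
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[i:])            # criteria whose score >= current one
        total += (scores[c] - prev) * mu[coalition]
        prev = scores[c]
    return total

def shapley(mu, criteria):
    """Shapley importance index of each criterion for fuzzy measure mu."""
    n = len(criteria)
    phi = {}
    for c in criteria:
        others = [x for x in criteria if x != c]
        val = 0.0
        for r in range(len(others) + 1):
            for subset in combinations(others, r):
                s = frozenset(subset)
                weight = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                val += weight * (mu[s | {c}] - mu[s])
        phi[c] = val
    return phi

example_scores = {"innovation": 0.8, "digitalization": 0.7, "network": 0.5, "sustainability": 0.4}
print("Choquet aggregate:", round(choquet(example_scores, mu), 3))
print("Shapley values:", {k: round(v, 3) for k, v in shapley(mu, criteria).items()})
```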

Keywords: innovation, business model, tourism, fuzzy

Procedia PDF Downloads 263
6086 Second Time’s a Charm: The Intervention of the European Patent Office on the Strategic Use of Divisional Applications

Authors: Alissa Lefebre

Abstract:

It might seem intuitive to hope for a fast decision on a patent grant. After all, a granted patent provides a monopoly position that allows the holder to prevent others from using the technology. However, this does not take into account the strategic advantages that can be obtained from keeping a patent application pending. First, there is the financial advantage of postponing certain fees, although many applicants would probably agree that this is not the main benefit. As the scope of patent protection is only decided upon at grant, the pendency period introduces uncertainty amongst rivals. This uncertainty entails not knowing whether the patent will actually be granted and what the scope of protection will be. Consequently, rivals can only rely on limited and uncertain information when deciding which technology is worth pursuing. One way to keep patent applications pending is the use of divisional applications. These applications can be filed out of a parent application as long as that parent application is still pending. This allows the applicant to pursue (part of) the content of the parent application in another application, as the divisional application cannot exceed the scope of the parent application. In a fast-moving and complex market such as tele- and digital communications, this may allow applicants to obtain an actual monopoly position, as competitors are discouraged from pursuing a certain technology. Nevertheless, this practice also has downsides. First, it affects the workload of the examiners at the patent office; as the number of patent filings has been increasing over the last decades, strategies that increase this number even further are not desirable from the examiners' point of view. Second, a pending patent does not provide the protection of a granted patent, thus creating uncertainty not only for rivals but also for the applicant. Consequently, the European Patent Office (EPO) launched a "raising the bar" initiative to tackle the strategic use of divisional applications. Over the past years, two rules have been implemented. The first rule, introduced in 2010, imposed a time limit under which divisional applications could only be filed within 24 months of the first communication with the patent office. However, after carrying out a user feedback survey, the EPO abolished the rule again in 2014 and replaced it with a fee mechanism. The fee mechanism is still in place today, which might indicate a better result than the first rule change. This study tests the impact of these rules on the strategic use of divisional applications in the tele- and digital communication industry and provides empirical evidence on their success. Using three different survival models, we find overall evidence that divisional applications prolong the pendency time and that only the second rule is able to curb strategic patenting and thus decrease the pendency time.

Keywords: divisional applications, regulatory changes, strategic patenting, EPO

Procedia PDF Downloads 124
6085 Applying the Fuzzy Analytic Network Process to Establish the Relative Importance of Knowledge Sharing Barriers

Authors: Van Dong Phung, Igor Hawryszkiewycz, Kyeong Kang, Muhammad Hatim Binsawad

Abstract:

Knowledge sharing (KS) is the key to creativity and innovation in any organization. Overcoming KS barriers creates new challenges for design in dynamic and complex environments, as there may be interrelations and interdependences among the barriers. The purpose of this paper is to present a review of the literature on KS barriers and to derive their relative importance through the fuzzy analytic network process (FANP), a generalization of the analytic hierarchy process (AHP). This helps prioritize the barriers in order to find ways to remove them and facilitate KS. The study begins with a brief description of KS barriers and the most critical ones. The FANP and its role in identifying the relative importance of KS barriers are explained. The paper then proposes the research model and expected outcomes. The study suggests that the FANP is appropriate for deriving the relative importance of KS barriers, which are intertwined and interdependent. Implications and future research are also proposed.
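
The full FANP involves fuzzy pairwise comparisons organized in a supermatrix that captures interdependencies; the sketch below only shows one common building block, a Buckley-style fuzzy pairwise-comparison step with triangular fuzzy numbers, on a handful of hypothetical barriers. The barrier names, comparison values and defuzzification choice are all illustrative, not the authors' procedure.

```python
import numpy as np

# Hypothetical KS barriers and an illustrative triangular-fuzzy pairwise comparison matrix.
# Each entry is (l, m, u); entry [i][j] states how much barrier i dominates barrier j.
barriers = ["lack of trust", "lack of time", "poor IT infrastructure"]
tfn = [
    [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
]

def fuzzy_geometric_mean(row):
    """Component-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(row)
    return tuple(np.prod([x[k] for x in row]) ** (1.0 / n) for k in range(3))

# Buckley-style fuzzy weights: r_i = geometric mean of row i, w_i = r_i / sum(r).
r = [fuzzy_geometric_mean(row) for row in tfn]
total = tuple(sum(x[k] for x in r) for k in range(3))
fuzzy_weights = [(x[0] / total[2], x[1] / total[1], x[2] / total[0]) for x in r]

# Defuzzify (centre of area) and normalise to obtain crisp relative importances.
crisp = np.array([sum(w) / 3.0 for w in fuzzy_weights])
crisp /= crisp.sum()
for b, w in zip(barriers, crisp):
    print(f"{b}: {w:.3f}")
```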

Keywords: FANP, ANP, knowledge sharing barriers, knowledge sharing, removing barriers, knowledge management

Procedia PDF Downloads 327
6084 Decision Tree Modeling in Emergency Logistics Planning

Authors: Yousef Abu Nahleh, Arun Kumar, Fugen Daver, Reham Al-Hindawi

Abstract:

Despite the availability of natural-disaster-related time series data for the last 110 years, there is no forecasting tool available to humanitarian relief organizations for emergency logistics planning. This study develops a forecasting tool based on identifying the probability of disaster for each country in the world using decision tree modeling. The determination of aggregate forecasts then leads to efficient pre-disaster planning. Based on the research findings, relief agencies can optimize the allocation of various resources in emergency logistics planning.
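
A minimal sketch of the kind of decision-tree probability model the abstract describes is given below; the predictors, labels and thresholds are synthetic stand-ins, not the study's data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: one row per country-year with illustrative predictors
# (e.g., region code, seismic-risk index, cyclone exposure, flood-prone area share).
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = (0.6 * X[:, 1] + 0.3 * X[:, 2] + 0.2 * rng.random(500) > 0.55).astype(int)  # 1 = disaster occurred

tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20, random_state=0)
tree.fit(X, y)

# Probability of disaster for a new "country", used as the basis of a forecast.
new_country = np.array([[0.2, 0.8, 0.4, 0.1]])
p_disaster = tree.predict_proba(new_country)[0, 1]
print(f"Estimated probability of disaster: {p_disaster:.2f}")

# Aggregate forecast over many countries, e.g. expected number of disaster events.
expected_events = tree.predict_proba(X)[:, 1].sum()
print(f"Expected number of events across the sample: {expected_events:.1f}")
```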

Keywords: decision tree modeling, forecasting, humanitarian relief, emergency supply chain

Procedia PDF Downloads 477
6083 Classification of Political Affiliations by Reduced Number of Features

Authors: Vesile Evrim, Aliyu Awwal

Abstract:

With the evolution of technology, the expression of opinions has shifted to the digital world. Politics, one of the hottest topics in opinion mining research, is combined here with behavioral analysis for determining affiliation from text, which constitutes the subject of this paper. This study aims to classify text in news and blogs as either Republican or Democrat with the minimum number of features. As an initial set, 68 features, 64 of which are Linguistic Inquiry and Word Count (LIWC) features, are tested against 14 benchmark classification algorithms. In later experiments, the dimensions of the feature vector are reduced using 7 feature selection algorithms. The results show that the Decision Tree, Rule Induction and M5 Rule classifiers, when used with the SVM and IGR feature selection algorithms, performed best, with up to 82.5% accuracy on the given dataset. Further tests on a single feature and on the linguistic-based feature sets showed similar results. The feature "function", an aggregate feature of the linguistic category, is identified as the most differentiating feature among the 68, classifying articles as Republican or Democrat with 81% accuracy by itself.
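
A minimal sketch of the reduce-then-classify pipeline is shown below on a synthetic stand-in for the LIWC-style feature matrix; since scikit-learn has no information gain ratio (IGR) selector, mutual information is used as a stand-in, and all data and sizes are invented.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for a 68-dimensional LIWC-style feature matrix
# (rows = articles, columns = features such as the aggregate "function" category).
rng = np.random.default_rng(42)
X = rng.random((300, 68))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.random(300) > 0.8).astype(int)  # 0 = Democrat, 1 = Republican

# Reduce the feature vector, then classify; mutual information stands in for IGR here.
pipeline = make_pipeline(
    SelectKBest(score_func=mutual_info_classif, k=10),
    DecisionTreeClassifier(max_depth=5, random_state=0),
)
scores = cross_val_score(pipeline, X, y, cv=5)
print(f"Mean CV accuracy with 10 selected features: {scores.mean():.3f}")

# Single-feature test, analogous to classifying on the "function" category alone.
single = cross_val_score(DecisionTreeClassifier(max_depth=3), X[:, [0]], y, cv=5)
print(f"Mean CV accuracy with one feature: {single.mean():.3f}")
```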

Keywords: feature selection, LIWC, machine learning, politics

Procedia PDF Downloads 379
6082 A Temporal QoS Ontology For ERTMS/ETCS

Authors: Marc Sango, Olimpia Hoinaru, Christophe Gransart, Laurence Duchien

Abstract:

Ontologies offer a means for representing and sharing information in many domains, particularly complex ones. For example, they can be used for representing and sharing the information of a System Requirement Specification (SRS) of a complex system such as the SRS of ERTMS/ETCS, which is written in natural language. Since this is a real-time and critical system, generic ontology languages, such as OWL, and generic ERTMS ontologies provide minimal support for modeling the temporal information omnipresent in these SRS documents. To support the modeling of temporal information, one of the challenges is to enable the representation of dynamic features evolving in time within a generic ontology with minimal redesign. Separating temporal information from other information can help to predict system runtime operation and to properly design and implement the system. In addition, it is helpful to provide reasoning and querying techniques over the temporal information represented in the ontology in order to detect potential temporal inconsistencies. Indeed, a user operation, such as adding a new constraint on top of existing planning constraints, can cause temporal inconsistencies, which can lead to system failures. To address this challenge, we propose a lightweight 3-layer temporal Quality of Service (QoS) ontology for representing, reasoning over and querying temporal and non-temporal information in a complex domain ontology. Representing QoS entities in separate layers clarifies the distinction between non-QoS entities and QoS entities in an ontology. The upper generic layer of the proposed ontology provides an intuitive knowledge of domain components, especially ERTMS/ETCS components. The separation of the intermediate QoS layer from the lower QoS layer allows us to focus on specific QoS characteristics, such as temporal or integrity characteristics. In this paper, we focus on temporal information that can be used to predict system runtime operation. To evaluate our approach, an example of the proposed domain ontology for the handover operation, as well as a reasoning rule over temporal relations in this domain-specific ontology, is given.

Keywords: system requirement specification, ERTMS/ETCS, temporal ontologies, domain ontologies

Procedia PDF Downloads 415
6081 Transition 1970 Volkswagen Beetle from Internal Combustion Engine Vehicle to Electric Vehicle, Modeling and Simulation

Authors: Jamil Khalil Izraqi

Abstract:

This paper investigates the transition of a 1970 Volkswagen Beetle from an internal combustion engine (ICE) vehicle to an electric vehicle (EV) using Matlab/Simulink modeling and simulation. The performance of the EV drivetrain system was simulated under various operating conditions, including standard and custom driving cycles in Turkey and Jordan (Amman), respectively. The results indicate that the transition is viable and that modeling and simulation can help in understanding the performance and efficiency of the electric drivetrain system, including the battery pack, power electronics, and brushless direct current (BLDC) motor.
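
The drivetrain model itself is in Matlab/Simulink; as a rough Python sketch of the kind of calculation such a model performs, the snippet below evaluates longitudinal-dynamics power demand over a toy drive cycle. All vehicle parameters and the cycle are assumed values, not the paper's.

```python
import numpy as np

# Assumed parameters for a Beetle-sized EV conversion (illustrative only).
mass = 950.0        # kg, including battery pack
c_rr = 0.012        # rolling-resistance coefficient
c_d = 0.48          # aerodynamic drag coefficient
area = 1.8          # m^2 frontal area
rho = 1.2           # kg/m^3 air density
g = 9.81
eta_drive = 0.85    # assumed combined inverter + BLDC motor + gearbox efficiency

# Toy drive cycle: speed in m/s sampled at 1 Hz (a real study would use a standard cycle).
t = np.arange(0, 60)
v = np.concatenate([np.linspace(0, 15, 30), np.full(20, 15.0), np.linspace(15, 0, 10)])
a = np.gradient(v, t)

# Tractive force and battery-side power demand at each time step.
force = mass * a + mass * g * c_rr + 0.5 * rho * c_d * area * v**2
power_wheel = force * v                                   # W at the wheels
power_batt = np.where(power_wheel > 0, power_wheel / eta_drive, power_wheel * eta_drive)

energy_wh = np.trapz(power_batt, t) / 3600.0
print(f"Peak battery power: {power_batt.max()/1000:.1f} kW, energy used: {energy_wh:.1f} Wh")
```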

Keywords: BLDC, buck-boost, inverter, SOC, drive-cycle

Procedia PDF Downloads 96
6080 A New Approach of Preprocessing with SVM Optimization Based on PSO for Bearing Fault Diagnosis

Authors: Tawfik Thelaidjia, Salah Chenikher

Abstract:

Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification based on the extracted features. In this paper, feature extraction from faulty bearing vibration signals is performed by combining the signal's kurtosis with features obtained by preprocessing the vibration signal samples using the Db2 discrete wavelet transform at the fifth level of decomposition. In this way, a 7-dimensional feature vector of the vibration signal is obtained. After feature extraction, a support vector machine (SVM) is applied to automate the fault diagnosis procedure. To improve the classification accuracy for bearing fault prediction, particle swarm optimization (PSO) is employed to simultaneously optimize the SVM kernel function parameter and the penalty parameter. The results have shown the feasibility and effectiveness of the proposed approach.
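
One plausible reading of the 7-dimensional feature vector (signal kurtosis plus the energies of the six sub-bands from a level-5 db2 decomposition) is sketched below; this reading, the synthetic signals, and the fixed SVM parameters standing in for the PSO tuning are all assumptions for illustration.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.svm import SVC

def bearing_features(signal):
    """7-D feature vector: signal kurtosis + energy of the 6 sub-bands of a level-5 db2 DWT.
    (One plausible reading of the abstract; the authors' exact features may differ.)"""
    coeffs = pywt.wavedec(signal, "db2", level=5)        # [cA5, cD5, cD4, cD3, cD2, cD1]
    energies = [np.sum(c**2) / len(c) for c in coeffs]
    return np.array([kurtosis(signal)] + energies)

# Synthetic stand-ins for healthy vs. faulty bearing vibration segments.
rng = np.random.default_rng(1)
healthy = [rng.normal(0, 1, 2048) for _ in range(40)]
faulty = [rng.normal(0, 1, 2048) + (rng.random(2048) < 0.01) * rng.normal(0, 8, 2048) for _ in range(40)]

X = np.array([bearing_features(s) for s in healthy + faulty])
y = np.array([0] * 40 + [1] * 40)

# In the paper, C and gamma are tuned by particle swarm optimisation; fixed values are used here.
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X, y)
print("Training accuracy:", clf.score(X, y))
```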

Keywords: condition monitoring, discrete wavelet transform, fault diagnosis, kurtosis, machine learning, particle swarm optimization, roller bearing, rotating machines, support vector machine, vibration measurement

Procedia PDF Downloads 434
6079 Study Mercapto-Nanoscavenger as a Promising Analytical Tool

Authors: Mohammed M. Algaradah

Abstract:

A chelating mercapto-nanoscavenger has been developed exploiting the high surface area of monodisperse nano-sized mesoporous silica. The nanoscavenger acts as a solid-phase trace-metal extractant while suspended as a quasi-stable sol in aqueous samples. This mode of extraction requires no external agitation, as the particles move naturally through the sample by Brownian motion, convection and slow sedimentation. Careful size selection enables the nanoscavenger to be easily recovered, together with the extracted analyte, by conventional filtration or centrifugation. The research describes the successful attachment of the mercapto chelator to ca. 136 ± 15 nm, high-surface-area (BET surface area = 1006 m2 g-1) mesoporous silica particles. The resulting material had a copper capacity of ca. 1.34 ± 0.10 mmol g-1 and was successfully applied to the collection of a trace element from water. Essentially complete recovery of Cu(II) was achieved from freshwater samples, giving typical preconcentration factors of 100 from 50 µg/l samples. Data obtained from a nanoscavenger-based extraction of copper from samples were not significantly different from those obtained using a conventional colorimetric procedure employing complexation/solvent extraction.

Keywords: nano scavenger, mesoporous silica, trace metal, preconcentration

Procedia PDF Downloads 81
6078 An Event Relationship Extraction Method Incorporating Deep Feedback Recurrent Neural Network and Bidirectional Long Short-Term Memory

Authors: Yin Yuanling

Abstract:

To address the low accuracy of traditional relationship extraction models, a method combining a Deep Feedback Recurrent Neural Network (DFRNN) and a Bidirectional Long Short-Term Memory (BiLSTM) network is designed. The DFRNN extracts local features of the text through a deep feedback recurrent mechanism, the BiLSTM better extracts global features of the text, and a Self-Attention layer extracts semantic information. Experiments show that the method achieves an F1 value of 76.69% on the CEC dataset, which is 0.0652 higher than the BiLSTM+Self-ATT model, thus improving the performance of deep learning methods on the event relationship extraction task.
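
The DFRNN component is not specified in enough detail here to reproduce, so the sketch below only shows the BiLSTM plus self-attention part of the described pipeline in Keras; the vocabulary size, sequence length and number of relation classes are made up.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE, MAX_LEN, NUM_RELATIONS = 20000, 80, 6   # assumed sizes, not from the paper

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, 128)(inputs)

# Global context: bidirectional LSTM over the token sequence.
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

# Self-attention over the BiLSTM states to pick out semantically relevant tokens.
attn = layers.MultiHeadAttention(num_heads=4, key_dim=32)(x, x)
x = layers.GlobalAveragePooling1D()(attn)

outputs = layers.Dense(NUM_RELATIONS, activation="softmax")(x)
model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(token_ids, relation_labels, validation_split=0.1, epochs=10)  # with real CEC-style data
```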

Keywords: event relations, deep learning, DFRNN models, bi-directional long and short-term memory networks

Procedia PDF Downloads 140
6077 An Automated System for the Detection of Citrus Greening Disease Based on Visual Descriptors

Authors: Sidra Naeem, Ayesha Naeem, Sahar Rahim, Nadia Nawaz Qadri

Abstract:

Citrus greening is a bacterial disease that causes considerable damage to citrus fruits worldwide. Efficient methods for detecting this disease are needed to minimize production loss. This paper presents a pattern recognition system that comprises three stages for the detection of citrus greening from orange leaves: segmentation, feature extraction and classification. Image segmentation is accomplished by adaptive thresholding. The feature extraction stage uses three visual descriptors, i.e. shape, color and texture: for shape, the asymmetry index; for color, the histogram of the Cb component in the YCbCr domain; and for texture, the local binary pattern. Classification was done using support vector machines and k-nearest neighbors. The best performance of the system, Accuracy = 88.02% and AUROC = 90.1%, was achieved on automatically segmented images. Our experiments validate that (1) segmentation is an imperative preprocessing step for computer-assisted diagnosis of citrus greening, and (2) the combination of shape, color and texture features forms a complementary set for the identification of citrus greening disease.
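
A minimal sketch of the three descriptor families feeding an SVM is given below; the asymmetry index is a simplified stand-in, the histogram bin counts are arbitrary, and the feature matrix used to fit the classifier is a random placeholder rather than real leaf images.

```python
import numpy as np
from skimage.color import rgb2ycbcr, rgb2gray
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def leaf_features(rgb_image, mask):
    """Concatenate shape, colour and texture descriptors for a segmented leaf region.
    The asymmetry index here is a simplified stand-in for the one used in the paper."""
    # Shape: asymmetry between the left half and the mirrored right half of the binary mask.
    left, right = np.array_split(mask, 2, axis=1)
    right_flipped = np.fliplr(right)
    width = min(left.shape[1], right_flipped.shape[1])
    asymmetry = np.abs(left[:, :width].astype(int) - right_flipped[:, :width].astype(int)).sum() / max(mask.sum(), 1)

    # Colour: histogram of the Cb channel (YCbCr) inside the leaf mask.
    cb = rgb2ycbcr(rgb_image)[:, :, 1]
    cb_hist, _ = np.histogram(cb[mask], bins=16, range=(16, 240), density=True)

    # Texture: histogram of uniform local binary patterns on the grey image.
    gray = rgb2gray(rgb_image)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp[mask], bins=10, range=(0, 10), density=True)

    return np.concatenate([[asymmetry], cb_hist, lbp_hist])

# With real data: X = [leaf_features(img, mask) for img, mask in dataset]; y = healthy / greening labels.
rng = np.random.default_rng(0)
X, y = rng.random((60, 27)), rng.integers(0, 2, 60)   # placeholder feature matrix for illustration
clf = SVC(kernel="rbf").fit(X, y)
print("Placeholder training accuracy:", clf.score(X, y))
```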

Keywords: citrus greening, pattern recognition, feature extraction, classification

Procedia PDF Downloads 179
6076 Psychophysiological Adaptive Automation Based on Fuzzy Controller

Authors: Liliana Villavicencio, Yohn Garcia, Pallavi Singh, Luis Fernando Cruz, Wilfrido Moreno

Abstract:

Psychophysiological adaptive automation is a concept that combines human physiological data and computer algorithms to create personalized interfaces and experiences for users. This approach aims to enhance human learning by adapting to individual needs and preferences and optimizing the interaction between humans and machines. According to neuroscience, working memory demand during the learning process changes when the student is learning a new subject or topic or managing and/or fulfilling a specific task goal. A sudden increase in working memory demand modifies the student's level of attention, engagement, and cognitive load. The proposed psychophysiological adaptive automation system adapts the task requirements to optimize cognitive load, the process output variable, by monitoring the student's brain activity. Cognitive load changes according to the student's previous knowledge, the type of task, the difficulty level of the task, and the overall psychophysiological state of the student. Scaling the measured cognitive load as low, medium, or high, the system assigns a difficulty level to the next task according to the ratio between the previous task's difficulty level and the student's stress. For instance, if a student becomes stressed or overwhelmed during a particular task, the system detects this through signal measurements such as brain waves, heart rate variability, or other psychophysiological variables, and adjusts the task difficulty level accordingly. Engagement and stress are treated as internal variables of the hypermedia system, which selects between three different types of instructional material. This work assesses the feasibility of a fuzzy controller that tracks a student's physiological responses and adjusts the learning content and pace accordingly. Following an industrial automation approach, the proposed fuzzy logic controller is based on linguistic rules that complement the instrumentation of the system to monitor and control the delivery of instructional material to the students. The test results show that the implemented fuzzy controller can satisfactorily regulate the delivery of academic content based on working memory demand without compromising students' health. This work has potential application in the instructional design of virtual reality environments for training and education.
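
A hand-rolled Mamdani-style sketch of such a linguistic rule base is shown below: triangular memberships for measured cognitive load and stress, a small invented rule set, and centroid defuzzification producing the next task's difficulty. All breakpoints and rules are illustrative, not the authors' controller.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

# Input memberships on a 0-1 scale (breakpoints assumed for illustration).
def load_mf(x):   return {"low": tri(x, -0.4, 0.0, 0.5), "medium": tri(x, 0.2, 0.5, 0.8), "high": tri(x, 0.5, 1.0, 1.4)}
def stress_mf(x): return {"low": tri(x, -0.4, 0.0, 0.5), "high": tri(x, 0.5, 1.0, 1.4)}

# Output (next-task difficulty) memberships over a 0-1 universe.
universe = np.linspace(0, 1, 101)
difficulty_mf = {"easy": tri(universe, -0.4, 0.0, 0.5),
                 "moderate": tri(universe, 0.2, 0.5, 0.8),
                 "hard": tri(universe, 0.5, 1.0, 1.4)}

# Illustrative linguistic rule base (AND = min, aggregation = max).
rules = [
    (("low", "low"), "hard"),        # low load, low stress -> raise difficulty
    (("medium", "low"), "moderate"),
    (("high", "low"), "easy"),
    (("low", "high"), "moderate"),
    (("medium", "high"), "easy"),
    (("high", "high"), "easy"),      # overloaded and stressed -> easiest material
]

def next_difficulty(cognitive_load, stress):
    load, strs = load_mf(cognitive_load), stress_mf(stress)
    aggregated = np.zeros_like(universe)
    for (load_term, stress_term), out_term in rules:
        strength = min(load[load_term], strs[stress_term])
        aggregated = np.maximum(aggregated, np.minimum(strength, difficulty_mf[out_term]))
    return float((aggregated * universe).sum() / (aggregated.sum() + 1e-9))   # centroid defuzzification

print("Suggested difficulty for the next task:", round(next_difficulty(cognitive_load=0.75, stress=0.6), 2))
```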

Keywords: fuzzy logic controller, hypermedia control system, personalized education, psychophysiological adaptive automation

Procedia PDF Downloads 77
6075 A Modeling Approach for Blockchain-Oriented Information Systems Design

Authors: Jiaqi Yan, Yani Shi

Abstract:

Blockchain technology is regarded as a highly promising technology with the potential to trigger a technological revolution. However, outside the bitcoin industry, we have not yet seen large-scale applications of blockchain in the domains it is supposed to impact, such as supply chains, financial networks, and intelligent manufacturing. The reasons lie not only in the difficulties of blockchain implementation but are also rooted in the challenges of blockchain-oriented information systems design. First, blockchain members are self-interested actors belonging to organizations with different existing information systems; as they expect different information inputs and outputs from the blockchain application, a common language protocol is needed to facilitate communication between blockchain members. Second, given the decentralization of blockchain organization, there is no central authority to organize and coordinate the business processes. Thus, information systems built on blockchain should support more adaptive business processes. This paper aims to address these difficulties by providing a modeling approach for blockchain-oriented information systems design. We investigate the information structure of distributed-ledger data with conceptual modeling techniques and ontology theories, and build an effective ontology mapping method for inter-organization information flows and blockchain information records. Further, we study distributed-ledger-ontology-based business process modeling to support adaptive enterprises on blockchain.

Keywords: blockchain, ontology, information systems modeling, business process

Procedia PDF Downloads 445
6074 Transport Related Air Pollution Modeling Using Artificial Neural Network

Authors: K. D. Sharma, M. Parida, S. S. Jain, Anju Saini, V. K. Katiyar

Abstract:

Air quality models form one of the most important components of an urban air quality management plan. Various statistical modeling techniques (regression, multiple regression and time series analysis) have been used to predict air pollution concentrations in the urban environment. These models calculate pollution concentrations from observed traffic, meteorological and pollution data after an appropriate relationship has been obtained empirically between these parameters. The artificial neural network (ANN) is increasingly used as an alternative tool for modeling pollutants from vehicular traffic, particularly in urban areas. In the present paper, an attempt has been made to model traffic air pollution, specifically CO concentration, using neural networks. Two scenarios were considered: first, with only classified traffic volume as input, and second, with both classified traffic volume and meteorological variables. The results showed that CO concentration can be predicted with good accuracy using an artificial neural network (ANN).
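
A minimal sketch of the second scenario (traffic plus meteorology as inputs to a feedforward ANN) is given below; the data-generating relation and all column choices are synthetic stand-ins.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: classified traffic volumes (cars, buses, trucks, two-wheelers)
# plus meteorological variables (wind speed, temperature), with CO concentration as the target.
rng = np.random.default_rng(7)
n = 400
traffic = rng.random((n, 4)) * [1200, 80, 150, 600]
meteo = np.column_stack([rng.random(n) * 6, 15 + rng.random(n) * 20])   # wind (m/s), temp (degC)
X = np.hstack([traffic, meteo])
co = 0.002 * traffic.sum(axis=1) / (1 + meteo[:, 0]) + rng.normal(0, 0.1, n)   # toy CO relation (ppm)

X_train, X_test, y_train, y_test = train_test_split(X, co, random_state=0)
model = make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.2f}")
```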

Keywords: air quality management, artificial neural network, meteorological variables, statistical modeling

Procedia PDF Downloads 518
6073 Model-Based Field Extraction from Different Class of Administrative Documents

Authors: Jinen Daghrir, Anis Kricha, Karim Kalti

Abstract:

The amount of incoming administrative documents is massive, and manually processing them is a costly task, especially in terms of time. This problem has led to a substantial amount of research and development on automatically extracting fields from administrative documents, in order to reduce costs and increase citizen satisfaction in administrations. In this context, we introduce an administrative document understanding system. Given a document in which a user selects the fields to be retrieved for a document class, a document model is automatically built. A document model is represented by an attributed relational graph (ARG) in which nodes represent the fields to extract and edges represent the relations between them; both vertices and edges carry feature vectors. When another document arrives at the system, its layout objects are extracted and an ARG is generated. Field extraction is then translated into the problem of matching the two ARGs, which relies mainly on comparing the spatial relationships between layout objects. Experimental results yield accuracy rates from 75% to 100% on eight document classes. Our proposed method performs well given that the document model is constructed from only a single document.
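
A very simplified sketch of the ARG idea is given below: layout objects become nodes, pairwise spatial relations become edge attributes, and a greedy comparison of relation signatures matches the new document's zones to the model's fields. The bounding boxes, field names and matching heuristic are all invented; a real system would use richer attributes and a proper graph-matching algorithm.

```python
import networkx as nx

def build_arg(fields):
    """Attributed relational graph: nodes = layout objects (bounding boxes),
    directed edges = spatial relation (dx, dy) between object centres."""
    g = nx.DiGraph()
    for name, (x, y, w, h) in fields.items():
        g.add_node(name, centre=(x + w / 2.0, y + h / 2.0))
    for a in g.nodes:
        for b in g.nodes:
            if a != b:
                ca, cb = g.nodes[a]["centre"], g.nodes[b]["centre"]
                g.add_edge(a, b, rel=(cb[0] - ca[0], cb[1] - ca[1]))
    return g

def signature(g, n):
    """Sorted list of a node's outgoing spatial relations."""
    return sorted(g.edges[n, m]["rel"] for m in g.successors(n))

def match_nodes(model, doc):
    """Greedy matching of layout objects by similarity of spatial-relation signatures (illustrative)."""
    matches = {}
    for n in model.nodes:
        sig = signature(model, n)
        def cost(m):
            return sum(abs(a - b) for s, t in zip(sig, signature(doc, m)) for a, b in zip(s, t))
        matches[n] = min(doc.nodes, key=cost)
    return matches

# Hypothetical document model (one invoice-like class) and a newly arrived document of the same class.
model_arg = build_arg({"date": (400, 50, 100, 20), "total": (400, 700, 120, 25), "client": (50, 120, 200, 20)})
doc_arg = build_arg({"zone_1": (395, 55, 100, 20), "zone_2": (410, 690, 120, 25), "zone_3": (60, 125, 200, 20)})
print(match_nodes(model_arg, doc_arg))   # expected: date -> zone_1, total -> zone_2, client -> zone_3
```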

Keywords: administrative document understanding, logical labelling, logical layout analysis, fields extraction from administrative documents

Procedia PDF Downloads 209
6072 The Image of Saddam Hussein and Collective Memory: The Semiotics of Ba'ath Regime's Mural in Iraq (1980-2003)

Authors: Maryam Pirdehghan

Abstract:

During the Ba'ath Party's rule in Iraq, propaganda was utilized to justify the regime and to promote Saddam Hussein's image in the collective memory as the greatest Arab leader. Consequently, urban walls were routinely covered with images of Saddam. Relying on these images, the regime aimed to evoke meanings in public opinion that would supposedly strengthen Saddam's power and reconstruct facts to legitimize his political ideology. Nonetheless, Saddam was not always portrayed with common and explicit elements; in certain periods of his rule, the paintings depicted him in an unusual context in which various historical and contemporary elements were combined in a narrative background. Therefore, an understanding of the implied socio-political references of these elements is required to fully elucidate the impact of these images on forming the memory and collective unconscious of the Iraqi people. To obtain such understanding, the following questions need to be addressed: a) How was Saddam Hussein portrayed in murals during his rule? b) What elements and mythical-historical narratives are found in the paintings? c) Which of Saddam's political views were impressed on the collective memory through murals? Employing visual semiotics, this study reveals that during Saddam Hussein's regime the paintings were initially simple portraits but gradually transformed into narrative images characterized by a complex network of historical, mythical and religious elements. These elements demonstrate the transformation of a secular-nationalist politician into a Muslim ruler who tried to instill three major policies in domestic and international relations, i.e. the Arabization of Iraq and the propagation of pan-Arabism ideology (first period), the implementation of an anti-Israel policy (second period), and the implementation of an anti-American-British policy (last period).

Keywords: Ba'ath Party, Saddam Hussein, mural, Iraq, propaganda, collective memory

Procedia PDF Downloads 322
6071 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion

Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao

Abstract:

Due to the low spatial resolution of MODIS data, the accuracy of extracting small patches in landscapes with a high degree of fragmentation is greatly limited. To this end, the study combines Landsat data, with its higher spatial resolution, and MODIS data, with its higher temporal resolution, for decision-level fusion. Considering the importance of land heterogeneity in the fusion process, it is incorporated as a weighting factor used to linearly weight the Landsat classification result and the MODIS classification result. Three levels were used to complete the data fusion process: the MODIS pixel level, the Landsat pixel level, and an object level connecting the two. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test showed that the classification accuracy was improved over the single-data-source classification results in terms of overall accuracy. The method was also compared with the two-level combination results and with a weighted-sum decision-rule-based approach. The decision fusion scheme is extensible to other multi-resolution data decision fusion applications.
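
The weighted decision-level fusion idea can be sketched as a linear combination of per-class probabilities from the two classifiers, with the weight driven by a land-heterogeneity factor; the arrays and weighting scheme below are made up and only indicate the shape of the computation.

```python
import numpy as np

# Per-pixel class probabilities from two classifiers (rows = pixels, columns = land-cover classes).
# In practice these come from classifying Landsat and (resampled) MODIS imagery.
p_landsat = np.array([[0.7, 0.2, 0.1],
                      [0.3, 0.4, 0.3],
                      [0.1, 0.1, 0.8]])
p_modis = np.array([[0.5, 0.3, 0.2],
                    [0.2, 0.6, 0.2],
                    [0.2, 0.2, 0.6]])

# Land-heterogeneity factor per pixel (0 = homogeneous, 1 = highly fragmented landscape).
# More heterogeneous pixels lean more on the finer-resolution Landsat result.
heterogeneity = np.array([0.8, 0.2, 0.5])
w_landsat = 0.5 + 0.5 * heterogeneity          # illustrative weighting scheme, not the paper's exact one

fused = w_landsat[:, None] * p_landsat + (1.0 - w_landsat[:, None]) * p_modis
labels = fused.argmax(axis=1)
print("Fused class probabilities:\n", np.round(fused, 3))
print("Fused labels:", labels)
```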

Keywords: image classification, decision fusion, multi-temporal, remote sensing

Procedia PDF Downloads 118
6070 A Review of BIM Applications for Heritage and Historic Buildings: Challenges and Solutions

Authors: Reza Yadollahi, Arash Hejazi, Dante Savasta

Abstract:

Building Information Modeling (BIM) is growing rapidly in construction projects around the world. Considering BIM's weaknesses when applied to existing heritage and historical buildings, it is critical to facilitate BIM application for such structures. One of the pieces of information needed to build a model in BIM is the materials and their characteristics, and a material library is essential to speed up the entry of project information. To save time and prevent cost overruns, a BIM object material library should be provided. However, the lack of information and documents for historical buildings is typically a challenge in renovation and retrofitting projects. Because of this lack of case documents, importing data is a time-consuming task, which can be improved by creating BIM libraries. Based on previous research, this paper reviews the complexities and challenges in BIM modeling for heritage, historic, and architectural buildings. By identifying the strengths and weaknesses of standard BIM systems, recommendations are provided to enhance the modeling platform.

Keywords: building Information modeling, historic, heritage buildings, material library

Procedia PDF Downloads 112
6069 Branched Chain Amino Acid Kinesio PVP Gel Tape from Extract of Pea (Pisum sativum L.) Based on Ultrasound-Assisted Extraction Technology

Authors: Doni Dermawan

Abstract:

Modern sports competition, as a consequence of the increasing business and entertainment value of sport, demands that athletes maintain excellent physical endurance. Prolonged and intensive physical exercise may pose a risk of muscle tissue damage, indicated by an increase in the enzyme creatine kinase. Branched Chain Amino Acids (BCAA) are essential amino acids composed of leucine, isoleucine, and valine, which serve to maintain muscle tissue, support the immune system, and prevent loss of coordination and muscle pain. Pea (Pisum sativum L.) is a leguminous plant rich in BCAA: every gram of pea protein contains 82.7 mg of leucine, 56.3 mg of isoleucine, and 56.0 mg of valine. This research aims to develop BCAA from pea extract applied in a PVP gel Kinesio tape dosage form using ultrasound-assisted extraction. The method used in writing this paper is the Cochrane Collaboration Review, which includes literature studies, testing the quality of the studies, characteristics of data collection, analysis, interpretation of results, and clinical trials, as well as recommendations for further research. Extraction of BCAA from pea is performed using ultrasound-assisted extraction technology, with optimization variables including the extraction solvent (0.1% NaOH), temperature (20-25°C), time (15-30 minutes), power (80 W) and ultrasonic frequency (35 kHz). The advantages of this extraction method are the high penetration of the solvent into the cell membrane and the increased mass transfer, making the BCAA separation process more efficient. The BCAA extract is then incorporated into the PVP (polyvinylpyrrolidone) gel, composed of PVP K30 and HPMC K100 dissolved in 10 mL of water-methanol (1:1) v/v. In the PVP gel Kinesio tape preparation, the BCAA in the gel is absorbed into the muscle tissue and joints, while the tape's tensile force stimulates muscle circulation with variable pressure, so that biomechanical movement improves and creatine kinase-related muscle damage is prevented. Analysis and evaluation of the test preparation include interaction, thickness, weight uniformity, humidity, water vapor permeability, levels of the active substance, content uniformity, percentage elongation, stability testing, release profile, in vitro permeation and in vivo skin irritation testing.

Keywords: branched chain amino acid, BCAA, Kinesio tape, pea, PVP gel, ultrasound-assisted extraction

Procedia PDF Downloads 286
6068 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique

Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu

Abstract:

Recently, medical imaging, and specifically medical image processing, has become one of the most dynamically developing areas of medical science, leading to new approaches to the prevention, diagnosis, and treatment of various diseases. In the diagnosis of lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to incorrect diagnosis or sampling of lung tissue. Identifying and demarcating masses in order to detect cancer within lung tissue is therefore a critical challenge in diagnosis. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, the use and validation of a novel lung cancer detection algorithm are presented through simulation, performed on CT images using multilevel thresholding. The proposed technique consists of segmentation, feature extraction, and feature selection and classification. In more detail, the features carrying useful information are selected after feature extraction. Eventually, the output image of lung cancer is obtained with 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve employing the current feature extraction method to achieve more accurate resulting images, including further details available to machine vision systems to recognise objects in lung CT scan images.
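
The multilevel-thresholding segmentation step can be sketched with scikit-image's multi-Otsu; the "CT slice" below is a synthetic placeholder and the downstream feature extraction, selection and classification stages are only indicated in comments.

```python
import numpy as np
from skimage.filters import threshold_multiotsu

# Placeholder for a lung CT slice; in practice, load the DICOM/PNG slice as a 2-D array.
rng = np.random.default_rng(0)
ct_slice = np.clip(rng.normal(0.35, 0.15, (256, 256)), 0, 1)
ct_slice[100:130, 120:150] += 0.45          # synthetic bright "lesion" region

# Multilevel (3-class) Otsu thresholding: background, lung parenchyma, dense tissue / candidate mass.
thresholds = threshold_multiotsu(ct_slice, classes=3)
regions = np.digitize(ct_slice, bins=thresholds)

candidate_mask = regions == 2                # the brightest class holds candidate lesions
print("Thresholds:", np.round(thresholds, 3))
print("Candidate-lesion pixels:", int(candidate_mask.sum()))
# Next stages (not shown): extract shape/intensity features from each candidate region,
# select the informative ones, and classify lesion vs. non-lesion.
```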

Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing

Procedia PDF Downloads 94
6067 Investigation of the EEG Signal Parameters during Epileptic Seizure Phases in Consequence to the Application of External Healing Therapy on Subjects

Authors: Karan Sharma, Ajay Kumar

Abstract:

An epileptic seizure is a condition in which electrical charge in the brain discharges abruptly, resulting in abnormal activity by the subject. About one percent of the world's population experiences epileptic seizures. Due to the abrupt flow of charge, EEG (electroencephalogram) waveforms change, and many spikes and sharp waves appear in the EEG signals. Detection of epileptic seizures using conventional methods is time-consuming, and many methods have been developed to detect them automatically. The initial part of this paper provides a review of techniques used to detect epileptic seizures automatically. Automatic detection is based on feature extraction and classification patterns, and for better accuracy, decomposition of the signal is required before feature extraction. A number of parameters are calculated by researchers using different techniques, e.g. approximate entropy, sample entropy, fuzzy approximate entropy, intrinsic mode functions, cross-correlation, etc., to discriminate between a normal signal and an epileptic seizure signal. The main objective of this review paper is to present the variations in EEG signals at both stages, (i) interictal (recorded between epileptic seizure attacks) and (ii) ictal (recorded during an epileptic seizure), using the most appropriate methods of analysis to provide better healthcare diagnosis. The paper then investigates the effects of a noninvasive healing therapy on subjects by studying the EEG signals using the latest signal processing techniques. The study has been conducted with Reiki as the healing technique, considered beneficial for restoring balance in cases of body-mind alterations associated with an epileptic seizure. Reiki is practiced around the world and is recommended in different health services as a treatment approach. Reiki is an energy medicine, specifically a biofield therapy developed in Japan in the early 20th century; it is a system involving the laying on of hands to stimulate the body's natural energetic system. Earlier studies have shown an apparent connection between Reiki and the autonomic nervous system. The Reiki sessions are applied by an experienced therapist. EEG signals are measured at baseline, during the session and post-intervention to bring about effective epileptic seizure control or its elimination altogether.
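
Among the parameters listed, sample entropy is one of the most common discriminators between interictal and ictal EEG; a compact implementation (m = 2, r = 0.2·SD, Chebyshev distance, self-matches excluded) is sketched below on synthetic signals, and the two test signals are toys rather than real EEG recordings.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal: -ln(A/B), where B counts template
    matches of length m and A of length m+1 (Chebyshev distance < r, self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist < r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

rng = np.random.default_rng(0)
t = np.linspace(0, 4, 1024)
background_like = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)   # toy "interictal" signal
spiky = np.sin(2 * np.pi * 3 * t) + 2.0 * (rng.random(t.size) < 0.02)          # toy spike-laden signal
print("SampEn (background-like):", round(sample_entropy(background_like), 3))
print("SampEn (spiky):", round(sample_entropy(spiky), 3))
```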

Keywords: EEG signal, Reiki, time consuming, epileptic seizure

Procedia PDF Downloads 402
6066 Theorem on Inconsistency of The Classical Logic

Authors: T. J. Stepien, L. T. Stepien

Abstract:

This abstract concerns an extremely fundamental issue. Namely, the fundamental problem of science is the issue of consistency. In this abstract, we present the theorem saying that the classical calculus of quantifiers is inconsistent in the traditional sense. At the beginning, we introduce a notation, and later we remind the definition of the consistency in the traditional sense. S1 is the set of all well-formed formulas in the calculus of quantifiers. RS1 denotes the set of all rules over the set S1. Cn(R, X) is the set of all formulas standardly provable from X by rules R, where R is a subset of RS1, and X is a subset of S1. The couple < R,X > is called a system, whenever R is a subset of RS1, and X is a subset of S1. Definition: The system < R,X > is consistent in the traditional sense if there does not exist any formula from the set S1, such that this formula and its negation are provable from X, by using rules from R. Finally, < R0+, L2 > denotes the classical calculus of quantifiers, where R0+ consists of Modus Ponens and the generalization rule. L2 is the set of all formulas valid in the classical calculus of quantifiers. The Main Result: The system < R0+, L2 > is inconsistent in the traditional sense.
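
For readers who prefer symbolic notation, the abstract's definitions and main claim can be restated as follows; this is only a restatement of what the abstract itself says, not additional material from the paper.

```latex
% S_1: all well-formed formulas of the calculus of quantifiers; R_{S_1}: all rules over S_1.
% Cn(R, X): formulas standardly provable from X by the rules in R, for R \subseteq R_{S_1}, X \subseteq S_1.
\[
\langle R, X\rangle \text{ is consistent in the traditional sense} \iff
\neg\,\exists\, \alpha \in S_1 :\ \alpha \in Cn(R, X) \ \wedge\ \lnot\alpha \in Cn(R, X).
\]
\[
\text{Main result:}\quad \exists\, \alpha \in S_1 :\ \alpha \in Cn(R_0^{+}, L_2) \ \wedge\ \lnot\alpha \in Cn(R_0^{+}, L_2),
\]
% i.e. the classical calculus of quantifiers <R_0^+, L_2> is inconsistent in the traditional sense,
% where R_0^+ = {Modus Ponens, generalization rule} and L_2 = the set of formulas valid in that calculus.
```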

Keywords: classical calculus of quantifiers, classical predicate calculus, generalization rule, consistency in the traditional sense, Modus Ponens

Procedia PDF Downloads 198
6065 Groundwater Utilization and Sustainability: A Case Study of Pydibheemavaram Industrial Area, India

Authors: G. Venkata Rao, R. Srinivasa Rao, B. Neelima Sri Priya

Abstract:

Over-extraction of groundwater from coastal aquifers results in a reduction of the groundwater resource and a lowering of the water level. In general, the depletion of the groundwater level enhances the landward migration of the saltwater wedge. Nowadays, groundwater extraction increases year by year because of growing population and industrialization. Groundwater is the only source for irrigation, domestic and industrial purposes in the Pydibheemavaram industrial area, located in the coastal belt of Srikakulam district, India, between latitudes 18.099°N and 18.145°N and longitudes 83.627°E and 83.674°E. The present study attempts to calculate the amount of water recharged into this aquifer and the rainfall pattern for the past two decades; runoff is calculated using Khosla's formula with the available rainfall and temperature data for the study area. A decision support model has been developed on the basis of the monthly extractions of water from the ground through bore wells and the net recharge of the aquifer. It is concluded that extraction exceeds recharge from May to October in a given year, which in turn damages the water balance in the subsurface layers.
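
One common monthly form of Khosla's runoff formula is R_m = P_m − 0.48·T_m (rainfall and runoff in cm, mean monthly temperature in °C, runoff floored at zero); the sketch below applies that form to made-up monthly data, and readers should check the exact variant and data used in the study.

```python
# Monthly runoff by a common form of Khosla's formula: R_m = P_m - 0.48 * T_m
# (P, R in cm; T = mean monthly temperature in degC; runoff floored at zero).
# Monthly values below are illustrative only, not measurements from the study area.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
rainfall_cm = [0.5, 0.8, 1.0, 2.5, 6.0, 11.0, 16.0, 18.0, 15.0, 22.0, 8.0, 1.5]
mean_temp_c = [23, 25, 28, 30, 32, 31, 29, 29, 29, 28, 25, 23]

runoff_cm = [max(p - 0.48 * t, 0.0) for p, t in zip(rainfall_cm, mean_temp_c)]
annual_runoff = sum(runoff_cm)
water_retained = sum(rainfall_cm) - annual_runoff   # rainfall not converted to runoff (losses + potential recharge)

for m, p, r in zip(months, rainfall_cm, runoff_cm):
    print(f"{m}: rainfall {p:5.1f} cm, runoff {r:5.1f} cm")
print(f"Annual runoff: {annual_runoff:.1f} cm; rainfall minus runoff: {water_retained:.1f} cm")
```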

Keywords: aquifer, decision support model, groundwater extraction, run off estimation and rainfall

Procedia PDF Downloads 297
6064 Optimizing Boiler Combustion System in a Petrochemical Plant Using Neuro-Fuzzy Inference System and Genetic Algorithm

Authors: Yul Y. Nazaruddin, Anas Y. Widiaribowo, Satriyo Nugroho

Abstract:

The boiler is one of the critical units in a petrochemical plant. Steam produced by the boiler is used for various processes in the plant, such as the urea and ammonia plants. An alternative method to optimize the boiler combustion system is presented in this paper. An Adaptive Neuro-Fuzzy Inference System (ANFIS) approach is applied to model the boiler using real-time operational data collected from a boiler unit of the petrochemical plant. The nonlinear equation obtained is then used to optimize the air-to-fuel ratio using a genetic algorithm, resulting in an optimal ratio of 15.85. This optimal ratio is then kept constant by a ratio controller designed using ANFIS-based inverse dynamics. As a result, a constant oxygen content in the flue gas is obtained, which indicates a more efficient combustion process.
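
The optimisation step can be sketched with a tiny real-valued genetic algorithm; here the fitted ANFIS boiler model is replaced by a made-up efficiency curve peaking near the reported optimum of 15.85, and the search range, population size and operators are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def efficiency(afr):
    """Stand-in for the ANFIS boiler model: a made-up efficiency curve peaking near AFR = 15.85."""
    return np.exp(-((afr - 15.85) ** 2) / 4.0)

# Simple real-valued genetic algorithm over the air-to-fuel ratio (search range assumed).
pop = rng.uniform(10.0, 22.0, size=30)
for generation in range(40):
    fitness = efficiency(pop)
    # Tournament selection of parents
    parents = np.array([pop[max(rng.integers(0, len(pop), 2), key=lambda i: fitness[i])] for _ in range(len(pop))])
    # Blend crossover between consecutive parents
    alpha = rng.random(len(pop))
    children = alpha * parents + (1 - alpha) * np.roll(parents, 1)
    # Gaussian mutation applied to a fraction of the offspring
    children += rng.normal(0, 0.2, len(pop)) * (rng.random(len(pop)) < 0.3)
    # Elitism: keep the best individual found in this generation
    children[0] = pop[np.argmax(fitness)]
    pop = np.clip(children, 10.0, 22.0)

best = pop[np.argmax(efficiency(pop))]
print(f"GA-optimised air-to-fuel ratio: {best:.2f}")
```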

Keywords: ANFIS, boiler, combustion process, genetic algorithm, optimization.

Procedia PDF Downloads 247
6063 Geometallurgy of Niobium Deposits: An Integrated Multi-Disciplined Approach

Authors: Mohamed Nasraoui

Abstract:

Spatial ore distribution, ore heterogeneity and their links with the geological processes involved in niobium concentration are all factors to consider when bridging field observations to an extraction scheme. Indeed, mineralogical changes of the Nb-hosting phases and their textural relationships with hydrothermal or secondary minerals exert a key control over mineral processing. This study, based on both field work and ore characterization, presents data from several Nb deposits related to carbonatite complexes. The results, obtained by a wide range of analytical techniques including XRD, XRF, ICP-MS, SEM, microprobe, Spectro-CL, FTIR-DTA and Mössbauer spectroscopy, demonstrate how geometallurgical assessment, at all stages of mine development, can greatly assist in the design of a suitable extraction flowsheet and in data reconciliation.

Keywords: carbonatites, Nb-geometallurgy, Nb-mineralogy, mineral processing.

Procedia PDF Downloads 160
6062 The Incessant Subversion of Judiciary by African Political Leaders

Authors: Joy Olayemi Gbala, Fatai Olatokunbo, Philip Cloud

Abstract:

Catastrophic dictatorship has been identified as the major leadership challenge in Africa, producing stagnant economies and dysfunctional democracies through willful misappropriation of resources and egregious subversion of the rule of law. Almost invariably, African leaders become power-drunk, which usually leads to abuse of state power, abdication of constitutional duties, unjust withdrawal of business operating licenses, human rights violations, electoral malpractice, financial corruption, disruption of the policies of democratic government transition, annulment of free and fair elections, disruption of legal electoral procedures, unachievable dividends of democracy and more. Owing to this, most African nations have gone, and still go, through political unrest and insurgencies leading to loss of lives and property, violent protests, detention of detractors and political activists, and massive human displacement. This research work investigates the causes, menace, consequences and impacts of subverting the rule of law in Africa on the economy and development of the continent, and suggests a practical solution to these plights.

Keywords: corruption, law, leadership, violation

Procedia PDF Downloads 151
6061 Diversity in Finance Literature Revealed through the Lens of Machine Learning: A Topic Modeling Approach on Academic Papers

Authors: Oumaima Lahmar

Abstract:

This paper aims to define a structured topography for finance researchers seeking to navigate the body of knowledge in their extrapolation of finance phenomena. To make sense of the body of knowledge in finance, a probabilistic topic modeling approach is applied to 6,000 abstracts of academic articles published in three top finance journals between 1976 and 2020. This approach combines machine learning techniques and natural language processing to statistically identify the connections between research articles and their shared topics, each described by relevant keywords. The topic modeling analysis reveals 35 coherent topics that depict the finance literature well and provide a comprehensive structure for the ongoing research themes. Comparing the extracted topics to the Journal of Economic Literature (JEL) classification system, a significant similarity between the characterizing keywords is highlighted. On the other hand, we identify other topics that do not match the JEL classification despite being relevant in the finance literature.
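
A minimal sketch of such a probabilistic topic model (LDA over a bag-of-words representation of abstracts) is shown below; the placeholder abstracts and the three-topic setting are illustrative, whereas the study fits 35 topics to roughly 6,000 abstracts.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Placeholder abstracts; the study applies the same kind of pipeline to ~6,000 abstracts.
abstracts = [
    "asset pricing model risk premium cross section of stock returns",
    "corporate governance board structure and firm value",
    "initial public offerings underpricing and long run performance",
    "credit risk default probability and bond spreads",
    "mergers acquisitions announcement returns and synergies",
    "liquidity market microstructure and bid ask spreads",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(abstracts)

lda = LatentDirichletAllocation(n_components=3, random_state=0)   # 35 components in the actual study
lda.fit(doc_term)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top_terms)}")
```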

Keywords: finance literature, textual analysis, topic modeling, perplexity

Procedia PDF Downloads 166
6060 Modeling and Analyzing Controversy in Large-Scale Cyber-Argumentation

Authors: Najla Althuniyan

Abstract:

Online discussions take place across different platforms. These discussions have the potential to extract crowd wisdom and capture collective intelligence from a different perspective. However, certain phenomena, such as controversy, often appear in online argumentation and make the discussion between participants heated. Heated discussions can be used to extract new knowledge. Therefore, detecting the presence of controversy is an essential task in determining whether collective intelligence can be extracted from online discussions. This paper uses existing measures for estimating controversy quantitatively in cyber-argumentation. First, it defines controversy in different fields, and then it identifies the attributes of controversy in online discussions. The distributions of user opinions and the distances between opinions are used to calculate the degree of controversy of a discussion. Finally, the results from each controversy measure are discussed and analyzed using an empirical study generated by a cyber-argumentation tool. This is an improvement over existing measurements because it does not require ground-truth data or specific settings and can be adapted to distribution-based or distance-based opinions.
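
Two illustrative quantities in the spirit of the abstract, a distance-based score from mean pairwise disagreement and a distribution-based score from the entropy of binned opinions, are sketched below; they are not necessarily the exact measures used in the paper, and the opinion vectors are invented.

```python
import numpy as np

def distance_controversy(opinions):
    """Mean pairwise distance between opinion values (opinions assumed on a 0-1 scale)."""
    opinions = np.asarray(opinions, dtype=float)
    diffs = np.abs(opinions[:, None] - opinions[None, :])
    n = len(opinions)
    return diffs.sum() / (n * (n - 1)) if n > 1 else 0.0

def distribution_controversy(opinions, bins=5):
    """Normalised entropy of the opinion distribution: 1 = opinions spread evenly, 0 = full agreement."""
    hist, _ = np.histogram(opinions, bins=bins, range=(0, 1))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(bins))

consensus = [0.8, 0.85, 0.9, 0.82, 0.88]           # participants largely agree
polarised = [0.05, 0.1, 0.0, 0.95, 0.9, 1.0]        # two opposing camps
for name, ops in [("consensus", consensus), ("polarised", polarised)]:
    print(f"{name}: distance-based = {distance_controversy(ops):.2f}, "
          f"distribution-based = {distribution_controversy(ops):.2f}")
```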

Keywords: online argumentation, controversy, collective intelligence, agreement analysis, collaborative decision-making, fuzzy logic

Procedia PDF Downloads 114
6059 Importance of Mathematical Modeling in Teaching Mathematics

Authors: Selahattin Gultekin

Abstract:

Today, in engineering departments, mathematics courses such as calculus, linear algebra and differential equations are generally taught by mathematicians. Therefore, during classroom teaching there are few or no applications of the concepts to real-world problems. Most of the time, students do not know whether the concepts or rules taught in these courses will be used extensively in their majors or not. This situation holds true for all engineering and science disciplines, and the general attitude toward these mathematics courses is not good. The real-life application of mathematics will be appreciated by students when the mathematical modeling of real-world problems is tackled: students do not like abstract mathematics; rather, they prefer solid applications of the concepts to daily-life problems. The author highly recommends that mathematical modeling be taught starting in high schools all over the world. In this paper, some mathematical concepts such as the limit, derivative, integral, Taylor series, differential equations and the mean value theorem are chosen, and their applications to real problems are emphasized with graphical representations.

Keywords: applied mathematics, engineering mathematics, mathematical concepts, mathematical modeling

Procedia PDF Downloads 312