Search results for: feature selection feature subset selection feature extraction/transformation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7128

5928 Transient Free Laminar Convection in the Vicinity of a Thermal Conductive Vertical Plate

Authors: Anna Bykalyuk, Frédéric Kuznik, Kévyn Johannes

Abstract:

In this paper, the influence of a vertical plate’s thermal capacity is numerically investigated in order to evaluate the evolution of the thermal boundary layer structure, the convective heat transfer coefficient, and the velocity and temperature profiles, while the heat flux of the heated vertical plate is evaluated under time-dependent boundary conditions. The most important feature of this problem is the unsteadiness of the physical phenomena. A 2D CFD model is developed in the Ansys Fluent 14.0 environment and is validated using unsteady data obtained for plasterboard subjected to a dynamic temperature evolution. All the phenomena produced in the vicinity of the thermal conductive vertical plate (plasterboard) are analyzed and discussed. This work is the first stage of a holistic research effort on transient free convection that aims, in the future, to study natural convection in the vicinity of a vertical plate containing Phase Change Materials (PCM).

Keywords: CFD modeling, natural convection, thermal conductive plate, time-dependent boundary conditions

Procedia PDF Downloads 278
5927 Biocompatible Ionic Liquids in Liquid-Liquid Extraction of Lactic Acid: A Comparative Study

Authors: Konstantza Tonova, Ivan Svinyarov, Milen G. Bogdanov

Abstract:

Ionic liquids consisting of pairs of an imidazolium or phosphonium cation and a chloride or saccharinate anion were synthesized and compared with respect to their extraction efficiency towards fermentative L-lactic acid. The acid partitioning in the equilibrated biphasic systems of ionic liquid and water was quantified through the extraction degree and the partition coefficient. The water transfer from the aqueous into the ionic liquid-rich phase was also monitored throughout. The effect of pH, which determines the state of lactic acid in the aqueous source, was studied. The effect of other salting-out substances that modify the ionic liquid/water equilibrium was also investigated, with a view to identifying the liquid-liquid system that best combines low toxicity, high extraction and back-extraction efficiency, and simplicity of operation.
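
For reference, the two partition metrics named above are conventionally defined as follows; this is a standard textbook formulation, not equations quoted from the paper:

```latex
K = \frac{C_{\mathrm{IL}}}{C_{\mathrm{aq}}}, \qquad
E\,(\%) = \frac{n_{\mathrm{IL}}}{n_{\mathrm{IL}} + n_{\mathrm{aq}}} \times 100
```

where C is the equilibrium lactic acid concentration and n the amount of acid in the ionic liquid-rich (IL) and aqueous (aq) phases.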

Keywords: ionic liquids, biphasic system, extraction, lactic acid

Procedia PDF Downloads 482
5926 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor

Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric

Abstract:

Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in a reasonably short time, which requires, first of all, a powerful dynamic car detector model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach computes block-wise HOG features from foreground blobs using a configurable search window and scan path, in order to overcome the computing-time shortcomings of the HOG descriptor and improve its performance in dynamic applications. We show that the block-wise HOG descriptor combined with motion parameters yields a very suitable car detector, which achieves a satisfactory recognition rate in record time in dynamic outdoor areas and outperforms several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
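
As an illustration of the block-wise HOG-plus-motion idea described above, the following Python sketch fuses background-subtraction blobs with HOG features. It is a minimal reconstruction under assumed settings (window size, blob-area threshold, input video name), not the authors' implementation:

```python
# Sketch only: motion (background subtraction) + block-wise HOG per blob.
import cv2
from skimage.feature import hog

def describe_blob(gray, bbox, win=(64, 64)):
    """Resize a foreground blob to a fixed window and compute its HOG vector."""
    x, y, w, h = bbox
    patch = cv2.resize(gray[y:y + h, x:x + w], win)
    # 9 orientations, 8x8-pixel cells, 2x2-cell blocks: common HOG settings
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys')

backsub = cv2.createBackgroundSubtractorMOG2()
cap = cv2.VideoCapture('traffic.avi')           # hypothetical input video
ok, frame = cap.read()
while ok:
    mask = backsub.apply(frame)                 # motion: foreground mask
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [cv2.boundingRect(c) for c in contours
             if cv2.contourArea(c) > 500]       # assumed blob-area threshold
    feats = [describe_blob(gray, b) for b in blobs]  # feed to a car classifier
    ok, frame = cap.read()
```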

Keywords: car-detector, HOG, motion, computing time

Procedia PDF Downloads 323
5925 Requirement Engineering Within Open Source Software Development: A Case Study

Authors: Kars Beek, Remco Groeneveld, Sjaak Brinkkemper

Abstract:

Although there is much literature available on requirements documentation in traditional software development, few studies have been conducted on this topic in open source software development. While open source software development is becoming more important, its development processes are often not as structured as corporate software development processes. Papers show that communities creating open source software often lack structure and documentation. However, most studies on this topic are ten or more years old. Therefore, this research was conducted to determine whether the lack of structure and documentation in requirements engineering is still the situation in these communities today. Three open source products were chosen as subjects for this research. The data were gathered through interviews, observations, and analyses of feature proposals and issue-tracking tools. In this paper, we present a comparison and an analysis of the different methods used for requirements documentation in order to understand the current practices of requirements documentation in open source software development.

Keywords: case study, open source software, open source software development, requirement elicitation, requirement engineering

Procedia PDF Downloads 106
5924 Evaluation of Reliability Flood Control System Based on Uncertainty of Flood Discharge, Case Study Wulan River, Central Java, Indonesia

Authors: Anik Sarminingsih, Krishna V. Pradana

Abstract:

The failure of a flood control system can be caused by various factors, for example when the uncertainty of the design flood is not considered and the capacity of the system is consequently exceeded. The presence of uncertainty is recognized as a serious issue in hydrological studies. Uncertainty in hydrological analysis is influenced by many factors, from the reading of water elevation and rainfall data to the selection of the method of analysis. In hydrological modeling, the selection of models and parameters corresponding to the watershed conditions should be evaluated with a hydraulic model of the river as a drainage channel. River cross-section capacity is the first line of defense in assessing the reliability of the flood control system, and the reliability of river capacity describes the potential magnitude of flood risk. The case study in this research is the Wulan River in Central Java. This river floods almost every year despite control efforts such as levees, a floodway, and a diversion. The flood-affected areas include several sub-districts, mainly in Kabupaten Kudus and Kabupaten Demak. The first step is a frequency analysis of discharge observations from the Klambu weir, for which time-series data from 1951 to 2013 are available. The frequency analysis is performed using several frequency distribution models, namely the Gumbel, Normal, Log-Normal, Pearson Type III, and Log-Pearson distributions. The results of the models overlap within one standard deviation, so the maximum flood discharge for a lower return period may exceed the average discharge for a larger return period. The next step is a hydraulic analysis to evaluate the reliability of river capacity against the flood discharges resulting from the several methods. The design flood discharge of the flood control system is then selected from the method whose result is closest to the bankfull capacity of the river.
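
The frequency-analysis step described above can be sketched with scipy.stats; the distribution set follows the abstract, while the input file name and return periods are illustrative assumptions:

```python
# Sketch of flood frequency analysis on annual maximum discharges (m^3/s).
import numpy as np
from scipy import stats

q = np.loadtxt('klambu_annual_max.txt')   # hypothetical 1951-2013 series
T = np.array([2, 5, 10, 25, 50, 100])     # return periods (years), assumed
p_exceed = 1.0 / T                        # annual exceedance probability

candidates = {
    'Gumbel':      stats.gumbel_r,
    'Normal':      stats.norm,
    'Log-Normal':  stats.lognorm,
    'Pearson III': stats.pearson3,
}
for name, dist in candidates.items():
    params = dist.fit(q)
    q_design = dist.ppf(1 - p_exceed, *params)   # design flood per return period
    print(name, np.round(q_design, 1))

# Log-Pearson III: fit Pearson III to log-transformed discharges
lp3 = stats.pearson3.fit(np.log10(q))
q_lp3 = 10.0 ** stats.pearson3.ppf(1 - p_exceed, *lp3)
print('Log-Pearson III', np.round(q_lp3, 1))
```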

Keywords: design flood, hydrological model, reliability, uncertainty, Wulan river

Procedia PDF Downloads 294
5923 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument

Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki

Abstract:

To the best of the authors' knowledge, we experimentally demonstrate here for the first time a quantified correlation between optical features of the ambient aerosol measured in real time and toxicity data measured off-line, and, using these correlations, we present a novel methodology for the real-time characterisation of ambient toxicity based on multi-wavelength aerosol-phase photoacoustic measurement. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in climate science nowadays. Beyond its climatic impact, atmospheric soot also plays an important role as an air pollutant that harms human health. Moreover, according to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source, while from a health perspective it is one of the most harmful atmospheric constituents. Despite its importance, a generally accepted standard methodology for the quantitative determination of ambient toxicity is not yet available. Ambient toxicity measurement is predominantly based on the posterior analysis of filter-accumulated aerosol, with limited time resolution. Most toxicological studies are based on operational definitions using different measurement protocols; therefore, comprehensive analysis of the existing data sets is in many cases quite limited. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be masked significantly by the actual ambient factors. Therefore, improving the time resolution of the existing methodology and developing real-time methodologies for air quality monitoring are pressing issues in air pollution research. During the last decades, many experimental studies have verified a relation between the chemical composition of carbonaceous particulate matter and its absorption feature, quantified by the Absorption Ångström Exponent (AAE). Although the scientific community agrees that photoacoustic spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol accurately and reliably, multi-wavelength PAS instruments able to selectively characterise the wavelength dependence of absorption have become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. We demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), and a scanning mobility particle sizer with optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also demonstrate the results of eco-, cyto- and genotoxicity measurements based on the posterior analysis of filter-accumulated aerosol with 6 h time resolution. We demonstrate a diurnal variation of toxicities and of AAE data deduced directly from the multi-wavelength absorption measurements.
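
The AAE mentioned above follows the standard power-law definition b_abs(λ) ∝ λ^(-AAE). A minimal sketch of how it can be deduced from multi-wavelength absorption data is given below; the wavelength set and readings are assumptions, not campaign data:

```python
# Deduce AAE from multi-wavelength absorption via a log-log linear fit.
import numpy as np

wavelengths = np.array([266.0, 355.0, 532.0, 1064.0])  # nm, assumed 4-lambda set
b_abs = np.array([95.0, 60.0, 32.0, 12.0])             # Mm^-1, placeholder data

# slope of ln(b_abs) vs ln(lambda) equals -AAE
aae = -np.polyfit(np.log(wavelengths), np.log(b_abs), 1)[0]
print(f'AAE = {aae:.2f}')   # ~1 for soot-like, >1 for organic/brown carbon
```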

Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames-test

Procedia PDF Downloads 304
5922 Hierarchical Tree Long Short-Term Memory for Sentence Representations

Authors: Xiuying Wang, Changliang Li, Bo Xu

Abstract:

A fixed-length feature vector is required by many machine learning algorithms in the NLP field. Word embeddings have been very successful at learning lexical information. However, they cannot capture the compositional meaning of sentences, which prevents a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split each sentence into three hierarchical levels: short phrase, long phrase, and full sentence. The HTLSTM model gives our algorithm the potential to fully exploit the hierarchical information and long-term dependencies of language. We design experiments on both English and Chinese corpora to evaluate our model on sentiment analysis tasks, and the results show that our model significantly outperforms several existing state-of-the-art approaches.
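
A minimal PyTorch sketch of the three-level composition described above is given below. The real HTLSTM composes over syntactic tree structures; here, fixed-size token chunks stand in for phrases, purely to illustrate the hierarchy:

```python
# Toy illustration of short-phrase -> long-phrase -> sentence composition.
import torch
import torch.nn as nn

class HierarchicalSentenceEncoder(nn.Module):
    def __init__(self, vocab, dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.short = nn.LSTM(dim, dim, batch_first=True)   # short-phrase level
        self.long = nn.LSTM(dim, dim, batch_first=True)    # long-phrase level
        self.sent = nn.LSTM(dim, dim, batch_first=True)    # sentence level

    def forward(self, tokens, chunk=4):
        # tokens: (batch, seq); seq must be divisible by chunk in this toy
        x = self.emb(tokens)                            # (b, t, d)
        b, t, d = x.shape
        x = x.reshape(b * (t // chunk), chunk, d)
        _, (h, _) = self.short(x)                       # encode each short phrase
        phrases = h[-1].reshape(b, t // chunk, d)       # (b, n_phrases, d)
        long_out, _ = self.long(phrases)                # long-phrase representations
        _, (h, _) = self.sent(long_out)                 # full-sentence composition
        return h[-1]                                    # (b, d) sentence vector

enc = HierarchicalSentenceEncoder(vocab=10000)
sent_vec = enc(torch.randint(0, 10000, (2, 16)))        # two toy sentences
# sent_vec would then feed a sentiment classification head.
```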

Keywords: deep learning, hierarchical tree long short-term memory, sentence representation, sentiment analysis

Procedia PDF Downloads 349
5921 Robustness of the Deep Chroma Extractor and Locally-Normalized Quarter Tone Filters in Automatic Chord Estimation under Reverberant Conditions

Authors: Luis Alvarado, Victor Poblete, Isaac Gonzalez, Yetzabeth Gonzalez

Abstract:

In MIREX 2016 (http://www.music-ir.org/mirex), the deep neural network (DNN)-based Deep Chroma Extractor, proposed by Korzeniowski and Widmer, reached the highest score in an audio chord recognition task. In the present paper, this tool is assessed under reverberant acoustic environments and distinct source-microphone distances. The evaluation dataset comprises The Beatles and Queen datasets. These datasets are sequentially re-recorded with a single microphone in a real reverberant chamber at four reverberation times (approximately 0 (anechoic), 1, 2, and 3 s) and four source-microphone distances (32, 64, 128, and 256 cm). It is expected that the performance of the trained DNN will decrease dramatically under these acoustic conditions, with signals degraded by room reverberation and distance to the source. Recently, the effect of the bio-inspired Locally-Normalized Cepstral Coefficients (LNCC) has been assessed in a text-independent speaker verification task using speech signals degraded by additive noise at different signal-to-noise ratios with variations of recording distance, as well as under reverberant conditions with variations of recording distance. LNCC performed as well as the state-of-the-art Mel Frequency Cepstral Coefficient filters. Based on these results, this paper proposes a variation of locally-normalized triangular filters called Locally-Normalized Quarter Tone (LNQT) filters. By using the LNQT spectrogram, robustness improvements of the trained Deep Chroma Extractor are expected compared with classical triangular filters, thus compensating for the degradation of the music signal and improving the accuracy of the chord recognition system.
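
A quarter-tone triangular filterbank of the kind the LNQT filters build on can be sketched as follows; the sampling rate, FFT size, and filter count are assumptions, and the local normalization step is only summarized in a comment:

```python
# Triangular filters with quarter-tone spacing (center frequencies 2^(1/24) apart).
import numpy as np

def quarter_tone_filterbank(sr=44100, n_fft=8192, f_min=65.4, n_filters=72):
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / sr)
    # n_filters + 2 quarter-tone points: each filter spans three of them
    centers = f_min * 2.0 ** (np.arange(n_filters + 2) / 24.0)
    bank = np.zeros((n_filters, freqs.size))
    for i in range(n_filters):
        lo, c, hi = centers[i], centers[i + 1], centers[i + 2]
        up = (freqs - lo) / (c - lo)         # rising edge of the triangle
        down = (hi - freqs) / (hi - c)       # falling edge
        bank[i] = np.clip(np.minimum(up, down), 0.0, None)
    return bank

bank = quarter_tone_filterbank()
# Apply to power spectra of windowed frames: spec = bank @ |rfft(frame)|**2.
# The "LN" in LNQT would then normalize each filter output by a local average
# over neighboring filters (our reading of the local-normalization idea).
```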

Keywords: chord recognition, deep neural networks, feature extraction, music information retrieval

Procedia PDF Downloads 234
5920 Compliance and Assessment Process of Information Technology in Accounting, in Turkey

Authors: Kocakaya Eda, Argun Doğan

Abstract:

This study analyzed the present state of information technology in the field of accounting through a bibliometric analysis of scientific studies on the impact of the e-billing and tax management transformation in Turkey. Through comparative bibliometric analysis, the innovations and positive effects of e-transformation in the field of accounting, in businesses, and in the information technologies used in accounting and tax management were analyzed. By evaluating the data obtained from these analyses, suggestions on the use of information technologies in accounting and tax management, and on the positive and negative effects of e-transformation on the analyzed activities of the enterprises, are emphasized. With the e-transformation that will be realized through the most efficient use of information technologies in Turkey, the synergy and efficiency of IT developments in accounting and finance should be revealed in the light of scientific data, from the smallest business to the largest economic enterprises.

Keywords: information technologies, e-invoice, e-tax management, e-transformation, accounting programs

Procedia PDF Downloads 123
5919 Secure E-Voting Using Blockchain Technology

Authors: Barkha Ramteke, Sonali Ridhorkar

Abstract:

An election is an important event in all countries. Traditional voting has several drawbacks, including the time and effort required for tallying and counting results and the cost of papers, arrangements, and everything else required to complete a voting process. Many countries are now considering online e-voting systems, but traditional e-voting systems suffer from a lack of trust: it is not known whether a vote has been counted correctly or tampered with. A lack of transparency means that voters have no assurance that their votes will be counted as cast. As blockchain technology grows in popularity, electronic voting systems are increasingly using it as an underlying storage mechanism to make the voting process more transparent and to assure data immutability. This transparency, on the other hand, may reveal critical information about candidates because all system users have the same entitlement to the data. Furthermore, because of blockchain's pseudo-anonymity, voters' privacy may be revealed, and third parties involved in the voting process, such as registration institutions, may be able to tamper with data. To overcome these difficulties, we apply Ethereum smart contracts in blockchain-based voting systems.
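
The immutability argument above rests on hash chaining. The following toy Python sketch illustrates the principle only; the system described in the abstract uses Ethereum smart contracts (written in Solidity), which this sketch does not reproduce:

```python
# Toy hash-chained vote ledger: any tampering breaks the hash links.
import hashlib
import json
import time

def make_block(votes, prev_hash):
    block = {'timestamp': time.time(), 'votes': votes, 'prev_hash': prev_hash}
    block['hash'] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Recompute every hash and check the links between consecutive blocks."""
    for prev, cur in zip(chain, chain[1:]):
        body = {k: v for k, v in cur.items() if k != 'hash'}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if cur['prev_hash'] != prev['hash'] or cur['hash'] != recomputed:
            return False
    return True

genesis = make_block([], '0' * 64)
chain = [genesis, make_block([{'voter': 'a1f...', 'choice': 2}], genesis['hash'])]
print(verify(chain))                    # True
chain[1]['votes'][0]['choice'] = 3      # an attempt to alter a recorded vote
print(verify(chain))                    # False: tampering detected
```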

Keywords: blockchain, AMV chain, electronic voting, decentralized

Procedia PDF Downloads 139
5918 Studies on Mechanical Behavior of Kevlar/Kenaf/Graphene Reinforced Polymer Based Hybrid Composites

Authors: H. K. Shivanand, Ranjith R. Hombal, Paraveej Shirahatti, Gujjalla Anil Babu, S. ShivaPrakash

Abstract:

When it comes to the selection of materials, knowledge of materials science plays a vital role in the selection and enhancement of material properties. Among engineering materials, composites play a significant role determined by their application. Composite materials are those in which two or more components with different physical and chemical properties are combined to create a new substance with enhanced properties. In this study, three different materials (Kenaf, Kevlar, and graphene) were chosen based on their properties, and a composite material was developed with the help of a vacuum bagging process. The fiber (Kenaf and Kevlar) to resin (vinyl ester) ratio was maintained at 70:30 during the process, and 0.5%, 1%, and 1.5% graphene was added during the fabrication process. The material was machined to ASTM-standard dimensions (300×300 mm, 3 mm thickness) with the help of a water-jet cutting machine. The composites were tested for mechanical properties such as interlaminar shear strength (ILSS) and flexural strength. A significant increase in these properties was found in the developed composite material.

Keywords: Kevlar, Kenaf, graphene, vacuum bagging process, Interlaminar shear strength test, flexural test

Procedia PDF Downloads 95
5917 Design and Optimization Fire Alarm System to Protect Gas Condensate Reservoirs With the Use of Nano-Technology

Authors: Hefzollah Mohammadian, Ensieh Hajeb, Mohamad Baqer Heidari

Abstract:

In this paper, for the protection and safety of gas tanks (flammable materials), and also because of the considerable economic value of the reservoir, a new system for protection, conservation, and firefighting has been designed. The system consists of several parts: sensors built with nanotechnology to detect heat and fire (nano sensors), a barrier for isolation and protection across two electronic zones, an analyzer for accurately detecting and locating the point of fire, and a main electronic board to announce a fire, diagnose faults in different locations, raise the relevant alarms, and activate the devices for fire extinguishing and announcement. An important feature of this system is its high speed and its fire detection capability, with an adjustable ambient temperature detection threshold. Another advantage is that the system is autonomous and does not require a human operator on site. Using nanotechnology, in addition to speeding up the work, reduces the cost of constructing the sensor as well as the notification and fire extinguishing systems.

Keywords: analyser, barrier, heat resistance, general fault, general alarm, nano sensor

Procedia PDF Downloads 456
5916 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework which helps identify the factors impacting the market share of a healthcare provider facility or hospital (from here on termed a facility) is of key importance. This pilot study aims at developing a data-driven, machine learning-regression framework which aids strategists in formulating key decisions to improve the facility's market share, which in turn improves the quality of healthcare services. The US (United States) healthcare business is chosen for the study, and data spanning 60 key facilities in Washington State over about 3 years of history are considered. In the current analysis, market share is defined as the ratio of the facility's encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, followed by a regression approach to evaluate and predict market share. A model-agnostic technique, SHAP, is leveraged to quantify the relative importance of features impacting the market share. Typical techniques in the literature for quantifying the degree of competitiveness among facilities use an empirical method to calculate a competitive factor that expresses the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust because it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (for example, quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify the key drivers of market share at an overall level, the permutation feature importance of the attributes is calculated. For relative quantification of features at the facility level, SHAP (SHapley Additive exPlanations), a model-agnostic explainer, is incorporated; this helps identify and rank the attributes that impact the market share of each facility. The approach thus amalgamates two popular and efficient modeling practices, namely machine learning with graphs and tree-based regression techniques, to reduce bias and drive strategic business decisions.
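
The regression side of the framework can be condensed into the following sklearn sketch (Random Forest plus permutation feature importance); the input file and column names are placeholders, not fields from the study data:

```python
# Sketch: predict facility market share and rank its overall drivers.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

df = pd.read_csv('facility_metrics.csv')     # hypothetical, numeric features
X = df.drop(columns=['market_share'])        # e.g., beds, specialties, distance
y = df['market_share']                       # facility encounters / pool encounters

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
ranking = pd.Series(imp.importances_mean, index=X.columns)
print(ranking.sort_values(ascending=False).head(10))  # overall key drivers
# Facility-level attribution would additionally use shap.TreeExplainer(model).
```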

Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP

Procedia PDF Downloads 91
5915 Green Delivery Systems for Fruit Polyphenols

Authors: Boris M. Popović, Tatjana Jurić, Bojana Blagojević, Denis Uka, Ružica Ždero Pavlović

Abstract:

Green solvents are environmentally friendly and greatly improve the sustainability of chemical processes. There is a growing interest in the green extraction of polyphenols from fruits. In this study, we consider three Natural Deep Eutectic Solvent (NADES) systems based on choline chloride as the hydrogen bond acceptor and malic acid, urea, or fructose as the hydrogen bond donor. The NADES systems were prepared by heating-and-stirring, ultrasound, and microwave (MW) methods. Sour cherry pomace was used as a natural source of polyphenols. Polyphenol extraction from the cherry pomace was performed by ultrasound-assisted and microwave-assisted extraction and compared with conventional heating-and-stirring extraction. It was found that the MW-assisted preparation of NADES was the fastest, requiring less than 30 s. MW extraction of polyphenols was also the most rapid, with less than 5 min needed for extract preparation. All three NADES systems were highly efficient for anthocyanin extraction, but the most efficient was the system with malic acid as the hydrogen bond donor (the anthocyanin yield was enhanced by 62.33% after MW extraction with NADES compared with the conventional solvent).

Keywords: anthocyanins, green extraction, NADES, polyphenols

Procedia PDF Downloads 95
5914 Predicting Machine-Down of Woodworking Industrial Machines

Authors: Matteo Calabrese, Martin Cimmino, Dimos Kapetis, Martina Manfrin, Donato Concilio, Giuseppe Toscano, Giovanni Ciandrini, Giancarlo Paccapeli, Gianluca Giarratana, Marco Siciliano, Andrea Forlani, Alberto Carrotta

Abstract:

In this paper, we describe a machine learning methodology for Predictive Maintenance (PdM) applied to woodworking industrial machines. PdM is a prominent strategy comprising all the operational techniques and actions required to ensure machine availability and to prevent machine-down failures. One of the challenges of the PdM approach is the design and development of an embedded smart system to monitor the health status of the machine. The proposed approach allows screening multiple connected machines simultaneously, thus providing real-time monitoring that can be integrated with maintenance management. This is achieved by applying temporal feature engineering techniques and training an ensemble of classification algorithms to predict the Remaining Useful Lifetime of woodworking machines. The effectiveness of the methodology is demonstrated by testing it on an independent sample of additional woodworking machines that did not present a machine-down event.
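
A sketch of the two ingredients named above, temporal (rolling-window) feature engineering and an ensemble classifier predicting an imminent machine-down event as a proxy for low Remaining Useful Lifetime, might look as follows; sensor names, window sizes, and the 48-hour horizon are assumptions:

```python
# Sketch: rolling-window features on sensor logs + soft-voting ensemble.
import pandas as pd
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)

logs = pd.read_csv('machine_logs.csv', parse_dates=['ts'])  # hypothetical file
hourly = logs.set_index('ts').sort_index().resample('1h').mean()

# temporal feature engineering: rolling statistics per (assumed) sensor channel
for col in ['spindle_current', 'vibration', 'temperature']:
    hourly[f'{col}_mean_24h'] = hourly[col].rolling(24).mean()
    hourly[f'{col}_std_24h'] = hourly[col].rolling(24).std()

# label: a machine-down event within the next 48 hours (proxy for low RUL)
future_down = hourly['down_event'].iloc[::-1].rolling(48, min_periods=1).max()
hourly['down_48h'] = future_down.iloc[::-1]

data = hourly.dropna()
X = data.drop(columns=['down_event', 'down_48h'])
y = data['down_48h'].astype(int)

ensemble = VotingClassifier([('rf', RandomForestClassifier(n_estimators=200)),
                             ('gb', GradientBoostingClassifier())],
                            voting='soft')
ensemble.fit(X, y)
```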

Keywords: predictive maintenance, machine learning, connected machines, artificial intelligence

Procedia PDF Downloads 227
5913 Naturally Occurring Chemicals in Biopesticides' Resistance Control through Molecular Topology

Authors: Riccardo Zanni, Maria Galvez-Llompart, Ramon Garcia-Domenech, Jorge Galvez

Abstract:

Biopesticides, such as naturally occurring chemicals, pheromones, fungi, bacteria, and insect predators, are often a winning choice in crop protection because of their environmentally friendly profile. They are considered to have lower toxicity than traditional pesticides. After almost a century of pesticide use, resistances to traditional insecticides are widespread, while those to bioinsecticides have raised less attention, and resistance management is frequently neglected. This seems to be a crucial mistake, since resistances have already occurred for many marketed biopesticides. With an eye to the future, we present here a selection of new naturally occurring chemicals as potential bioinsecticides. The molecules were selected using a consolidated mathematical paradigm called molecular topology. Several QSAR equations were derived and subsequently applied to the virtual screening of hundreds of thousands of molecules of natural origin, which resulted in the selection of new potential bioinsecticides. The most innovative aspect of this work lies not only in the importance of identifying new molecules that overcome biopesticide resistances, but also in the possibility of promoting shared knowledge in the field of green chemistry through this unique in silico discipline, molecular topology.

Keywords: green chemistry, QSAR, molecular topology, biopesticide

Procedia PDF Downloads 316
5912 Decision Making, Reward Processing and Response Selection

Authors: Benmansour Nassima, Benmansour Souheyla

Abstract:

The appropriate integration of reward processing and decision making provided by the environment is vital for behavioural success and individuals' well-being in everyday life. Functional neurological investigation has already provided a comprehensive picture of affective and emotional (motivational) processing in the healthy human brain and has recently also focused on the assessment of brain function in anxious and depressed individuals. This article offers an overview of the theoretical approaches that relate emotion and decision-making, and spotlights investigations with anxious or depressed individuals to reveal how emotions can interfere with decision-making. This research aims at incorporating the emotional structure, based on response and stimulation, into a Bayesian approach to decision-making in terms of probability and value processing. It seeks to show how studies of individuals with emotional dysfunctions bear out that alterations of decision-making can be described in terms of altered probability and value processing. The ultimate objective is to determine critically whether the probabilistic representation of belief could be a useful approach for scrutinizing alterations in probability and value representation in subjects with anxiety and depression, and to outline the general implications of this approach.

Keywords: decision-making, motivation, alteration, reward processing, response selection

Procedia PDF Downloads 479
5911 Hand Gesture Detection via EmguCV Canny Pruning

Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae

Abstract:

Hand gesture recognition is a technique used to locate, detect, and recognize a hand gesture. Detection and recognition are concepts of Artificial Intelligence (AI). AI concepts are applicable to Human-Computer Interaction (HCI), Expert Systems (ES), etc. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool, used mostly by deaf communities and those with speech disorders. Communication barriers exist when people with speech disorders interact with others. This research aims to build a hand recognition system for interpretation between Lesotho's Sesotho and English, to help bridge the communication problems encountered by these communities. The system has various processing modules: a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is the process of identifying an object. The proposed system uses Haar cascade detection with Canny pruning, which uses Canny edge detection, an optimal image processing algorithm, to prune image regions unlikely to contain a hand. The system also employs a skin detection algorithm, which performs background subtraction and computes the convex hull and centroid to assist in the detection process. Recognition is the process of gesture classification: template matching classifies each hand gesture in real time. The system was tested in various experiments. The results show that time, distance, and light are factors that affect the rate of detection and, ultimately, recognition. Detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were also considered: the higher the light intensity, the faster the detection. Based on the results obtained in this research, the applied methodologies are efficient and provide a plausible solution towards a lightweight, inexpensive system that can be used for sign language interpretation.
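
The paper uses EmguCV, the C# wrapper of OpenCV; the equivalent calls in Python/OpenCV are sketched below. The hand cascade file and the HSV skin thresholds are assumptions; CASCADE_DO_CANNY_PRUNING enables the Canny pruning named in the title:

```python
# Sketch: Haar cascade detection with Canny pruning + skin hull/centroid.
import cv2

hand_cascade = cv2.CascadeClassifier('hand.xml')        # hypothetical cascade

def detect_hands(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return hand_cascade.detectMultiScale(
        gray, scaleFactor=1.1, minNeighbors=5,
        flags=cv2.CASCADE_DO_CANNY_PRUNING)             # skip edge-poor regions

def skin_mask(frame):
    """HSV-range skin segmentation; thresholds are common defaults, assumed."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask, None
    hull = cv2.convexHull(max(contours, key=cv2.contourArea))
    m = cv2.moments(hull)
    if m['m00'] == 0:
        return mask, None
    centroid = (int(m['m10'] / m['m00']), int(m['m01'] / m['m00']))
    return mask, centroid   # hull and centroid assist detection, as described
```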

Keywords: canny pruning, hand recognition, machine learning, skin tracking

Procedia PDF Downloads 185
5910 Effect of Ultrasound and Enzyme on the Extraction of Eurycoma longifolia (Tongkat Ali)

Authors: He Yuhai, Ahmad Ziad Bin Sulaiman

Abstract:

Tongkat Ali, or Eurycoma longifolia, is a traditional Malay and Orang Asli herb used as an aphrodisiac, general tonic, anti-malarial, and anti-pyretic. It has been recognized as a cash crop by Malaysia due to its high value for pharmaceutical use. In Tongkat Ali, eurycomanone, a quassinoid, is usually chosen as the marker phytochemical, as it is the most abundant one. In this research, ultrasound and enzyme treatment were used to enhance the extraction of eurycomanone from Tongkat Ali. Ultrasonic-assisted extraction (USE) enhances extraction by facilitating the swelling and hydration of the plant material, enlarging the plant pores, breaking the plant cells, reducing the plant particle size, and creating cavitation bubbles that enhance mass transfer in both the washing and diffusion phases of extraction. The enzyme hydrolyses the plant cell wall, loosening its structure and releasing more phytochemicals from the plant cells, thus enhancing the productivity of the extraction. Possible effects of ultrasound on the activity of the enzyme during the hydrolysis of the cell wall are under investigation in this research. The extracts were analysed by high-performance liquid chromatography for eurycomanone yield. Throughout this process, conventional water extraction was used as a control for comparing the performance of the ultrasound- and enzyme-assisted extraction.

Keywords: ultrasound, enzymatic, extraction, Eurycoma longifolia

Procedia PDF Downloads 418
5909 Structural Optimization of Shell and Arched Structures

Authors: Mitchell Gohnert, Ryan Bradley

Abstract:

This paper reviews some fundamental concepts of structural optimization, which are based on the type of materials used in construction and the shape of the structure. The first step in structural optimization is to break down all internal forces in a structure into fundamental stresses, which are tensions and compressions. Knowing the stress patterns directs our selection of structural shapes and of the most appropriate type of construction material. In selecting materials, it is essential to understand that all construction materials have flaws, or micro-cracks, which reduce the capacity of the material, especially when subjected to tension. Because of these defects, many construction materials perform significantly better under compressive forces. Structures are also more efficient if bending moments are eliminated: bending produces high peak stresses at each face of the member, so substantially more material is required to resist it. The shape of the structure likewise has a profound effect on stress levels, and stress may be reduced dramatically by simply changing the shape. Catenary, triangular, and linear shapes are the fundamental structural forms for achieving optimal stress flow. If the natural flow of stress matches the shape of the structure, the optimal shape has been found.

Keywords: arches, economy of stresses, material strength, optimization, shells

Procedia PDF Downloads 118
5908 The Many Faces of Cancer and Knowing When to Say Stop

Authors: Diwei Lin, Amanda Jh. Tan

Abstract:

We present a very rare case of de novo large cell neuroendocrine carcinoma of the prostate (LCNEC) in an 84-year-old male with a background of high-grade, muscle-invasive transitional cell carcinoma of the bladder. While neuroendocrine (NE) tumours account for 1% to 5% of all cases of prostate cancer, and scattered NE cells can be found in 10% to 100% of prostate adenocarcinomas, pure LCNEC of the prostate is extremely rare. Most LCNEC of the prostate is thought to originate by clonal progression, under the selection pressure of therapy, in disease refractory to long-term hormonal treatment for adenocarcinoma of the prostate. De novo LCNEC has been described only in case reports and is thought to develop via direct malignant transformation. The limited data in the English literature make it difficult to accurately predict the prognosis of LCNEC of the prostate; however, current evidence suggests that increasing NE differentiation in prostate adenocarcinoma is associated with higher stage, higher-grade disease and a worse prognosis.

Keywords: large cell neuroendocrine cancer, prostate cancer, refractory cancer, medical and health sciences

Procedia PDF Downloads 422
5907 Solvent Extraction in Ionic Liquids: Structuration and Aggregation Effects on Extraction Mechanisms

Authors: Sandrine Dourdain, Cesar Lopez, Tamir Sukhbaatar, Guilhem Arrachart, Stephane Pellet-Rostaing

Abstract:

A promising challenge in solvent extraction is to replace the conventional organic solvents with ionic liquids (ILs). Depending on the extraction system, these new solvents show better efficiency than the conventional ones. Although some assumptions based on ion exchange have been proposed in the literature, these properties are not yet predictable because the mechanisms involved are still poorly understood. It is well established that the mechanisms underlying solvent extraction processes rest not only on the molecular chelation of the extractant molecules but also on their ability to form supra-molecular aggregates due to their amphiphilic nature. It is therefore essential to evaluate how ILs affect the aggregation properties of the extractant molecules. Our aim is to evaluate the influence of IL structure and polarity on solvent extraction mechanisms by looking at the aggregation of the extractant molecules in ILs. We compare extractant systems that are well characterized in common solvents and show, thanks to SAXS and SANS measurements, that in the absence of IL ion-exchange mechanisms, extraction properties are related to aggregation.

Keywords: solvent extraction in Ionic liquid, aggregation, Ionic liquids structure, SAXS, SANS

Procedia PDF Downloads 158
5906 Extraction of Grapefruit Essential Oil from Grapefruit Peels

Authors: Adithya Subramanian, S. Ananthan, T. Prasanth, S. P. Selvabharathi

Abstract:

This project involves the extraction of grapefruit essential oil from grapefruit peels using various oils, such as castor oil, gingelly oil, and olive oil, as carrier oils. The main aim of the project is to extract the oil, which has numerous medicinal uses. The extraction can be performed by two methods. The project extracts the oil with various carrier oils with a view to reducing the cost of production, and the physical properties of the extracted oil are examined.

Keywords: essential oil, carrier oil, medicinal uses, cost of production

Procedia PDF Downloads 436
5905 Incorporating Spatial Selection Criteria with Decision-Maker Preferences of A Precast Manufacturing Plant

Authors: M. N. A. Azman, M. S. S. Ahamad

Abstract:

The Construction Industry Development Board of Malaysia has been actively promoting the use of precast manufacturing in the local construction industry over the last decade. In an era of rapid technological change, precast manufacturing contributes significantly to improving construction activities and ensuring sustainable economic growth. Current studies on the location decision of precast manufacturing plants aimed at enhancing local economic development are scarce. To address this gap, the present research establishes a new set of spatial criteria, such as attribute maps and preference weights, derived from a survey of local industry decision makers. These data represent the input parameters of the MCE-GIS site selection model, in which the weighted linear combination method is used. Verification tests on the model were conducted to determine potential precast manufacturing sites in the state of Penang, Malaysia. The tests yield a predicted area of 12.87 acres located within a designated industrial zone. Although the model is developed specifically for precast manufacturing plants, it can nevertheless be applied to other types of industry by following the methodology and guidelines proposed in the present research.
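
The weighted linear combination step of the MCE-GIS model reduces to a weighted sum of standardized criterion rasters masked by constraints; the following numpy sketch uses illustrative layer names and weights, not the criteria elicited in the study:

```python
# Sketch of weighted linear combination (WLC) over criterion rasters.
import numpy as np

# standardized criterion rasters in [0, 1] (rows x cols), e.g. GIS exports
layers = {
    'road_access':   np.random.rand(100, 100),   # placeholder rasters
    'land_cost':     np.random.rand(100, 100),
    'labour_supply': np.random.rand(100, 100),
}
# preference weights from the decision-maker survey (assumed values, sum to 1)
weights = {'road_access': 0.5, 'land_cost': 0.3, 'labour_supply': 0.2}
constraint = np.ones((100, 100))     # binary mask: 0 outside industrial zones

suitability = constraint * sum(w * layers[k] for k, w in weights.items())
best = np.unravel_index(np.argmax(suitability), suitability.shape)
print('most suitable cell:', best)
```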

Keywords: geographical information system, multi criteria evaluation, industrialised building system, civil engineering

Procedia PDF Downloads 289
5904 A Clustering-Based Approach for Weblog Data Cleaning

Authors: Amine Ganibardi, Cherif Arab Ali

Abstract:

This paper addresses the data cleaning issue as a part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users' clicks and the underlying user-agents' hits. As Web Usage Mining is interested in end-users' behavior, user-agents' hits are regarded as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: a server records requests interlaced in sequential order regardless of their source or type, and website resources may be set up as requestable interchangeably by end-users and user-agents. Current methods are content-centric, based on heuristics that filter relevant/irrelevant items in terms of cleaning attributes such as resource filetype extensions, resources pointed to by hyperlinks/URIs, HTTP methods, user-agents, etc. These methods need exhaustive extra-weblog data and prior knowledge of the relevant and/or irrelevant items to be assumed as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: resources may be set up as clickable by end-users regardless of their type, website resources are indexed by frame names without filetype extensions, and web contents are generated and cancelled differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the level of the requested and referring resources attributes. It is insensitive to logging content and does not need extra-weblog data. The statistical property used captures the structure of the logging generated by webpage requests in terms of clicks and hits. Since a webpage consists of a single URI and several components, this results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. The clustering-based method is thus meant to identify two clusters based on the application of an appropriate distance to the frequency matrix of the requested and referring resource levels. As the clicks-to-hits ratio is one-to-many, the clicks cluster is the smaller one in number of requests. Hierarchical Agglomerative Clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose click-to-hits ratios range from 1/2 to 1/15. The optimal clustering, on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. The evaluation of the smaller cluster, referred to as the clicks cluster, in terms of confusion matrix indicators yields a 97% true positive rate. The content-centric cleaning methods, i.e., conventional and advanced cleaning, resulted in a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods for dynamic and responsive web design without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analyses.
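
A compact sketch of the proposed cleaning step is given below: a frequency matrix over requested and referring resources, average-linkage hierarchical clustering, and retention of the smaller cluster as clicks. For self-containment it uses Euclidean distance in place of the Gower distance used in the paper, and placeholder column names:

```python
# Sketch: cluster weblog entries into clicks vs. hits by request/referrer frequency.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import fcluster, linkage

log = pd.read_csv('access_log.csv')     # hypothetical parsed weblog
# one row per distinct (requested, referring) resource pair, with frequencies
freq = (log.groupby(['request', 'referrer']).size().rename('n').reset_index()
           .assign(req_freq=lambda d: d.groupby('request')['n'].transform('sum'),
                   ref_freq=lambda d: d.groupby('referrer')['n'].transform('sum')))

X = freq[['req_freq', 'ref_freq']].to_numpy(dtype=float)
Z = linkage(X, method='average')                 # agglomerative, average linkage
labels = fcluster(Z, t=2, criterion='maxclust')  # cut the dendrogram at 2 clusters

sizes = np.bincount(labels)[1:]
clicks_label = np.argmin(sizes) + 1              # clicks cluster = the smaller one
clicks = freq[labels == clicks_label]            # end-user clicks, kept for mining
```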

Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data

Procedia PDF Downloads 170
5903 Parkinson's Disease Gene Identification Using Physicochemical Properties of Amino Acids

Authors: Priya Arora, Ashutosh Mishra

Abstract:

Gene identification, in pursuit of the mutated genes leading to Parkinson's disease, poses a challenge on the way towards a proactive cure of the disorder itself. Computational analysis is an effective technique for exploring genes in the form of protein sequences, as theoretical and manual analysis is infeasible. The limitations and effectiveness of a particular computational method depend entirely on the data previously available for disease identification. This article presents a sequence-based classification method for the identification of genes responsible for Parkinson's disease. In the initiation phase, the physicochemical properties of amino acids transform protein sequences into a feature vector. The second phase of the method employs Jaccard distances to select negative genes from the candidate population. The third phase involves artificial neural networks for making the final predictions. The proposed approach is compared with state-of-the-art methods on the basis of the F-measure. The results confirm the efficiency of the method.
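
The first and third phases can be sketched as follows: per-residue physicochemical properties are averaged into a fixed-length vector, and a neural network classifies the result. The hydropathy values are the standard Kyte-Doolittle scale; the two-property table, network size, and toy data are illustrative assumptions:

```python
# Sketch: sequence -> physicochemical feature vector -> neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier

# per-amino-acid properties: [Kyte-Doolittle hydropathy, net charge at pH 7]
PROPS = {'A': [1.8, 0], 'R': [-4.5, 1], 'N': [-3.5, 0], 'D': [-3.5, -1],
         'C': [2.5, 0], 'E': [-3.5, -1], 'Q': [-3.5, 0], 'G': [-0.4, 0],
         'H': [-3.2, 0], 'I': [4.5, 0], 'L': [3.8, 0], 'K': [-3.9, 1],
         'M': [1.9, 0], 'F': [2.8, 0], 'P': [-1.6, 0], 'S': [-0.8, 0],
         'T': [-0.7, 0], 'W': [-0.9, 0], 'Y': [-1.3, 0], 'V': [4.2, 0]}

def featurize(seq):
    """Mean and std of each property over the sequence -> fixed-length vector."""
    vals = np.array([PROPS[a] for a in seq if a in PROPS])
    return np.concatenate([vals.mean(axis=0), vals.std(axis=0)])

# X: vectors for known positive (disease) genes and Jaccard-selected negatives;
# the two sequences below are toy placeholders, not study data.
X = np.array([featurize('MKTAYIAKQR'), featurize('GAVLIPFMW')])
y = np.array([1, 0])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
```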

Keywords: disease gene identification, Parkinson’s disease, physicochemical properties of amino acid, protein sequences

Procedia PDF Downloads 141
5902 Investigating the Effective Parameters in Determining the Type of Traffic Congestion Pricing Schemes in Urban Streets

Authors: Saeed Sayyad Hagh Shomar

Abstract:

Traffic congestion pricing, as a travel demand management strategy in urban areas to reduce traffic congestion, air pollution, and noise pollution, has drawn much attention. Despite the satisfying findings of this method, there are still problems in determining the best functional congestion pricing scheme for a given situation. Problems in this process result in further complications and even scheme failure. That is why proper knowledge of the significance of congestion pricing schemes and of the factors effective in choosing them can lead to the success of this strategy. In this study, first, a variety of traffic congestion pricing schemes and their components are introduced; then, their functional usage is discussed. Next, by analyzing and comparing the barriers, limitations, and advantages, the selection criteria for pricing schemes are described. The results show that the selection of the best scheme depends on various parameters. Finally, based on an examination of the effective parameters, it is concluded that the implementation of area-based schemes (cordon and zonal) has been more successful in avoiding traffic diversion. Considering the topology of cities and the fact that traffic congestion is often created in city centers, area-based schemes are notably functional and appropriate.

Keywords: congestion pricing, demand management, flat toll, variable toll

Procedia PDF Downloads 391
5901 A Geospatial Consumer Marketing Campaign Optimization Strategy: Case of Fuzzy Approach in Nigeria Mobile Market

Authors: Adeolu O. Dairo

Abstract:

Getting the consumer marketing strategy right is a crucial and complex task for firms with a large customer base, such as mobile operators in a competitive mobile market. While empirical studies have made efforts to identify key constructs, no geospatial model has been developed to comprehensively assess the viability and interdependency of ground realities regarding the customer, competition, channel, and network quality of mobile operators. With this research, a geo-analytic framework is proposed for strategy formulation and allocation for mobile operators. First, a fuzzy analytic network that depicts the interrelationships among ground realities is developed, using a self-organizing feature map clustering technique based on inputs from managers and the literature. The model is tested with a mobile operator in the Nigerian mobile market. As a result, a customer-centric geospatial and visualization solution is developed, providing a consolidated and integrated insight that serves as a transparent, logical, and practical guide for strategic, tactical, and operational decision making.
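
A minimal numpy self-organizing feature map, sketching the clustering technique named above; the grid size, learning schedule, and input features are illustrative assumptions, not the study's configuration:

```python
# Sketch: a tiny self-organizing feature map (SOM) trained on customer features.
import numpy as np

def train_som(X, rows=5, cols=5, epochs=20, lr0=0.5):
    rng = np.random.default_rng(0)
    W = rng.random((rows * cols, X.shape[1]))          # codebook vectors
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)])
    sigma0 = max(rows, cols) / 2.0
    t, t_max = 0, epochs * len(X)
    for _ in range(epochs):
        for x in rng.permutation(X):
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))  # best matching unit
            frac = t / t_max
            lr = lr0 * (1 - frac)                        # decaying learning rate
            sigma = sigma0 * (1 - frac) + 1e-3           # shrinking neighborhood
            d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)   # grid distance to BMU
            h = np.exp(-d2 / (2 * sigma ** 2))           # neighborhood kernel
            W += lr * h[:, None] * (x - W)
            t += 1
    return W

# X: one row per customer area with features such as usage, churn risk,
# competitor signal, channel density (hypothetical placeholders).
X = np.random.rand(200, 4)
codebook = train_som(X)
```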

Keywords: geospatial, geo-analytics, self-organizing map, customer-centric

Procedia PDF Downloads 184
5900 A Framework for Automated Nuclear Waste Classification

Authors: Seonaid Hume, Gordon Dobie, Graeme West

Abstract:

Detecting and localizing radioactive sources is a necessity for the safe and secure decommissioning of nuclear facilities. An important aspect of managing the sort-and-segregation process is establishing the spatial distributions and quantities of the waste radionuclides, their type, corresponding activity, and ultimately their classification for disposal. The data received from surveys directly inform decommissioning plans, on-site incident management strategies, the approach needed for a new cell, as well as the protection of the workforce and the public. Manual classification of nuclear waste from a nuclear cell is time-consuming, expensive, and requires significant expertise to make the classification judgment call. Also, in-cell decommissioning is still in its relative infancy, and few techniques are well developed. As with any repetitive and routine task, there is the opportunity to improve the classification of nuclear waste using autonomous systems. Hence, this paper proposes a new framework for the automatic classification of nuclear waste. The framework consists of five main stages: 3D spatial mapping and object detection, object classification, radiological mapping, source localisation based on gathered evidence, and finally, waste classification. The first stage of the framework, 3D visual mapping, involves object detection from point cloud data. A review of related applications in other industries is provided, and recommendations for approaches to waste classification are made. Object detection focusses initially on cylindrical objects, since pipework is significant in nuclear cells and indeed on any industrial site; the approach can be extended to other commonly occurring primitives such as spheres and cubes. This is in preparation for stage two: characterizing the point cloud data and estimating the dimensions, material, degradation, and mass of the detected objects in order to feature-match them to an inventory of possible items found in that nuclear cell. Many items in nuclear cells are one-offs, have limited or poor drawings available, or have been modified since installation, and have complex interiors, which often and inadvertently pose difficulties when accessing certain zones and identifying waste remotely; hence, expert input may be required to feature-match objects. The third stage, radiological mapping, similarly facilitates the characterization of the nuclear cell in terms of radiation fields, including the type of radiation, activity, and location within the cell. The fourth stage of the framework takes the visual map from stage 1, the object characterization from stage 2, and the radiation map from stage 3 and fuses them together, providing a more detailed scene of the nuclear cell by identifying the location of radioactive materials in three dimensions. The last stage combines the evidence from the fused data sets to reveal the classification of the waste in Bq/kg, thus enabling better decision making and monitoring for in-cell decommissioning. The presentation of the framework is supported by representative case study data drawn from a decommissioning application at a UK nuclear facility. The framework utilises recent advancements in the detection and mapping of complex radiation fields in three dimensions to make the process of classifying nuclear waste faster, more reliable, more cost-effective, and safer.

Keywords: nuclear decommissioning, radiation detection, object detection, waste classification

Procedia PDF Downloads 201
5899 A Hybrid Fuzzy Clustering Approach for Fertile and Unfertile Analysis

Authors: Shima Soltanzadeh, Mohammad Hosain Fazel Zarandi, Mojtaba Barzegar Astanjin

Abstract:

Diagnosis of male infertility by laboratory tests is expensive and sometimes intolerable for patients. Filling out a questionnaire and then applying a classification method can be the first step in the decision-making process, so that laboratory tests are used only in cases with a high probability of infertility. In this paper, we evaluated the performance of four classification methods, namely naive Bayes, neural network, logistic regression, and fuzzy c-means clustering used as a classifier, in the diagnosis of male infertility due to environmental factors. Since the data are unbalanced, ROC curves are the most suitable method for the comparison. We also selected the more important features using a filtering method and examined the impact of this feature reduction on the performance of each method; generally, most of the methods performed better after applying the filter. We showed that using fuzzy c-means clustering as a classifier gives good performance according to the ROC curves, comparable to other classification methods such as logistic regression.
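
Fuzzy c-means used as a classifier, as evaluated above, can be sketched in numpy as follows: each subject is assigned to the cluster of maximal membership, and the membership degree itself can serve as the score for the ROC curve. The fuzzifier m = 2 and two clusters (fertile/infertile) are assumptions:

```python
# Compact fuzzy c-means; memberships double as classification scores.
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, eps=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        inv = d ** (-2.0 / (m - 1.0))        # standard FCM membership update
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return centers, U

# X: questionnaire features (placeholder data); hard label = max membership.
X = np.random.rand(50, 6)
centers, U = fuzzy_cmeans(X)
labels = U.argmax(axis=1)                    # predicted class per subject
```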

Keywords: classification, fuzzy c-means, logistic regression, Naive Bayesian, neural network, ROC curve

Procedia PDF Downloads 340