Search results for: sequential extraction process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16972

13462 Modelization of Land Degradation by Desertification Using the MEDALUS Method, Case Study of the Wilaya of Saida, Algeria

Authors: Fekir Youcef, Mederbal Khalladi, M. A. Hamadouche, D. Anteur

Abstract:

Algeria is one of the countries most affected by desertification, which is the consequence of several factors, and there is therefore a need to study this problem with quantitative approaches. In this study, we apply the MEDALUS method (Mediterranean Desertification and Land Use) to a watershed located near the town of Saida, in a semi-arid environment in the southwest of Algeria. The method identifies sensitive areas using the different parameters that may affect the desertification process, such as vegetation, soil, climate and management. Spatial analysis provides strong tools for modelling each indicator. Results show that, according to European standards, a large part of the watershed falls into the critical classes. The modelling approach can therefore be an effective way to study and understand desertification, as illustrated by the green dam project, which limits the spread of desertification into the northern areas of Algeria.
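
The MEDALUS sensitivity index described above is conventionally computed as the geometric mean of the four quality indices. A minimal sketch follows; the per-pixel scores are illustrative assumptions, and the class thresholds are values commonly cited in the MEDALUS literature, not figures from this study.

```python
def esai(soil_q, climate_q, vegetation_q, management_q):
    """Environmentally Sensitive Area Index: the geometric mean of the
    four MEDALUS quality indices (each typically scored between 1 and 2)."""
    return (soil_q * climate_q * vegetation_q * management_q) ** 0.25

def classify(index):
    # Class thresholds commonly cited in the MEDALUS literature --
    # treated as assumptions here, not values from this study.
    if index < 1.17:
        return "non-affected"
    if index < 1.225:
        return "potential"
    if index < 1.375:
        return "fragile"
    return "critical"

# Illustrative per-pixel quality scores (assumed)
index = esai(1.6, 1.7, 1.5, 1.4)
label = classify(index)
```

Applied per pixel over the watershed raster, this is how a sensitivity map with critical classes is produced.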

Keywords: Algeria, desertification, MEDALUS, modelization

Procedia PDF Downloads 389
13461 Image Inpainting Model with Small-Sample Size Based on Generative Adversarial Network and Genetic Algorithm

Authors: Jiawen Wang, Qijun Chen

Abstract:

The performance of most machine-learning methods for image inpainting depends on the quantity and quality of the training samples. However, it is very expensive or even impossible to obtain a large number of training samples in many scenarios. In this paper, an image inpainting model based on a generative adversarial network (GAN) is constructed for cases where the number of training samples is small. Firstly, a feature extraction network (F-net) is incorporated into the GAN to exploit the available information in the image to be inpainted. The weighted sum of the extracted feature and a random noise vector acts as the input to the generative network (G-net). The proposed network can be trained well even when the sample size is very small. Secondly, during the completion of each damaged image, a genetic algorithm is designed to search for an optimized noise input for the G-net; based on this optimized input, the parameters of the G-net and F-net are further learned (once the completion of a given damaged image ends, the parameters are restored to the original values obtained in the training phase) to generate an image patch that not only fills the missing part of the damaged image smoothly but also has visual semantics.
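
The noise-search step can be illustrated with a minimal, hypothetical sketch: a frozen toy "generator" and a small elitist genetic algorithm that minimizes reconstruction error on the undamaged pixels. All names, dimensions and the linear generator here are illustrative assumptions, not the authors' architecture.

```python
import random

random.seed(0)

# Toy frozen "generator": a fixed linear map from a 4-dim noise vector to
# an 8-pixel patch. It stands in for the trained G-net, which the genetic
# algorithm treats as fixed while searching over its noise input.
WEIGHTS = [[((i * 7 + j * 3) % 5 - 2) * 0.1 for j in range(4)] for i in range(8)]

def generate(z):
    return [sum(w * zj for w, zj in zip(row, z)) for row in WEIGHTS]

# "Damaged image": only these pixel positions are known; the GA searches
# for a noise vector whose generated patch matches them there.
target = generate([0.5, -0.2, 0.8, 0.1])
known = [0, 2, 3, 5, 6]

def fitness(z):  # reconstruction error on the known pixels (lower is better)
    patch = generate(z)
    return sum((patch[i] - target[i]) ** 2 for i in known)

def mutate(z, scale=0.1):
    return [zi + random.gauss(0, scale) for zi in z]

# Elitist GA: keep the best half of the population, refill by mutation.
pop = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(20)]
init_best = min(fitness(z) for z in pop)
for _ in range(100):
    pop.sort(key=fitness)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = min(pop, key=fitness)
```

Because the best individuals are always retained, the reconstruction error never worsens across generations; in the paper this optimized noise then seeds the per-image fine-tuning of G-net and F-net.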

Keywords: image inpainting, generative adversarial nets, genetic algorithm, small-sample size

Procedia PDF Downloads 130
13460 Multi-Template Molecularly Imprinted Polymer: Synthesis, Characterization and Removal of Selected Acidic Pharmaceuticals from Wastewater

Authors: Lawrence Mzukisi Madikizela, Luke Chimuka

Abstract:

Removal of organics from wastewater offers better water quality; the purpose of this work was therefore to investigate the use of a molecularly imprinted polymer (MIP) for the elimination of selected organics from water. A multi-template MIP for the adsorption of naproxen, ibuprofen and diclofenac was synthesized by a bulk polymerization method at 70°C, employing 2-vinylpyridine, ethylene glycol dimethacrylate, toluene and 1,1’-azobis-(cyclohexanecarbonitrile) as functional monomer, cross-linker, porogen and initiator, respectively. Thermogravimetric characterization indicated that the polymer backbone collapses at 250°C, and scanning electron microscopy revealed the porous and rough nature of the MIP after elution of the templates. The performance of the MIP in aqueous solutions was evaluated by optimizing several adsorption parameters. The optimized adsorption conditions were 50 mg of MIP, an extraction time of 10 min, a sample pH of 4.6 and an initial concentration of 30 mg/L. The imprinting factors obtained for naproxen, ibuprofen and diclofenac were 1.25, 1.42 and 2.01, respectively, giving the selectivity order diclofenac > ibuprofen > naproxen. The MIP showed strong swelling in water, with an initial swelling rate of 2.62 g/(g min). The synthesized MIP proved able to adsorb naproxen, ibuprofen and diclofenac from contaminated deionized water, wastewater influent and effluent.

Keywords: adsorption, molecularly imprinted polymer, multi template, pharmaceuticals

Procedia PDF Downloads 303
13459 Production of Biogas from Organic Wastes Using Plastic Biodigester

Authors: Oladipo Oluwaseun Peter

Abstract:

Daily consumption of crude oil is alarming as a result of the increasing demand for energy, and waste generation tends to rise with the level of economic advancement of a nation. Hence, this work investigates how wastes, which could become toxic if left unattended, can be processed through biodigestion to generate biofuel, a good substitute for petroleum, a non-renewable energy source; this reduces over-dependence on petroleum and prevents environmental pollution. Anaerobic digestion was carried out on organic wastes comprising brewery spent grains, rice husks and poultry droppings in a plastic biodigester of 1000 liters volume, using the poultry droppings as a natural inoculum source. The feed, composed of spent grain, rice husks and poultry droppings in the ratio 5:3:2, was mixed with water in the ratio 1:6; thus, 600 kg of water was used to prepare the slurry with 100 kg of feed materials. A plastic biodigester was successfully constructed, and the problems of corrosion and rusting were completely overcome through the use of non-corroding materials of construction. A reasonable quantity of biogas, 33.63 m³, was generated over a period of 60 days of biodigestion. The bioslurry was processed through two different routes, evaporation and filtration. The evaporation route gave high values of 0.64%, 2.11% and 0.034% for nitrogen, phosphorus and potassium, respectively, while the filtration route gave 0.61%, 1.93% and 0.026%.
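
The batch arithmetic implied by the 5:3:2 feed ratio and the 1:6 feed-to-water ratio can be checked in a few lines (the 100 kg batch size and both ratios are taken from the abstract):

```python
# Dry-feed composition in the ratio 5:3:2 for a 100 kg batch
feed_total = 100.0  # kg
ratio = {"spent_grain": 5, "rice_husks": 3, "poultry_droppings": 2}
parts = sum(ratio.values())
masses = {k: feed_total * v / parts for k, v in ratio.items()}

# Feed-to-water ratio of 1:6 gives the water required for the slurry
water = 6 * feed_total  # kg
```

This reproduces the reported charge: 50 kg spent grain, 30 kg rice husks, 20 kg poultry droppings and 600 kg of water.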

Keywords: biodigestion, biofuel, digestion, slurry, biogas

Procedia PDF Downloads 377
13458 Study of the Antimicrobial Activity of Extracts of Eucalyptus camaldulensis from Northeastern Algeria

Authors: Meksem Nabila, Bordjiba Ouahiba, Meraghni Messaouda, Meksem Amara Leila, Djebar Mohhamed Reda

Abstract:

Crop protection problems are of growing importance and interest a great number of farmers and scientists, because the excessive use of synthetic phytosanitary products causes serious damage to the environment. To reduce the drawbacks of these pesticides, the use of plant-derived 'biopesticides' could be an alternative. The aim of this work is the valorization of a botanical species, Eucalyptus camaldulensis from Northeastern Algeria, whose extracts are expected to have an antimicrobial activity similar to that of pesticides. The extraction of secondary metabolites from the leaves of E. camaldulensis was carried out using methanol and water, and total polyphenols were measured by a spectrometric method. The antimicrobial activity of the extracts was determined in vitro on phytopathogenic fungal and bacterial strains, with comparison tests against synthetic chemical pesticides included in the assays. The results show that the plant contains polyphenols, with a content of about 22%. These polyphenols have strong fungicidal and bactericidal activity against various microbial strains, and the inhibition zones are larger than those obtained with the synthetic chemical (fungicide).

Keywords: Eucalyptus camaldulensis, biopesticide, polyphenols, antimicrobial activity

Procedia PDF Downloads 432
13457 Facilitation of Digital Culture and Creativity through an Ideation Strategy: A Case Study with an Incumbent Automotive Manufacturer

Authors: K. Ö. Kartal, L. Maul, M. Hägele

Abstract:

With the development of new technologies come additional opportunities for founding companies and creating new markets. Barriers to entry are lowered, and technology makes old business models obsolete. Incumbent companies have to adapt to this quickly changing environment: they have to start the process of digital maturation and be able to respond quickly to new and drastic changes that might arise. One of the biggest barriers for organizations in doing so is their culture. This paper shows the core elements of a corporate culture that supports the process of digital maturation in incumbent organizations. Furthermore, it explores how ideation and innovation can be used in a strategy to facilitate those core elements of culture that promote digital maturity. Focus areas are identified for the design of ideation strategies, with the aim of making the facilitation and incitation process more effective in the short to long term. To this end, one in-depth case study is conducted, with data collected from interviews, observation, document review and surveys. The findings indicate that digital maturity is connected to cultural shift, and 11 relevant elements of digital culture that have to be considered are identified. Based on these 11 core elements, five focus areas that need to be regarded in the design of a strategy that uses ideation and innovation to facilitate the cultural shift are identified: focus topics; rewards and communication; structure and frequency; regions; and new online formats.

Keywords: digital transformation, innovation management, ideation strategy, creativity culture, change

Procedia PDF Downloads 196
13456 A Goal-Driven Crime Scripting Framework

Authors: Hashem Dehghanniri

Abstract:

Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or convoluted understanding of the crime commission process, which defeats the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts; the approach was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., in identifying SCP measures or in policy-making.

Keywords: attack modelling, crime commission process, crime script, situational crime prevention

Procedia PDF Downloads 126
13455 Neighborhood Sustainability Assessment Tools: A Conceptual Framework for Their Use in Building Adaptive Capacity to Climate Change

Authors: Sally Naji, Julie Gwilliam

Abstract:

Climate change remains a challenging matter for humans and the built environment in the 21st century, and the need to consider adaptation to climate change in the development process is paramount. However, there remains a lack of information regarding how we should prepare responses to this issue, for example through organized and sophisticated tools enabling the adaptation process. This study aims to build a systematic framework with which to investigate the potential that Neighborhood Sustainability Assessment (NSA) tools might offer in enabling both the analysis and the building of adaptive capacity to climate change. The framework presented in this paper discusses the issue in three main phases. The first part attempts to link sustainability and climate change in the context of adaptive capacity. It is argued that in deciding to promote sustainability in the context of climate change, both the resilience and vulnerability processes become central. However, there is still a gap in the current literature regarding how the sustainable development process can respond to climate change, and how the resilience of practical strategies might be evaluated. It is suggested that integrating sustainability assessment processes with both resilience thinking and vulnerability might provide important components for addressing adaptive capacity to climate change. A critical review of the existing literature illustrates the current lack of work integrating these three concepts in the context of adaptive capacity to climate change. The second part aims to identify the most appropriate scale at which to address the built environment for climate change adaptation; it is suggested that the neighborhood scale is more suitable than either the building or urban scales.
The example of NSAs is then presented, along with the need to explore their potential role in promoting adaptive capacity to climate change. The third part of the framework presents a comparison among three example NSAs, BREEAM Communities, LEED-ND, and CASBEE-UD, selected as the most developed and comprehensive assessment tools currently available for the neighborhood scale. This study concludes that NSAs are likely to provide the basis for an organized framework to address the practical process of analyzing and promoting adaptive capacity to climate change. It is further argued that vulnerability (exposure and sensitivity) and resilience (interdependence and recovery) form essential aspects to be addressed in the future assessment of NSAs' capability to adapt to both short- and long-term climate change impacts. Finally, it is acknowledged that further work is required to understand impact assessment across the range of physical sectors (water, energy, transportation, building, land use and ecosystems) and actor and stakeholder engagement, as well as a detailed evaluation of the NSA indicators, together with a barriers diagnosis process.

Keywords: adaptive capacity, climate change, NSA tools, resilience, sustainability

Procedia PDF Downloads 381
13454 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills

Authors: Kyle De Freitas, Margaret Bernard

Abstract:

Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have few technical data mining skills and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that allows future researchers to focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy of enabling analysis through either predefined questions or a guided data mining process, and shows how the questions developed and the analyses conducted can be reused and extended over time.

Keywords: educational data mining, learning management system, learning analytics, EDM framework

Procedia PDF Downloads 326
13453 Valorization of Natural Vegetable Substances from Tunisia: Purification of Two Food Additives, Anthocyanins and Locust Bean Gum

Authors: N. Bouzouita, A. Snoussi , H. Ben Haj Koubaier, I. Essaidi, M. M. Chaabouni, S. Zgoulli, P. Thonart

Abstract:

Color is one of the most important quality attributes for the food industry. Grape marc, a complex lignocellulosic material, is one of the most abundant and least valued byproducts, generated after the pressing process. The aim of this work is the development of a process for the purification, by microfiltration, ultrafiltration and nanofiltration, and drying by atomization of anthocyanins of Tunisian origin. Locust bean gum (LBG) is the ground endosperm of the seeds of the carob fruit; owing to its remarkable water-binding properties, it is widely employed in the food industry to improve the texture of food. The purification of LBG drastically reduces the ash and protein contents but markedly increases the galactomannan content.

Keywords: carob, food additives, grape pomace, locust bean gum, natural colorant, nanofiltration, thickener, ultrafiltration

Procedia PDF Downloads 333
13452 The Development and Validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers

Authors: Ian Phil Canlas, Mageswary Karpudewan, Joyce Magtolis, Rosario Canlas

Abstract:

This study reports the development and validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers (ADRRQT). The questionnaire is a combination of Likert-scale and open-ended questions grouped into two parts: the first part includes questions relating to general awareness of disaster risk reduction, whereas the second part comprises questions regarding the integration of disaster risk reduction into the teaching process. The entire process of developing and validating the ADRRQT is described in this study. Statistical and qualitative findings revealed that the ADRRQT is valid and reliable and has the potential to measure awareness of disaster risk reduction among stakeholders in the field of teaching; it also shows potential to be adopted in other fields.

Keywords: awareness, development, disaster risk reduction, questionnaire, validation

Procedia PDF Downloads 228
13451 Using Single Decision Tree to Assess the Impact of Cutting Conditions on Vibration

Authors: S. Ghorbani, N. I. Polushin

Abstract:

Vibration during the machining process is crucial since it affects the cutting tool, machine, and workpiece, leading to tool wear, tool breakage, and unacceptable surface roughness. This paper applies a nonparametric statistical method, the single decision tree (SDT), to identify factors affecting vibration in the machining process. Workpiece material (AISI 1045 steel, AA2024 aluminum alloy, A48-class30 gray cast iron), cutting tool (conventional, cutting tool with holes in the toolholder, cutting tool filled with epoxy-granite), tool overhang (41-65 mm), spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev) and depth of cut (0.05-0.15 mm) were used as input variables, while vibration was the output parameter. It is concluded that workpiece material is the most important parameter for natural frequency, followed by cutting tool and tool overhang.
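
How a CART-style single decision tree ranks such factors can be sketched in a few lines: at each node it picks the split that maximizes variance reduction of the output. The sketch below scores one numeric factor; the overhang/vibration numbers are purely illustrative, not the experimental data from the paper.

```python
def variance(ys):
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def best_split(xs, ys):
    """Return (threshold, variance_reduction) for the best binary split of
    a single numeric factor, using the CART regression criterion."""
    base = variance(ys)
    best = (None, 0.0)
    for t in sorted(set(xs))[:-1]:          # candidate thresholds
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        weighted = (len(left) * variance(left)
                    + len(right) * variance(right)) / len(ys)
        if base - weighted > best[1]:
            best = (t, base - weighted)
    return best

# Illustrative data: vibration amplitude vs. tool overhang (mm)
overhang = [41, 47, 53, 59, 65, 41, 53, 65]
vibration = [0.2, 0.3, 0.6, 0.8, 0.9, 0.25, 0.55, 0.95]

t, gain = best_split(overhang, vibration)
```

Comparing the best achievable gain across all candidate factors is what lets the tree declare one factor (here, workpiece material in the study) the most important.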

Keywords: cutting condition, vibration, natural frequency, decision tree, CART algorithm

Procedia PDF Downloads 336
13450 Application Quality Function Deployment (QFD) Tool in Design of Aero Pumps Based on System Engineering

Authors: Z. Soleymani, M. Amirzadeh

Abstract:

Quality Function Deployment (QFD) was developed in the 1960s in Japan and introduced in 1983 in America and Europe. The paper presents a real application of this technique to the design and production of aero fuel pumps. When designing a product under a system engineering process, the first step is the identification of customer needs and their translation into engineering parameters. Since each design change after the production process leads to extra costs and also increases product quality risk, QFD can bring benefits in sales by meeting customer expectations. When the needs are identified well, the use of the QFD tool can lead to better communication and less deviation in the design and production phases; finally, it leads to products with well-defined technical attributes.
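
The core QFD computation, ranking engineering parameters by a weighted sum of customer-need importances, can be sketched as follows. The needs, parameters and relationship strengths below are hypothetical placeholders for a fuel pump, not figures from the paper.

```python
# Customer needs with importance weights (1-5) -- hypothetical values
needs = {"reliable flow": 5, "low noise": 3, "easy maintenance": 4}

# Relationship matrix: strength of each engineering parameter's effect on
# each need (9 = strong, 3 = moderate, 1 = weak, 0 = none) -- assumed
relations = {
    "gear clearance": {"reliable flow": 9, "low noise": 3, "easy maintenance": 1},
    "housing material": {"reliable flow": 3, "low noise": 1, "easy maintenance": 3},
    "seal design": {"reliable flow": 3, "low noise": 0, "easy maintenance": 9},
}

# Technical importance of each parameter: weighted sum over customer needs
importance = {
    p: sum(needs[n] * s for n, s in rel.items())
    for p, rel in relations.items()
}
ranked = sorted(importance, key=importance.get, reverse=True)
```

The ranking tells the design team which engineering parameters deserve the tightest control, which is exactly where QFD reduces late design changes.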

Keywords: customer voice, engineering parameters, gear pump, QFD

Procedia PDF Downloads 249
13449 Interpretation of Heritage Revitalization

Authors: Jarot Mahendra

Abstract:

The primary objective of this paper is to provide a view on the interpretation of the revitalization of heritage buildings. This objective is achieved by analyzing the concept of interpretation from the perspectives of law, urban spatial planning, and stakeholders, and then developing a theoretical framework of interpretation in cultural resources management through the issues of identity, heritage as a process, and authenticity in heritage. Interpreting the revitalization of heritage buildings through these three issues shows that interpretation can be used as a communication process to express the meaning and relation of heritage to the community, so as to avoid the conflicts that arise and develop as a result of different stakeholder perspectives. Using a case study in Indonesia, this study focuses on the revitalization of heritage sites at the National Gallery of Indonesia (GNI). GNI is a cultural institution that uses several historical buildings, some designated as heritage and some not under the regulations applicable in Indonesia, in carrying out its function as the center of Indonesian art development and as an art museum. The revitalization of heritage buildings is taken as a step to meet space needs in fulfilling the current GNI function. In the revitalization master plan, there are physical interventions on heritage buildings and the removal of some historic buildings, on whose site new buildings will then be built. A research matrix was used to map out the main elements of the study (the concept of GNI revitalization, heritage as identity, heritage as a process, and authenticity in heritage). Expert interviews and document studies are the main tools used in collecting data; qualitative data are then analyzed through content analysis and template analysis.
This study identifies the significance of the historic buildings (both those designated as heritage and those not) as carrying important historical, architectural, educational, and cultural value. This significance becomes the basis for revisiting the revitalization master plan, which is then reviewed against the applicable regulations and the spatial layout of Jakarta. The interpretation that is built is: (1) GNI is one of the elements of the embodiment of the National Cultural Center in the context of the region, where the National Monument, National Museum and National Library stand in the same area, so the heritage gives identity not only to the past culture but also to the culture of the current community; (2) heritage should be seen as a dynamic cultural process moving with the cultural change of the community, where heritage must develop along with urban development, so that heritage buildings can remain alive side by side with modern buildings while still observing the principles of heritage preservation; (3) the authenticity of heritage should balance the cultural heritage conservation approach with urban development, where authenticity can serve as a 'value transmitter', so that it can be used to evaluate, preserve and manage heritage buildings by considering tangible and intangible aspects.

Keywords: authenticity, culture process, identity, interpretation, revitalization

Procedia PDF Downloads 148
13448 Monitoring the Drying and Grinding Process during Production of Celitement through a NIR-Spectroscopy Based Approach

Authors: Carolin Lutz, Jörg Matthes, Patrick Waibel, Ulrich Precht, Krassimir Garbev, Günter Beuchle, Uwe Schweike, Peter Stemmermann, Hubert B. Keller

Abstract:

Online measurement of product quality is a challenging task in cement production, especially in the production of Celitement, a novel environmentally friendly hydraulic binder. The mineralogy and chemical composition of clinker in ordinary Portland cement production are measured by X-ray diffraction (XRD) and X-ray fluorescence (XRF), where only crystalline constituents can be detected. However, only a small part of the Celitement components can be measured via XRD, because most constituents have an amorphous structure. This paper describes the development of algorithms suitable for on-line monitoring of the final processing step of Celitement based on NIR data. For calibration, intermediate products were dried at different temperatures and ground for variable durations. The products were analyzed using XRD and thermogravimetric analyses together with NIR spectroscopy to investigate the dependency between the drying and milling processes on the one hand and the NIR signal on the other. As a result, different characteristic parameters have been defined. A short overview of the Celitement process and the challenging tasks of online measurement and evaluation of product quality is presented. Subsequently, methods for the systematic development of near-infrared calibration models and the determination of the final calibration model are introduced. The application of the model to experimental data illustrates that NIR spectroscopy allows a quick and sufficiently exact determination of crucial process parameters.
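
In its simplest univariate form, such an NIR calibration model is a least-squares fit mapping an NIR-derived parameter to a reference value from an offline method. The sketch below uses assumed absorbance/moisture pairs purely for illustration; the authors' actual calibration is multivariate and proprietary to their process.

```python
def ols_fit(x, y):
    """Ordinary least-squares line y = a + b*x for a univariate
    calibration model (NIR-derived parameter vs. reference value)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Illustrative calibration set: NIR absorbance at a water band vs.
# residual moisture (%) from thermogravimetry -- assumed values
absorbance = [0.10, 0.18, 0.25, 0.33, 0.41]
moisture = [0.9, 1.8, 2.4, 3.3, 4.0]

a, b = ols_fit(absorbance, moisture)

def predict(x):
    return a + b * x
```

Once validated, such a model turns each on-line NIR measurement into an immediate estimate of the process parameter without waiting for laboratory XRD or TGA results.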

Keywords: calibration model, celitement, cementitious material, NIR spectroscopy

Procedia PDF Downloads 500
13447 Innovation in Information Technology Services: Framework to Improve the Effectiveness and Efficiency of Information Technology Service Management Processes, Projects and Decision Support Management

Authors: Pablo Cardozo Herrera

Abstract:

In a dynamic market for Information Technology (IT) services, with high quality demands and high performance requirements at decreasing costs, it is imperative that IT companies invest organizational effort in increasing the effectiveness of their Information Technology Service Management (ITSM) processes, through the improvement of ITSM project management and through solid support for the strategic decision-making of IT directors. In this article, the author presents an analysis of common issues of IT companies around the world, whose unmet strategic information needs leave their ITSM processes and project management short of the expected effectiveness and efficiency. In response to the issues raised, the author proposes a framework consisting of an innovative theoretical model of ITSM management and a technological solution aligned with the Information Technology Infrastructure Library (ITIL) good-practice guidance and ISO/IEC 20000-1 requirements. The article describes research showing that the proposed framework is able to integrate, manage and coordinate, in a holistic, measurable and auditable way, all ITSM processes and projects of an IT organization, and to use the resulting effectiveness assessment in its strategic decision-making process, increasing the process maturity level and improving the capacity for efficient management.

Keywords: innovation in IT services, ITSM processes, ITIL and ISO/IEC 20000-1, IT service management, IT service excellence

Procedia PDF Downloads 397
13446 Modeling of Age Hardening Process Using Adaptive Neuro-Fuzzy Inference System: Results from Aluminum Alloy A356/Cow Horn Particulate Composite

Authors: Chidozie C. Nwobi-Okoye, Basil Q. Ochieze, Stanley Okiy

Abstract:

This research reports on the modeling of the age hardening process using an adaptive neuro-fuzzy inference system (ANFIS). The age hardening output (hardness) was predicted using ANFIS, with ageing time, temperature and percentage composition of cow horn particles (CHp%) as input parameters. The results show that the correlation coefficient (R) of the predicted versus measured hardness values was 0.9985. Subsequently, values outside the experimental data points were predicted. When the temperature was kept constant and the other input parameters were varied, the average relative error of the predicted values was 0.0931%. When the temperature was varied and the other input parameters kept constant, the average relative error of the hardness predictions was 80%. The results show that ANFIS trained on coarse experimental data points is not very effective in predicting process outputs in the age hardening of the A356 alloy/CHp particulate composite; the fine experimental data required by ANFIS makes it more expensive for modeling and optimization of these operations.
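
The two evaluation metrics quoted above, the correlation coefficient R and the average relative error, are computed as below. The hardness values here are invented placeholders, not the paper's data.

```python
def correlation(pred, meas):
    """Pearson correlation coefficient R between predictions and measurements."""
    n = len(pred)
    mp, mm = sum(pred) / n, sum(meas) / n
    cov = sum((p - mp) * (m - mm) for p, m in zip(pred, meas))
    sp = sum((p - mp) ** 2 for p in pred) ** 0.5
    sm = sum((m - mm) ** 2 for m in meas) ** 0.5
    return cov / (sp * sm)

def avg_relative_error(pred, meas):
    """Mean of |prediction - measurement| / |measurement|."""
    return sum(abs(p - m) / abs(m) for p, m in zip(pred, meas)) / len(pred)

# Illustrative hardness values (assumed, not the paper's data)
measured = [62.0, 70.5, 78.0, 85.2, 90.1]
predicted = [61.5, 71.0, 77.6, 85.9, 89.5]

r = correlation(predicted, measured)
are = avg_relative_error(predicted, measured)
```

A high R with a low average relative error inside the training range, but a large error outside it, is exactly the pattern the study reports for coarse training data.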

Keywords: adaptive neuro-fuzzy inference system (ANFIS), age hardening, aluminum alloy, metal matrix composite

Procedia PDF Downloads 154
13445 A Study of Anthraquinone Dye Removal by Using Chitosan Nanoparticles

Authors: Pyar S. Jassal, Sonal Gupta, Neema Chand, Rajni Johar

Abstract:

In the present study, low molecular weight chitosan nanoparticles (LMWCNP) were synthesized from low molecular weight chitosan (LMWC) and sodium tripolyphosphate, for the adsorption of anthraquinone dyes from wastewater. The ionic gelation technique was used for this purpose. The size of the nanoparticles was estimated using the Scherrer equation. Absorbance measurements were carried out with a UV-visible spectrophotometer for Acid Green 25 (AG25) and Reactive Blue 4 (RB4) dye solutions at λmax of 644 and 598 nm, respectively. The removal of the dyes was dependent on pH, with optimum adsorption between pH 2 and 9, and the extraction of the dyes was linearly dependent on temperature. The equilibrium parameter RL, calculated using the Langmuir isotherm, shows that adsorption of the dyes on the LMWCNP is favorable. The XRD images of LMWC show a crystalline nature, whereas LMWCNP is amorphous, and thermogravimetric analysis (TGA) shows that LMWCNP is thermally more stable than LMWC. As the contact time increases, the percentage removal of the AG25 and RB4 dyes also increases. TEM images reveal that the LMWCNP were in the size range of 45-50 nm. The adsorption capacity for AG25 dye was 5.23 mg/g on LMWC, compared with 6.83 mg/g on LMWCNP; for RB4 dye, the capacities were 2.30 mg/g on LMWC and 2.34 mg/g on LMWCNP.
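
The two standard formulas invoked above, the Scherrer crystallite size and the Langmuir separation factor RL, are easy to sketch. The XRD peak parameters and the Langmuir constant below are assumed for illustration; only the 30 mg/L initial concentration comes from the study.

```python
import math

def scherrer_size(k, lam_nm, beta_rad, theta_deg):
    """Crystallite size D = K * lambda / (beta * cos(theta)), in nm.
    beta is the peak broadening (FWHM) in radians, theta the Bragg angle."""
    return k * lam_nm / (beta_rad * math.cos(math.radians(theta_deg)))

def langmuir_rl(k_l, c0):
    """Separation factor R_L = 1 / (1 + K_L * C0);
    0 < R_L < 1 indicates favorable adsorption."""
    return 1.0 / (1.0 + k_l * c0)

# Illustrative XRD inputs (assumed): Cu K-alpha (0.15406 nm), shape
# factor 0.9, broadening 0.003 rad at a Bragg angle of 10 degrees.
d = scherrer_size(0.9, 0.15406, 0.003, 10.0)

# C0 = 30 mg/L (from the study) with an assumed K_L of 0.1 L/mg.
rl = langmuir_rl(0.1, 30.0)
```

With these assumed peak parameters the Scherrer estimate lands in the 45-50 nm range reported from TEM, and RL falls between 0 and 1, the favorable-adsorption criterion cited in the abstract.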

Keywords: low molecular weight chitosan nanoparticles, anthraquinone dye, removal efficiency, adsorption isotherm

Procedia PDF Downloads 135
13444 An AK-Chart for the Non-Normal Data

Authors: Chia-Hau Liu, Tai-Yue Wang

Abstract:

Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold, or may be difficult to verify, because in practice not all measurements from manufacturing processes are normally distributed. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism that integrates a one-class classification method with an adaptive technique; the adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, this design provides an easy way to allocate the type I error value, making the chart easier to implement. Finally, a simulation study and real data from industry are used to demonstrate the effectiveness of the proposed control chart.
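
The general idea of a one-class-classification control chart can be illustrated with a minimal distance-based sketch: the monitoring statistic is the distance to the nearest in-control training point, and the control limit is set as an empirical quantile so the type I error α is allocated directly. This is a simplification for illustration, not the adaptive AK-chart proposed in the paper.

```python
import random

random.seed(1)

def nn_distance(x, train):
    """Distance to the nearest in-control training point --
    a simple one-class monitoring statistic."""
    return min(sum((a - b) ** 2 for a, b in zip(x, t)) ** 0.5 for t in train)

# In-control training data: 2-D and deliberately non-normal (uniform)
train = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(200)]

# Control limit: empirical (1 - alpha) quantile of leave-one-out nearest-
# neighbor distances, so the type I error alpha is set directly.
alpha = 0.05
loo = sorted(
    min(sum((a - b) ** 2 for a, b in zip(x, t)) ** 0.5
        for t in train if t is not x)
    for x in train
)
limit = loo[int((1 - alpha) * len(loo))]

def out_of_control(x):
    return nn_distance(x, train) > limit
```

Because the limit is a quantile of the training data itself, no distributional assumption is needed, which is the key advantage for non-normal processes.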

Keywords: multivariate control chart, statistical process control, one-class classification method, non-normal data

Procedia PDF Downloads 423
13443 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals

Authors: Naser Safdarian, Nader Jafarnia Dabanloo

Abstract:

In this paper we use four features, namely the Q-wave integral, QRS complex integral, T-wave integral, and total integral, extracted from normal and patient ECG signals, to detect and localize myocardial infarction (MI) in the left ventricle of the heart. Our research focuses on the detection and localization of MI in the standard 12-lead ECG. We use the Q-wave and T-wave integrals because these features carry important information for the detection of MI. We apply pattern recognition methods such as Artificial Neural Networks (ANN) to detect and localize MI, since these methods achieve good accuracy in classifying normal and abnormal signals. We use a type of Radial Basis Function (RBF) network called the Probabilistic Neural Network (PNN), chosen for its nonlinearity, as well as other classifiers such as k-Nearest Neighbors (KNN), the Multilayer Perceptron (MLP), and Naive Bayes. The PhysioNet database serves as our training and test data. We reach over 80% accuracy on test data for localization and over 95% for detection of MI. The main advantages of our method are its simplicity and good accuracy, and classification accuracy can be further improved by adding more features. In summary, a simple method based on only four features extracted from the standard ECG is presented, with good accuracy in MI localization.
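The four integral features can be sketched as below. The wave boundaries are assumed to come from an external delineation step, and the beat is a synthetic toy signal, not PhysioNet data:

```python
import numpy as np

def wave_integrals(ecg, fs, q_win, qrs_win, t_win):
    """Four integral features for one ECG beat.
    ecg: 1-D samples (mV); fs: sampling rate (Hz);
    *_win: (start, end) sample indices of the Q wave, QRS complex and
    T wave, assumed to come from a prior delineation step."""
    dx = 1.0 / fs
    def integ(a, b):
        return ecg[a:b].sum() * dx  # rectangle-rule area under the wave
    return np.array([integ(*q_win), integ(*qrs_win), integ(*t_win),
                     ecg.sum() * dx])

# Toy synthetic beat: a narrow "QRS" bump plus a broad, lower "T" bump
fs = 250
t = np.arange(0, 0.6, 1 / fs)
beat = np.exp(-(t - 0.15) ** 2 / 2e-4) + 0.3 * np.exp(-(t - 0.40) ** 2 / 2e-3)
feats = wave_integrals(beat, fs, (30, 36), (30, 45), (90, 115))
print(feats)  # [Q, QRS, T, total] integrals
```

The resulting 4-vector per beat would then be fed to any of the classifiers named above (PNN, KNN, MLP, Naive Bayes).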

Keywords: ECG signal processing, myocardial infarction, features extraction, pattern recognition

Procedia PDF Downloads 456
13442 Persistent Bacteremia in Cases of Endodontic Re-Treatments

Authors: Ilma Robo, Manola Kelmendi, Kleves Elezi, Nevila Alliu

Abstract:

The most important step in deciding whether or not to re-treat a tooth endodontically is to identify the reason for the clinical failure. Endodontic re-treatment therefore aims to eliminate the etiology of the pathology, chiefly bacteria remaining in the intra-radicular spaces or the presence of other irritants, which may be not only bacterial toxins but also the elements that keep the bacteria fixed in place, or extra-canal irritants such as filling material extruded beyond the apex. Shortcomings of endodontic treatment can be corrected, where possible, only by endodontic re-treatment, which is first attempted orthograde; if clinical success is again not achieved, it can be performed retrograde, i.e., surgically. Factors working against success are anatomical deformations in the canal network of the tooth roots, the presence of a delta at the apex of the root, and existing isthmuses, all of which can be explained by the anatomical morphology of the endodontic canal system. In fact, even if the causative endodontic bacteria remain isolated, without an exit into the healthy periodontal tissues, this too can count as clinical endodontic success, regardless of whether the isolation was achieved only at exits such as the apex or the accessory canals. Clinical endodontic failure occurs only when bacterial residues emerge, or find an exit, into the healthy periradicular tissues or along the entire length of the canal where the accessory canals exit.

Keywords: endodontic success, E. foecalis, nanoparticles, laser diode, antibacterial, antiseptic

Procedia PDF Downloads 50
13441 Recovery of Copper from Edge Trims of Printed Circuit Boards Using Acidithiobacillus Ferrooxidans: Bioleaching

Authors: Shashi Arya, Nand L. Singh, Samiksha Singh, Pradeep K. Mishra, Siddh N. Upadhyay

Abstract:

The enormous generation of e-waste and its recycling are of great environmental concern, especially in developing countries like India. A major part of this waste comprises printed circuit boards (PCBs). Edge trims of PCBs have a high copper content, ranging between 25-60%. The extraction of various metals from these PCBs is a more or less proven technology, but various hazardous chemicals are used in the resource recovery, resulting in secondary pollution. The current trend in extracting valuable metals is the use of microbial strains, which eliminates the problem of secondary pollutants. Keeping the above context in mind, this work aims at enhanced recovery of copper from edge trims through bioleaching with the bacterial strain Acidithiobacillus ferrooxidans. The raw materials, such as motherboards, hard drives, floppy drives and DVD drives, were obtained from the warehouse of the University. More than 90% of the copper could be extracted through bioleaching with Acidithiobacillus ferrooxidans. Inoculum concentration had an insignificant effect on copper recovery above a 20% inoculum concentration; higher inoculum concentrations offered an advantage only during the initial 2-4 days. Complete recovery was obtained within 14-24 days.

Keywords: acidithiobacillus ferrooxidans, bioleaching, e-waste, printed circuit boards

Procedia PDF Downloads 330
13440 Efficiency of a Semantic Approach in Teaching Foreign Languages

Authors: Genady Shlomper

Abstract:

During the process of language teaching, each teacher faces some general and some specific problems. Some of these problems are common to all languages because they follow from the rules of cognition, consciousness, perception, understanding and memory, i.e., from the physiological and psychological principles pertaining to the human race irrespective of origin and nationality. Still, every language is a distinctive system, possessing individual properties and an obvious identity as a result of development under specific natural, geographical, cultural and historical conditions. These individual properties emerge in the script, the phonetics, the morphology and the syntax. All these problems can and should be the subject of detailed research and scientific analysis, mainly out of practical considerations and language teaching requirements. There are some formidable obstacles in the language acquisition process. Among the first to be mentioned is the existence of concepts, and entire categories, in foreign languages that are absent in the language of the students. Such phenomena reflect specific ways of thinking and world-outlooks which were shaped during their evolution. Hindi is the national language of India and belongs to the Indo-Iranian group of the Indo-European family of languages. The lecturer has gained experience in teaching the Hindi language to native speakers of Uzbek, Russian and Hebrew. He will show the difficulties in the fields of phonetics, morphology and syntax which students have to deal with during acquisition of the language. In the proposed lecture, the lecturer will share his experience of making the process of language teaching more efficient by using a non-formal semantic approach.

Keywords: applied linguistics, foreign language teaching, language teaching methodology, semantics

Procedia PDF Downloads 356
13439 Multivariate Simulations of the Process of Forming the Automotive Connector Forging from ZK60 Alloy

Authors: Anna Dziubinska

Abstract:

The article presents the results of numerical simulations of a new process for forging the automotive connector from a cast preform. The high-strength ZK60 alloy (belonging to the Mg-Zn-Zr group of Mg alloys) was selected for the numerical tests. Currently, this part is produced industrially by multi-stage forging consisting of bending, preforming, and finishing operations. The use of a cast preform would enable forging this component in a single operation. However, obtaining specific mechanical properties requires inducing a certain level of strain within the forged part. Therefore, the design of the preform, its shape, and its volume are of paramount importance. In the work presented in this article, preforms of different shapes were designed and assessed using Finite Element (FE) analysis. The research was funded by the Polish National Agency for Academic Exchange within the framework of the Bekker programme.

Keywords: automotive connector, forging, magnesium alloy, numerical simulation, preform, ZK60

Procedia PDF Downloads 132
13438 A General Framework for Measuring the Internal Fraud Risk of an Enterprise Resource Planning System

Authors: Imran Dayan, Ashiqul Khan

Abstract:

Internal corporate fraud, i.e., fraud carried out by internal stakeholders of a company, affects the well-being of the organisation just like its external counterpart. Even if such an act is carried out for the short-term benefit of a corporation, it is ultimately harmful to the entity in the long run. Internal fraud often relies upon aberrations from usual business processes. Business processes are the lifeblood of a company in the modern managerial context; they are developed and fine-tuned over time as a corporation grows through its life stages. Modern corporations have embraced technological innovations within their business processes, and Enterprise Resource Planning (ERP) systems, sitting at the heart of such processes, are a testimony to that. Since ERP systems record a huge amount of data in their event logs, these logs are a treasure trove for anyone trying to detect fraudulent activities hidden within day-to-day business operations and processes. This research uses the ERP systems in place within corporations to assess the likelihood of prospective internal fraud by developing a framework for measuring fraud risk through process mining techniques, thereby finding risky designs and loose ends within business processes. The framework helps not only in identifying existing cases of fraud in the event log, but also signals the overall riskiness of certain business processes, drawing attention to a redesign of such processes to reduce the chance of future internal fraud while improving internal control within the organisation.
The research adds value by applying the concepts of process mining to the analysis of a modern record of business processes, the ERP event log, and develops a framework that should be useful to internal stakeholders for strengthening internal control, as well as providing external auditors with a tool of use in cases of suspicion. The research proves its usefulness through case studies of large corporations with complex business processes and an ERP in place.
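The core process-mining idea, comparing event-log traces against a normative process model and scoring the process by its share of deviating traces, can be sketched minimally. The event-log schema and activity names below are illustrative assumptions; a real framework would use token replay or alignments rather than an exact-sequence check:

```python
# Toy ERP event log: one ordered activity trace per purchase order
# (schema and activity names are illustrative assumptions)
event_log = {
    "PO-1001": ["create_po", "approve_po", "receive_goods", "pay_invoice"],
    "PO-1002": ["create_po", "approve_po", "receive_goods", "pay_invoice"],
    "PO-1003": ["create_po", "pay_invoice"],                    # approval skipped
    "PO-1004": ["create_po", "approve_po", "pay_invoice", "receive_goods"],
}

NORMATIVE = ["create_po", "approve_po", "receive_goods", "pay_invoice"]

def conformance_flags(log, normative):
    """Flag traces deviating from the normative model. A full framework
    would use token replay or alignments; this exact-sequence check is
    the simplest possible conformance test."""
    return {case: trace != normative for case, trace in log.items()}

def process_risk(log, normative):
    """Overall riskiness of the process: the share of deviating traces."""
    flags = conformance_flags(log, normative)
    return sum(flags.values()) / len(flags)

print(conformance_flags(event_log, NORMATIVE))
print(process_risk(event_log, NORMATIVE))  # 0.5: half the traces deviate
```

Case-level flags point at candidate fraud instances (e.g., a skipped approval), while the aggregate risk score signals which business processes deserve redesign.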

Keywords: enterprise resource planning, fraud risk framework, internal corporate fraud, process mining

Procedia PDF Downloads 335
13437 Prediction of Compressive Strength of Concrete from Early-Age Test Results Using Design of Experiments (RSM)

Authors: Salem Alsanusi, Loubna Bentaher

Abstract:

Response Surface Methods (RSM) provide statistically validated predictive models that can be manipulated to find optimal process configurations. Variation transmitted to responses from poorly controlled process factors can be accounted for by the mathematical technique of propagation of error (POE), which facilitates ‘finding the flats’ on the surfaces generated by RSM. The dual-response approach to RSM captures the standard deviation of the output as well as the average, accounting for unknown sources of variation; dual response plus propagation of error provides a more useful model of overall response variation. In our case, we implemented this technique to predict the compressive strength of concrete at 28 days of age: since waiting 28 days is quite time-consuming, early prediction is important for the quality control process. This paper investigates the potential of using design of experiments (DOE-RSM) to predict the compressive strength of concrete at the 28th day. Data used for this study were obtained from experimental programmes at the University of Benghazi, Civil Engineering Department. A total of 114 data sets were used. The ACI mix design method was utilized for the mix design. No admixtures were used; only the main concrete mix constituents, i.e., cement, coarse aggregate, fine aggregate and water, were utilized in all mixes. Different mix proportions of the ingredients and different water-cement ratios were used. The proposed mathematical models are capable of predicting the required 28-day compressive strength of concrete from early-age results.
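Fitting a second-order response surface of the kind RSM produces reduces to least squares on polynomial terms. The sketch below uses synthetic data standing in for the 114 mixes; the assumed relationship between water-cement ratio, 7-day and 28-day strength is illustrative, not the study's fitted model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the 114 mixes: water-cement ratio and 7-day strength
# (the relationships below are assumed for illustration, not from the study)
n = 114
wc = rng.uniform(0.4, 0.7, n)                                  # w/c ratio
f7 = 30 * np.exp(-2 * (wc - 0.4)) + rng.normal(0, 1, n)        # 7-day strength, MPa
f28 = 1.4 * f7 + 5 - 10 * (wc - 0.5) + rng.normal(0, 1.5, n)   # 28-day strength

# Second-order RSM-style model: f28 ~ 1 + wc + f7 + wc^2 + f7^2 + wc*f7
X = np.column_stack([np.ones(n), wc, f7, wc**2, f7**2, wc * f7])
beta, *_ = np.linalg.lstsq(X, f28, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((f28 - pred) ** 2) / np.sum((f28 - f28.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

The same design matrix, extended with the other mix constituents as factors, would carry the full DOE-RSM analysis, with POE then applied to the fitted surface.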

Keywords: mix proportioning, response surface methodology, compressive strength, optimal design

Procedia PDF Downloads 267
13436 Simultaneous Removal of Arsenic and Toxic Metals from Contaminated Soil: A Pilot-Scale Demonstration

Authors: Juan Francisco Morales Arteaga, Simon Gluhar, Anela Kaurin, Domen Lestan

Abstract:

Contaminated soils are recognized as one of the most pressing global environmental problems. Arsenic (As) is among the most hazardous elements: chronic exposure to arsenic has devastating effects on health, causing cardiovascular diseases, cancer, and eventually death. Pb, Zn and Cd are highly toxic metals that affect almost every organ in the body. With this in mind, new technologies for soil remediation processes are urgently needed. A calcareous soil artificially contaminated with 231 mg kg-1 As and historically contaminated with Pb, Zn and Cd was washed at a 1:1.5 solid-liquid ratio with 90 mM EDTA, 100 mM oxalic acid, and 50 mM sodium dithionite, removing 59, 75, 29, and 53% of As, Pb, Zn, and Cd, respectively. To reduce emissions of residual EDTA and chelated metals from the remediated soil, zero-valent iron (ZVI) was added (1% w/w) to the slurry of the washed soil immediately prior to rinsing. Experimental controls were conducted without the addition of ZVI after remediation. The use of ZVI reduced metal leachability and minimized toxic emissions 21 days after remediation. After this time, NH4NO3 extraction was performed to determine the mobility of the toxic elements in the soil. In addition, the Unified Bioaccessibility Method (UBM) was applied to quantify the bioaccessibility of the metals in simulated human gastric and gastrointestinal phases.

Keywords: soil remediation, soil science, soil washing, toxic metals removal

Procedia PDF Downloads 175
13435 Scaling-Down an Agricultural Waste Biogas Plant Fermenter

Authors: Matheus Pessoa, Matthias Kraume

Abstract:

Scale-down rules in process engineering allow industrial-scale process parameters to be translated to, and refined at, lab scale. Several scale-down rules available in the literature, such as impeller power number, agitation device power input, impeller tip speed, Reynolds number and cavern development, were investigated in order to stipulate the rotational speed at which to operate an 11 L working volume lab-scale bioreactor within industrial process parameters. Herein, xanthan gum was used as a model fluid with a viscosity representative of a hypothetical biogas plant fermentation broth (with sewage sludge and sugar beet pulp as substrates), in a vessel with H/D = 1 and central agitation. The results showed that the cavern development criterion was the best method for establishing a rotational speed for bioreactor operation, while the other rules yielded unrealistic values for the purposes of this study.
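Each scale-down rule fixes one dimensionless or derived quantity across scales and solves for the lab rotational speed. A minimal sketch for two of the rules named above, with purely illustrative impeller dimensions (not the study's values):

```python
import math

def impeller_reynolds(n, d, rho, mu):
    """Impeller Reynolds number Re = rho*n*d^2 / mu
    (n: rotational speed in 1/s, d: impeller diameter in m,
    rho: density in kg/m^3, mu: viscosity in Pa.s)."""
    return rho * n * d**2 / mu

def tip_speed(n, d):
    """Impeller tip speed v = pi*n*d in m/s."""
    return math.pi * n * d

# Constant-tip-speed scale-down, one of the rules compared in the study.
# All numbers are illustrative assumptions, not the study's actual values.
d_full, n_full = 2.0, 0.5        # full-scale: 2 m impeller at 30 rpm
d_lab = 0.12                     # assumed lab impeller for the 11 L vessel
n_lab = n_full * d_full / d_lab  # keeps pi*n*d constant across scales
print(f"lab-scale speed for equal tip speed: {n_lab * 60:.0f} rpm")
```

Matching Reynolds number instead (n scaling with d^-2 rather than d^-1) gives a much higher lab speed, which is one way such rules can produce the "unrealistic values" the study reports.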

Keywords: anaerobic digestion, cavern development, scale down rules, xanthan gum

Procedia PDF Downloads 493
13434 A Multi-Criteria Decision Method for the Recruitment of Academic Personnel Based on the Analytical Hierarchy Process and the Delphi Method in a Neutrosophic Environment

Authors: Antonios Paraskevas, Michael Madas

Abstract:

For a university to maintain its international competitiveness in education, it is essential to recruit high-quality academic staff, as they constitute its most valuable asset. This selection plays a significant role in achieving strategic objectives, particularly by emphasizing a firm commitment to an exceptional student experience and to innovative teaching and learning practices of high quality. In this vein, the appropriate selection of academic staff is a very important factor in the competitiveness, efficiency and reputation of an academic institute. Within this framework, our work presents a comprehensive methodological concept that emphasizes the multi-criteria nature of the problem and shows how decision makers can utilize our approach in order to reach an appropriate judgment. The conceptual framework introduced in this paper is built upon a hybrid neutrosophic method based on the Neutrosophic Analytical Hierarchy Process (N-AHP), which uses the theory of neutrosophic sets and is considered suitable given the significant degree of ambiguity and indeterminacy observed in the decision-making process. To this end, our framework extends the N-AHP by incorporating the Neutrosophic Delphi Method (N-DM). By applying the N-DM, we can take into consideration the importance of each decision-maker and their preferences per evaluation criterion. To the best of our knowledge, the proposed model is the first to apply the Neutrosophic Delphi Method to the selection of academic staff. As a case study, we applied our method to a real problem of academic personnel selection, with the main goal of enhancing the algorithm proposed in previous work, and thus addressing the inherent ineffectiveness that becomes apparent in traditional multi-criteria decision-making methods when dealing with such situations.
As a further result, we show that our method demonstrates greater applicability and reliability when compared to other decision models.
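The classical (crisp) AHP core that the neutrosophic extension builds on can be sketched briefly: the priority vector is the principal eigenvector of the pairwise comparison matrix. The judgments below are illustrative crisp values, not the paper's neutrosophic data:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical selection criteria,
# e.g. research record vs teaching vs service (illustrative judgments)
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

def ahp_priorities(A, iters=100):
    """Classical AHP priority vector: the principal eigenvector of the
    comparison matrix, found by power iteration, normalised to sum to 1."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    return w

w = ahp_priorities(A)
lam_max = (A @ w / w).mean()             # principal eigenvalue estimate
ci = (lam_max - len(A)) / (len(A) - 1)   # consistency index (0 = consistent)
print(np.round(w, 3), round(ci, 4))
```

The N-AHP replaces each crisp entry of A with a neutrosophic triple (truth, indeterminacy, falsity) before deneutrosophication; the N-DM then weights each decision-maker's matrix in the aggregation.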

Keywords: analytical hierarchy process, delphi method, multi-criteria decision-making method, neutrosophic set theory, personnel recruitment

Procedia PDF Downloads 200
13433 Ideological Manipulations and Cultural-Norm Constraints

Authors: Masoud Hassanzade Novin, Bahloul Salmani

Abstract:

Translation cannot be considered a simple linguistic act. With the rise of the descriptive approach in the late 1970s and 1980s, the study of the translation process came to address social aspects as well as linguistic ones. Viewing translation as cross-cultural communication operating under ideological and cultural constraints, a contrastive analysis was conducted in this paper to reveal the distortions imposed on translated texts. The corpus of the study comprised the novel 1984 by George Orwell and its Persian translations, which were analyzed through qualitative research based on critical discourse analysis (CDA), Toury's norms, and Lefevere's concept of ideology. The results revealed that ideology and cultural constraints act as important stimuli that can control the process of translation.

Keywords: critical discourse analysis, ideology, norms, translated texts

Procedia PDF Downloads 336