Search results for: standard procedures process

19753 Extraction, Characterization and Application of Natural Dyes from the Fresh Rind of Index Colour 5 Mangosteen (Garcinia mangostana L.)

Authors: Basitah Taif

Abstract:

This study explored the fresh rind of mangosteen Index Colour 5 as a prospective raw material for the production of natural dyes. Rind from fresh mangosteen Index Colour 5 was used to extract the dyes. The extracts were tested on silk fabrics via three types of mordanting and dyeing procedures: pre-mordanting, simultaneous mordanting and post-mordanting. The application of the freeze-drying methodology and mechanizable equipment helped to produce an excellent range of natural colours. Silk fabric mordanted and dyed simultaneously with the Index Colour 5 extract produced a brilliant shade of red, although the colour from this index was also found to be sensitive to light and washing during the fastness tests. The preliminary evaluation and instrumentation analysis allowed us to examine whether applying different mordanting and dyeing procedures with the same extract samples and concentrations affected the colours and shades of the fabric samples.

Keywords: natural dye, freeze-drying, Garcinia mangostana Linn, mordanting

Procedia PDF Downloads 453
19752 An Evaluation of the Methodology for Manufacturing High Performance Organophilic Clay via the Most Efficient and Cost-Effective Process

Authors: Siti Nur Izati Azmi, Zatil Afifah Omar, Kathi Swaran, Navin Kumar

Abstract:

Organophilic clays, also known as organoclays, are used as viscosifiers in oil-based drilling fluids. Most often, organophilic clays are produced from modified sodium- and calcium-based bentonite. Many studies and data show that organophilic clays based on hectorite provide the best yield and good fluid-loss properties in an oil-based drilling fluid, at a higher cost. In terms of the manufacturing process, the two common methods of producing organophilic clays are a wet process and a dry process. The wet process is known to produce a better-performing product at a higher cost, while the dry process shortens the production time. Hence, the purpose of this study is to evaluate various formulations of an organophilic clay and their performance versus cost, as well as to determine the most efficient and cost-effective method of manufacturing organophilic clays.

Keywords: organophilic clay, viscosifier, wet process, dry process

Procedia PDF Downloads 211
19751 H.263 Based Video Transceiver for Wireless Camera System

Authors: Won-Ho Kim

Abstract:

In this paper, a design of an H.263-based wireless video transceiver is presented for a wireless camera system. It uses a standard Wi-Fi transceiver, and the coverage area is up to 100 m. Furthermore, the standard H.263 video encoding technique is used for video compression, since the wireless video transmitter is unable to transmit high-capacity raw data in real time; the implemented system is capable of streaming NTSC 720x480 video at less than 1 Mbps.
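
The bandwidth argument can be made concrete with a back-of-envelope calculation. The sketch below is not from the paper; it assumes 30 fps and 16-bit/pixel YUV 4:2:2 sampling, and shows why a ~1 Mbps wireless link cannot carry raw NTSC 720x480 video:

```python
# Back-of-envelope check (not from the paper): why raw NTSC video cannot be
# streamed over a ~1 Mbps wireless link.

width, height = 720, 480          # NTSC frame size quoted in the abstract
fps = 30                          # nominal NTSC frame rate (assumption)
bits_per_pixel = 16               # YUV 4:2:2 sampling (assumption)

raw_bps = width * height * bits_per_pixel * fps
print(f"raw bitrate: {raw_bps / 1e6:.1f} Mbps")               # ~165.9 Mbps

target_bps = 1e6                  # streaming budget from the abstract
print(f"required compression: {raw_bps / target_bps:.0f}:1")  # ~166:1
```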

Keywords: wireless video transceiver, video surveillance camera, H.263 video encoding, digital signal processing

Procedia PDF Downloads 354
19750 Impact of Calcium Carbide Waste Dumpsites on Soil Chemical and Microbial Characteristics

Authors: C. E. Ihejirika, M. I. Nwachukwu, R. F. Njoku-Tony, O. C. Ihejirika, U. O. Enwereuzoh, E. O. Imo, D. C. Ashiegbu

Abstract:

Disposal of industrial solid wastes in the environment is a major environmental challenge. This study investigated the effects of calcium carbide waste dumpsites on soil quality. Soil samples were collected with a hand auger from three different dumpsites at varying depths and made into composite samples. Samples were subjected to standard analytical procedures. pH varied from 10.38 to 8.28, nitrate from 5.6 mg/kg to 9.3 mg/kg, and phosphate from 8.8 mg/kg to 12.3 mg/kg, while calcium carbide decreased from 10% to 3%. Calcium carbide was absent in control soil samples. Bacterial counts from the dumpsites ranged from 1.8 × 10⁵ cfu/g to 2.5 × 10⁵ cfu/g, while fungal counts ranged from 0.8 × 10³ cfu/g to 1.4 × 10³ cfu/g. Bacterial isolates included Pseudomonas spp., Flavobacterium spp., and Achromobacter spp., while fungal isolates included Penicillium notatum, Aspergillus niger, and Rhizopus stolonifer. No organism was isolated from the dumpsites at a soil depth of 0-15 cm, while there were isolates from other soil depths. Toxicity might be due to the alkaline condition of the dumpsites. Calcium carbide might be bactericidal and fungicidal, disrupting cellular physiology and leading to growth retardation, death, a general loss of biodiversity and a reduction of ecosystem processes. Detoxification of calcium carbide waste before disposal on soil might be the best management option.

Keywords: biodiversity, calcium-carbide, denitrification, toxicity

Procedia PDF Downloads 535
19749 Implementation of Synthesis and Quality Control Procedures of ¹⁸F-Fluoromisonidazole Radiopharmaceutical

Authors: Natalia C. E. S. Nascimento, Mercia L. Oliveira, Fernando R. A. Lima, Leonardo T. C. do Nascimento, Marina B. Silveira, Brigida G. A. Schirmer, Andrea V. Ferreira, Carlos Malamut, Juliana B. da Silva

Abstract:

Tissue hypoxia is a common characteristic of solid tumors, leading to decreased sensitivity to radiotherapy and chemotherapy. In the clinical context, tumor hypoxia assessment employing the positron emission tomography (PET) tracer ¹⁸F-fluoromisonidazole ([¹⁸F]FMISO) helps physicians in planning and adjusting therapy. The aim of this work was to implement the synthesis of [¹⁸F]FMISO in a TRACERlab® MXFDG module and to establish the quality control procedure. [¹⁸F]FMISO was synthesized at Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN/Brazil) using an automated synthesizer (TRACERlab® MXFDG, GE) adapted for the production of [¹⁸F]FMISO. The FMISO chemical standard was purchased from ABX. ¹⁸O-enriched water was acquired from Center of Molecular Research. Reagent kits containing eluent solution, acetonitrile, ethanol, 2.0 M HCl solution, buffer solution, water for injections and the [¹⁸F]FMISO precursor (dissolved in 2 ml acetonitrile) were purchased from ABX. The [¹⁸F]FMISO samples were purified by the solid phase extraction method. The quality requirements of [¹⁸F]FMISO are established in the European Pharmacopeia. According to that reference, quality control of [¹⁸F]FMISO should include appearance, pH, radionuclidic identity and purity, radiochemical identity and purity, chemical purity, residual solvents, bacterial endotoxins, and sterility. The duration of the synthesis process was 53 min, the radiochemical yield was (37.00 ± 0.01)%, and the specific activity was more than 70 GBq/µmol. The syntheses were reproducible and showed satisfactory results. Regarding the quality control analysis, the samples were clear and colorless at pH 6.0. The emission spectrum, measured using a high-purity germanium (HPGe) detector, presented a single peak at 511 keV, and the half-life, determined by the decay method in an activimeter, was (111.0 ± 0.5) min, indicating no presence of radioactive contaminants besides the desired radionuclide (¹⁸F). The samples showed a tetrabutylammonium (TBA) concentration < 50 μg/mL, assessed by visual comparison to a TBA standard applied to the same thin layer chromatographic plate. Radiochemical purity was determined by high performance liquid chromatography (HPLC) and the results were 100%. Regarding the residual solvents tested, ethanol and acetonitrile presented concentrations lower than 10% and 0.04%, respectively. Healthy female mice were injected via the lateral tail vein with [¹⁸F]FMISO; microPET imaging studies (15 min) were performed 2 h post injection (p.i.), and the biodistribution was analyzed at five time points (30, 60, 90, 120 and 180 min) after injection. Subsequently, organs/tissues were assayed for radioactivity with a gamma counter. All quality control parameters met the quality criteria, confirming that [¹⁸F]FMISO is suitable for use in non-clinical and clinical trials, following the legal requirements for the production of new radiopharmaceuticals in Brazil.
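
The decay method mentioned above boils down to fitting an exponential to successive activity readings. The following sketch uses made-up activimeter readings (not the paper's data) to recover a half-life and compare it with the 109.77 min half-life of ¹⁸F:

```python
# Illustration of the "decay method" for radionuclidic identity: fit
# A(t) = A0 * exp(-lambda * t) to activimeter readings and compare the
# fitted half-life with the 109.77 min half-life of F-18. The readings
# below are synthetic, not the paper's data.
import numpy as np

t = np.array([0.0, 30.0, 60.0, 90.0, 120.0])        # minutes
A = np.array([1000.0, 828.0, 685.0, 567.0, 470.0])  # MBq (synthetic)

lam = -np.polyfit(t, np.log(A), 1)[0]   # slope of ln(A) vs t gives -lambda
half_life = np.log(2) / lam
print(f"fitted half-life: {half_life:.1f} min")     # ~110 min for these readings
```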

Keywords: automatic radiosynthesis, hypoxic tumors, pharmacopeia, positron emitters, quality requirements

Procedia PDF Downloads 181
19748 Evaluation of Ensemble Classifiers for Intrusion Detection

Authors: M. Govindarajan

Abstract:

One of the major developments in machine learning in the past decade is the ensemble method, which finds a highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing; their performance is analyzed in terms of accuracy. A classifier ensemble is designed using radial basis function (RBF) and support vector machine (SVM) classifiers as base classifiers. The feasibility and the benefits of the proposed approaches are demonstrated by means of standard intrusion detection datasets. The main originality of the proposed approach lies in three main parts: a preprocessing phase, a classification phase, and a combining phase. A wide range of comparative experiments is conducted on standard intrusion detection datasets. The performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared to that of other standard ensemble methods: error-correcting output codes (ECOC) and Dagging among the homogeneous methods, and majority voting and stacking among the heterogeneous methods. The proposed ensemble methods provide a significant improvement in accuracy compared to individual classifiers; the proposed bagged RBF and SVM perform significantly better than ECOC and Dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Also, heterogeneous models exhibit better results than homogeneous models on standard intrusion detection datasets.
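
As a rough illustration of the homogeneous (bagging) part of the approach, the sketch below builds a bagged ensemble of RBF-kernel SVMs with scikit-learn on a synthetic stand-in for the intrusion detection datasets; it is not the authors' implementation:

```python
# A sketch (not the authors' implementation) of the homogeneous ensemble:
# bagging with RBF-kernel SVM base classifiers on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 'estimator=' is named 'base_estimator=' on scikit-learn < 1.2
bagged_svm = BaggingClassifier(estimator=SVC(kernel="rbf"),
                               n_estimators=10, random_state=0)
bagged_svm.fit(X_tr, y_tr)
print(f"bagged RBF-SVM accuracy: {bagged_svm.score(X_te, y_te):.3f}")
```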

Keywords: data mining, ensemble, radial basis function, support vector machine, accuracy

Procedia PDF Downloads 238
19747 Using the Bootstrap for Problems in Statistics

Authors: Brahim Boukabcha, Amar Rebbouh

Abstract:

The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article we present a theoretical study of the different bootstrapping methods and the use of the re-sampling technique in statistical inference to calculate the standard error of an estimator of the mean and to determine a confidence interval for an estimated parameter. We apply these methods to regression models and the Pareto model, obtaining the best approximations.
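
A minimal sketch of the basic nonparametric bootstrap described here, estimating the standard error of the sample mean and a 95% percentile confidence interval (synthetic data, for illustration only):

```python
# Basic nonparametric bootstrap: resample with replacement, estimate the
# standard error of the sample mean, and form a 95% percentile interval.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=100)   # stand-in data

B = 5000                                        # bootstrap replicates
boot_means = np.array([rng.choice(sample, size=sample.size).mean()
                       for _ in range(B)])      # choice() resamples with replacement

se = boot_means.std(ddof=1)                     # bootstrap standard error
lo, hi = np.percentile(boot_means, [2.5, 97.5]) # percentile confidence interval
print(f"mean={sample.mean():.3f}  SE={se:.3f}  95% CI=({lo:.3f}, {hi:.3f})")
```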

Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models

Procedia PDF Downloads 373
19746 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit

Authors: Davit Mirzoyan, Ararat Khachatryan

Abstract:

A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increasing its detection accuracy and decreasing its power consumption and area. Due to its simplicity, the presented circuit can easily be modified to monitor parametric variations of only n-type or p-type MOS (NMOS and PMOS, respectively) transistors, resistors, or their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e. its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.

Keywords: detection, monitoring, process corner, process variation

Procedia PDF Downloads 512
19745 Comparative Analysis of Various Waste Oils for Biodiesel Production

Authors: Olusegun Ayodeji Olagunju, Christine Tyreesa Pillay

Abstract:

Biodiesel from waste sources is regarded as an economical and viable alternative to depleting fossil fuels. In this work, biodiesel was produced by the transesterification method from three different sources of waste cooking oil collected from cafeterias, all vegetable-based. The free fatty acid contents (% FFA) of the feedstocks were determined successfully by the titration method. The results for sources 1, 2, and 3 were 0.86%, 0.54% and 0.20%, respectively. The three variables considered in this process were temperature, reaction time, and catalyst concentration within the following ranges: 50 °C - 70 °C, 30 min - 90 min, and 0.5% - 1.5% catalyst. The produced biodiesel was characterized using ASTM standard methods for biodiesel property testing to determine the fuel properties, including kinematic viscosity, specific gravity, flash point, pour point, cloud point, and acid number. The results obtained indicate that the biodiesel yield from source 3 was greater than that of the other sources. All fuel properties of the produced biodiesel are within the standard biodiesel fuel specifications of ASTM D6751. The optimum yields of biodiesel were 98.76%, 96.4%, and 94.53% from source 3, source 2, and source 1, respectively, at the optimum operating variables of 65 °C temperature, 90 minutes reaction time, and 0.5 wt% potassium hydroxide.
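
The titration-based FFA determination follows the standard formula %FFA (as oleic acid) = (mL NaOH × normality × 28.2) / sample mass in grams. A small worked example with illustrative numbers (not the paper's measurements):

```python
# Standard titration formula for free fatty acid content, expressed as
# % FFA (as oleic acid, M = 282 g/mol). Sample numbers are illustrative.

def percent_ffa(v_naoh_ml: float, normality: float, sample_g: float) -> float:
    """%FFA = (mL NaOH * N * 28.2) / sample mass in grams."""
    return v_naoh_ml * normality * 28.2 / sample_g

# e.g. 1.5 mL of 0.1 N NaOH neutralising a 5 g oil sample:
print(f"{percent_ffa(1.5, 0.1, 5.0):.2f} % FFA")  # 0.85 %, the magnitude reported for source 1
```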

Keywords: waste cooking oil, biodiesel, free fatty acid content, potassium hydroxide catalyst, optimization analysis

Procedia PDF Downloads 65
19744 An Automatic Speech Recognition of Conversational Telephone Speech in Malay Language

Authors: M. Draman, S. Z. Muhamad Yassin, M. S. Alias, Z. Lambak, M. I. Zulkifli, S. N. Padhi, K. N. Baharim, F. Maskuriy, A. I. A. Rahim

Abstract:

The performance of a Malay automatic speech recognition (ASR) system for the call centre environment is presented. The system utilizes the Kaldi toolkit as the platform for the entire library and the algorithms used in performing the ASR task. The acoustic model implemented in this system uses a deep neural network (DNN) to model the acoustic signal, and a standard (n-gram) model is used for language modelling. With 80 hours of training data from call centre recordings, the ASR system achieves 72% accuracy, which corresponds to a 28% word error rate (WER). The testing was done using 20 hours of audio data. Despite the implementation of a DNN, the system shows low accuracy owing to the variety of noise, accents and dialects that typically occur in the Malaysian call centre environment. This significant variation across speakers is reflected by the large standard deviation of the average word error rate (WERav) (i.e., ~10%). The lowest WER (13.8%) was obtained from a recording sample of a native speaker with a standard Malay dialect (central Malaysia), compared with the highest WER of 49% for a sample containing conversation of a speaker using a non-standard Malay dialect.
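
WER is the ratio of the word-level edit distance (substitutions, insertions, deletions) to the reference length, so the quoted 72% accuracy corresponds to WER = 0.28. A textbook implementation, not the authors' Kaldi scoring script:

```python
# Word error rate as the word-level Levenshtein distance divided by the
# reference length; accuracy = 1 - WER, matching the 72%/28% split above.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[-1][-1] / len(ref)

print(wer("saya pergi ke pejabat", "saya pergi pejabat"))  # 0.25 (one deletion)
```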

Keywords: conversational speech recognition, deep neural network, Malay language, speech recognition

Procedia PDF Downloads 312
19743 'Light up for All': Building Knowledge on Universal Design through Direct User Contact in Design Workshops

Authors: E. Ielegems, J. Herssens, J. Vanrie

Abstract:

Designers require knowledge and data about a diversity of users throughout the design process to create inclusive design solutions which are usable, understandable and desirable by everyone. Besides understanding users' needs and expectations, the ways in which users perceive and experience the built environment contain valuable knowledge for architects. Since users' perceptions and experiences are mainly tacit by nature, they are much more difficult to express in words and therefore more difficult to externalise. Nevertheless, literature confirms the importance of articulating embodied knowledge from users throughout the design process. Hence, more insight is needed into the ways architects can build knowledge on Universal Design through direct user contact. In a project called 'light up for all', architecture students were asked to design a light switch and socket that are elegant, usable and understandable to the greatest extent possible by everyone. Two workshops with user/experts were organised in the first stages of the design process, in which students could gain insight into users' experiences through direct contact. Three data collection techniques were used to analyse the teams' design processes. First, students were asked to keep a design diary, reporting design activities, personal experiences, and thoughts about users throughout the design process. Second, one of the authors observed the workshops, taking field notes. Finally, focus groups were conducted with the design teams after the design process was finished. By analysing the collected qualitative data, we first identify the design aspects that make the teams' proposals more inclusive than standard design solutions. For this paper, we specifically focus on aspects that externalise embodied user knowledge from users' experiences. Subsequently, we look at designers' approaches to learning about these specific aspects throughout the design process. Results show that in some situations designers perceive contradicting knowledge between observations and verbal conversations, which shows the value of direct user contact. Additionally, findings give indications of the values and limitations of working with selected prototypes as 'boundary objects' when externalising users' experiences. These insights may help researchers to better understand designers' processes of eliciting embodied user knowledge. In this way, research can offer more effective support to architects, which may result in better incorporation of users' experiences so that the built environment can gradually become more inclusive for all.

Keywords: universal design, architecture, design process, embodied user knowledge

Procedia PDF Downloads 131
19742 Optimizing Productivity and Quality through the Establishment of a Learning Management System for an Agency-Based Graduate School

Authors: Maria Corazon Tapang-Lopez, Alyn Joy Dela Cruz Baltazar, Bobby Jones Villanueva Domdom

Abstract:

The requirement for an organization implementing a quality management system to sustain its compliance and its commitment to continuous improvement is ever higher. It is expected that the offices and units have high and consistent compliance with the established processes and procedures. The Development Academy of the Philippines (DAP) has been operating under project management, for which it holds a quality management certification. To further realize its mandate as a think-tank and capacity builder of the government, DAP expanded its operations and started to grant graduate degrees through its Graduate School of Public and Development Management (GSPDM). As the academic arm of the Academy, GSPDM offers graduate degree programs on public management and on productivity and quality, aligned to the institutional thrusts. For a time, the documented procedures and processes of project management seemed to fit the Graduate School. However, there has been significant growth in the operations of the GSPDM in terms of the graduate programs offered, which directly increases the number of students. There is an apparent necessity to align the project management system into a more educational system; otherwise it will no longer be responsive to the developments that are taking place. The GSPDM strongly advocates and encourages its students to pursue internal and external improvement to cope with the challenges of providing quality service to their own clients and to the country. If innovation does not take root in the grounds of GSPDM, how will it serve the purpose of "walking the talk"? This research was conducted to assess the diverse flows of the existing internal operations and processes of DAP's project management and GSPDM's school management, as a basis for developing a system that harmonizes them into one: the Learning Management System (LMS). The study documented the existing process of GSPDM following the project management phases of conceptualization and development, negotiation and contracting, mobilization, implementation, and closure in flow charts of the key activities. The primary sources of information were the different groups involved in the delivery of graduate programs: the executive, the learning management team and the administrative support offices. The LMS shall capture the unique and critical processes of the GSPDM as a degree-granting unit of the Academy. The LMS is the harmonized project management and school management system that shall serve as the standard system and procedure for all programs within the GSPDM. The unique processes cover the three important areas of school management: students, curriculum, and faculty. The required processes of these main areas, such as enrolment, course syllabus development, and faculty evaluation, were appropriately placed within the phases of the project management system. Further, the research identified critical reports and generated manageable documents and records to ensure accurate, consistent and reliable information. The researchers made an in-depth review of the DAP-GSPDM mandate, analyzed the various documents, and conducted a series of focused group discussions. A comprehensive review of the existing flow chart system and of various models of school management systems was also made. The final output of the research is a work instructions manual that will be presented to the Academy's Quality Management Council and will eventually form an additional scope for ISO certification. The manual shall include documented forms, iterative flow charts and a program Gantt chart, alongside which automated systems will be developed in parallel.

Keywords: productivity, quality, learning management system, agency-based graduate school

Procedia PDF Downloads 311
19741 Enhancing Model Interoperability and Reuse by Designing and Developing a Unified Metamodel Standard

Authors: Arash Gharibi

Abstract:

Mankind has always used models to solve problems. Essentially, models are simplified versions of reality, and the need for them stems from having to deal with complexity; many processes or phenomena are too complex to be described completely. Thus, a fundamental model requirement is that it contains the characteristic features that are essential in the context of the problem to be solved or described. Models are used in virtually every scientific domain to deal with various problems. During recent decades, the number of models has increased exponentially. Publication of models as part of original research has traditionally been in scientific periodicals, series, monographs, agency reports, national journals and laboratory reports. This makes it difficult for interested groups and communities to stay informed about the state of the art. During the modeling process, many important decisions are made which impact the final form of the model. Without a record of these considerations, the final model remains ill-defined and open to varying interpretations. Unfortunately, the details of these considerations are often lost, and where information about a model does exist, it is likely to be written intuitively, in different layouts and in different degrees of detail. To overcome these issues, different domains have attempted to implement their own approaches to preserving model information in the form of model documentation. The most frequently cited model documentation approaches are domain-specific and not applicable to existing models, and they do not allow for evolutionary flexibility or intrinsic corrections and improvements. These issues all stem from a lack of unified standards for model documentation. As a way forward, this research proposes a new standard for capturing and managing model information in a unified way, so that interoperability and reusability of models become possible. This standard will also be evolutionary, meaning that members of the modeling realm can contribute to its ongoing development and improvement. In this paper, the three most common current metamodels are reviewed and, based on the pros and cons of each, a new metamodel is proposed.

Keywords: metamodel, modeling, interoperability, reuse

Procedia PDF Downloads 188
19740 Comprehensive Assessment of Energy Efficiency within the Production Process

Authors: S. Kreitlein, N. Eder, J. Franke

Abstract:

The importance of energy efficiency within the production process is steadily increasing. Unfortunately, no tools for a comprehensive assessment of energy efficiency within the production process have existed so far. Therefore, the Institute for Factory Automation and Production Systems of the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency: the Energy Efficiency Value (EEV) and the Energetic Process Efficiency (EPE). This paper describes the basics and the state of the art as well as the developed approaches.

Keywords: energy efficiency, energy efficiency value, energetic process efficiency, production

Procedia PDF Downloads 718
19739 SMART: Solution Methods with Ants Running by Types

Authors: Nicolas Zufferey

Abstract:

Ant algorithms are well-known metaheuristics which have been widely used for two decades. In most of the literature, an ant is a constructive heuristic able to build a solution from scratch. However, other types of ant algorithms have recently emerged: the discussion is thus not limited to the common framework of constructive ant algorithms. Generally, at each generation of an ant algorithm, each ant builds a solution step by step by adding an element to it. Each choice is based on the greedy force (also called the visibility, the short-term profit or the heuristic information) and the trail system (a central memory which collects historical information about the search process). Usually, all the ants of the population have the same characteristics and behaviors. In contrast, in this paper a new type of ant metaheuristic is proposed, namely SMART (Solution Methods with Ants Running by Types). It relies on the use of different populations of ants, where each population has its own personality.
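
A common concrete form of the construction step just described draws the next element with probability proportional to (trail)^alpha × (greedy force)^beta, as in classic ant colony optimization. A minimal sketch under that assumption (the per-type behaviors of SMART are not reproduced here):

```python
# Construction step under the classic ACO assumption: the next element is
# drawn with probability proportional to trail**alpha * greedy_force**beta.
import random

def choose_element(candidates, tau, eta, alpha=1.0, beta=2.0):
    """Pick the next solution element for one ant."""
    weights = [tau[j] ** alpha * eta[j] ** beta for j in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

tau = {0: 1.0, 1: 1.0, 2: 1.0}   # trail system (central memory)
eta = {0: 0.2, 1: 0.5, 2: 0.9}   # greedy force of each candidate
print(choose_element([0, 1, 2], tau, eta))  # element 2 is the most likely pick
```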

Keywords: ant algorithms, evolutionary procedures, metaheuristics, optimization, population-based methods

Procedia PDF Downloads 358
19738 Thermoluminescence Characteristic of Nanocrystalline BaSO4 Doped with Europium

Authors: Kanika S. Raheja, A. Pandey, Shaila Bahl, Pratik Kumar, S. P. Lochab

Abstract:

The subject of this paper is the study of a BaSO₄ nanophosphor doped with europium, in which mainly the concentration of the rare earth impurity Eu (0.05, 0.1, 0.2, 0.5, and 1 mol%) has been varied. A comparative study of the thermoluminescence (TL) properties of the given nanophosphor has also been done using a well-known standard dosimetry material, TLD-100. First, a number of samples were prepared successfully by the chemical co-precipitation method. The whole lot was then compared to the well-established standard material (TLD-100) for TL sensitivity. BaSO₄:Eu (0.2 mol%) showed the highest sensitivity of the lot. It was also found that, compared to the standard TLD-100, BaSO₄:Eu (0.2 mol%) showed surprisingly high sensitivity over a large range of doses. The TL response curve of all prepared samples has also been studied over a wide range of gamma radiation doses, i.e. 10 Gy to 2 kGy. Almost all the BaSO₄:Eu samples showed remarkable linearity over a broad range of doses, which is a characteristic feature of a fine TL dosimeter; the curve remained linear even beyond 1 kGy. Thus, the given nanophosphor has been successfully optimised with respect to the dopant concentration to achieve its highest TL sensitivity. Further, the comparative study with the standard material revealed that the optimised sample shows astonishingly better TL sensitivity and a phenomenal linear response curve over an incredibly wide range of gamma radiation (Co-60) doses compared to the standard TLD-100, which makes the optimised BaSO₄:Eu quite promising as an efficient gamma radiation dosimeter. Lastly, the present phosphor has been optimised for its annealing temperature to acquire the best results, and its fading and reusability properties have also been studied.

Keywords: gamma radiation, nanoparticles, radiation dosimetry, thermoluminescence

Procedia PDF Downloads 420
19737 Modified CUSUM Algorithm for Gradual Change Detection in a Time Series Data

Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin

Abstract:

The main objective in a change detection problem is to develop algorithms for the efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series data. In this paper, we present a modified cumulative sum (MCUSUM) algorithm to detect the start and the end of a time-varying linear drift in the mean value of time series data, based on a likelihood ratio test procedure. The design, implementation and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures. An approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart, designed for abrupt shift detection, using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart for detecting a gradual change in the mean. The algorithm is then applied and tested on randomly generated time series data with a gradual linear trend in the mean to demonstrate its usefulness.
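
For reference, the standard one-sided CUSUM recursion that the MCUSUM modifies is S_t = max(0, S_{t-1} + x_t − μ₀ − k), with an alarm when S_t exceeds a threshold h. A minimal sketch on synthetic data with a gradual linear drift (illustrative only; the paper's MCUSUM and threshold approximation are not reproduced here):

```python
# Standard one-sided CUSUM on synthetic data with a gradual linear drift.
# k is the allowance and h the alarm threshold; both trade off against the
# average run length.
import numpy as np

def cusum_alarm(x, mu0, k, h):
    """Return the index of the first alarm, or None."""
    s = 0.0
    for t, xt in enumerate(x):
        s = max(0.0, s + xt - mu0 - k)
        if s > h:
            return t
    return None

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 200)
x[100:] += 0.05 * np.arange(100)               # linear drift in mean from t = 100
print(cusum_alarm(x, mu0=0.0, k=0.5, h=5.0))   # alarms some time after t = 100
```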

Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test

Procedia PDF Downloads 283
19736 Using Squeezed Vacuum States to Enhance the Sensitivity of Ground Based Gravitational Wave Interferometers beyond the Standard Quantum Limit

Authors: Giacomo Ciani

Abstract:

This paper reviews the impact of quantum noise on modern gravitational wave interferometers and explains how squeezed vacuum states are used to push the noise below the standard quantum limit. With the first detection of gravitational waves from a pair of colliding black holes in September 2015, and subsequent detections including gravitational waves from a pair of colliding neutron stars, the ground-based interferometric gravitational wave observatories LIGO and VIRGO have opened the era of gravitational-wave and multi-messenger astronomy. Improving the sensitivity of the detectors is of paramount importance for increasing the number and quality of the detections and fully exploiting this new information channel about the universe. Although still in the commissioning phase and not at nominal sensitivity, these interferometers are designed to be ultimately limited by a combination of shot noise and quantum radiation pressure noise, which together define an envelope known as the standard quantum limit. Despite the name, this limit can be beaten with advanced quantum measurement techniques, of which the use of squeezed vacuum states is currently the most mature and promising. Different strategies for implementing the technology in the large-scale detectors, in both its frequency-independent and frequency-dependent variations, are presented, together with an analysis of the main technological issues and the expected sensitivity gain.

Keywords: gravitational waves, interferometers, squeezed vacuum, standard quantum limit

Procedia PDF Downloads 142
19735 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite in lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans, such as those with lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor transformations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are based on the obtained new term, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the minimum cut/max-flow algorithm. Experimental results show that the proposed method is accurate and efficient, and can directly provide explicit lung regions without any post-processing operations, in contrast to the standard method.
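
A minimal sketch of a patch-based boundary weight of the kind described: the weight between two neighbouring pixels is a Gaussian of the mean squared difference between the patches centred on them. The exact metric and parameters used in the paper may differ:

```python
# Patch-based boundary weight: compare whole patches around two neighbouring
# pixels instead of the two pixel values alone (illustrative form only).
import numpy as np

def patch_weight(img, p, q, radius=2, sigma=10.0):
    """Similarity weight between neighbouring pixels p and q."""
    def patch(c):
        y, x = c
        return img[y - radius:y + radius + 1, x - radius:x + radius + 1]
    d2 = np.mean((patch(p) - patch(q)) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

img = np.tile(np.arange(64, dtype=float), (64, 1))  # smooth horizontal ramp
print(patch_weight(img, (10, 10), (10, 11)))        # ~1.0: very similar patches
```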

Keywords: graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric

Procedia PDF Downloads 160
19734 Towards Incorporating Context Awareness into Business Process Management

Authors: Xiaohui Zhao, Shahan Mafuz

Abstract:

Context-aware technologies provide system applications with awareness of environmental conditions, customer behaviour, object movements, etc. Further, with such capability, system applications can adapt their responses intelligently to changing conditions. Concerning business operations, this promises that business processes can run more intelligently, adaptively and flexibly, and thereby improve customer experience, enhance the reliability of service delivery, or lower operational cost, making the business more competitive and sustainable. Aiming at realizing such context-aware business process management, this paper first explores its potential benefits and then identifies some gaps between current business process management support and what is expected. In addition, some preliminary solutions are discussed, covering context definition, rule-based process execution, run-time process evolution, etc. A framework is also presented to give a conceptual architecture of a context-aware business process management system and to guide system implementation.

Keywords: business process adaptation, business process evolution, business process modelling, and context awareness

Procedia PDF Downloads 403
19733 Evaluating Forecasts Through Stochastic Loss Order

Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio

Abstract:

We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is the way it is customarily assessed in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. In spite of the fact that loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests, and are robust to the correlation, autocorrelation and heteroskedasticity settings they consider. In addition, since our proposals do not require samples of the same size, their scope is wider; and because they test the whole loss distribution rather than just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
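
A simple empirical illustration of first-order stochastic loss order (not the authors' test): procedure A is preferred to B when A's empirical loss CDF lies on or above B's everywhere; note the two samples need not have the same size:

```python
# Empirical check of first-order stochastic loss order between two forecast
# procedures (illustrative only). The samples deliberately differ in size.
import numpy as np

def ecdf(sample, grid):
    return np.searchsorted(np.sort(sample), grid, side="right") / sample.size

rng = np.random.default_rng(0)
loss_a = np.abs(rng.normal(0.0, 1.0, 300))   # losses of procedure A
loss_b = np.abs(rng.normal(0.0, 1.5, 400))   # B: same centre, larger losses

grid = np.linspace(0.0, 6.0, 601)
dominates = np.all(ecdf(loss_a, grid) >= ecdf(loss_b, grid))
print(f"A's loss ECDF lies above B's everywhere: {dominates}")
```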

Keywords: forecast evaluation, stochastic order, multiple comparison, non parametric test

Procedia PDF Downloads 79
19732 Experience Report about the Inclusion of People with Disabilities in the Process of Testing an Accessible System for Learning Management

Authors: Marcos Devaner, Marcela Alves, Cledson Braga, Fabiano Alves, Wilton Bezerra

Abstract:

This article discusses the inclusion of people with disabilities in the process of testing an accessible system solution for distance education. The accessible system, the team profile, and the methodologies and techniques covered in the testing process are presented. The testing process shown in this paper was designed from experience with users; it emerged from lessons learned in past experiences, and the end user is present at all stages of the tests. Lessons learned are also reported, along with how the team and the methods matured, resulting in a simple, productive and effective process.

Keywords: experience report, accessible systems, software testing, testing process, systems, e-learning

Procedia PDF Downloads 380
19731 Comparison of Some Robust Regression Methods with the OLS Method, with an Application

Authors: Sizar Abed Mohammed, Zahraa Ghazi Sadeeq

Abstract:

The classic least squares (OLS) method is used to estimate linear regression parameters when its assumptions hold, and the estimates then have good properties such as unbiasedness, minimum variance, consistency, and so on. Alternative statistical techniques have been developed to estimate the parameters when the data are contaminated with outliers; these are robust (or resistant) methods. In this paper, three robust methods are studied: the maximum likelihood type estimator (M-estimator), the modified maximum likelihood type estimator (MM-estimator) and the least trimmed squares estimator (LTS-estimator), and their results are compared with the OLS method. These methods were applied to real data taken from the Duhok company for manufacturing furniture, and the results were compared using the criteria mean squared error (MSE), mean absolute percentage error (MAPE) and mean sum of absolute errors (MSAE). Important conclusions of this study are the following. A number of typical values were detected by the four methods in the furniture-line data, lying very close to the rest of the data, which indicates that the standard errors are close to a normal distribution; in the doors-line data, however, OLS detected fewer typical values than the robust methods, meaning that the distribution of the standard errors departs far from normality. Another important conclusion is that, for the doors line, the parameter estimates obtained by OLS are very far from those obtained by the robust methods; the LTS-estimator gave better results under the MSE criterion, the M-estimator gave better results under MAPE, and we also noticed that under MSAE the MM-estimator is better. The programs S-plus (version 8.0, Professional 2007), Minitab (version 13.2) and SPSS (version 17) were used to analyze the data.
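
To illustrate the OLS vs M-estimator contrast on contaminated data, the sketch below (synthetic data, not the Duhok company's) fits both with statsmodels; the MM- and LTS-estimators require dedicated routines and are omitted:

```python
# OLS vs Huber M-estimator on outlier-contaminated data (synthetic).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 60)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, 60)
y[::15] += 25.0                        # inject a few gross outliers

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()
m_est = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # Huber M-estimator
print(f"true slope 3.0 | OLS {ols.params[1]:.2f} | M-estimator {m_est.params[1]:.2f}")
```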

Keywords: robust regression, LTS, M-estimator, MSE

Procedia PDF Downloads 225
19730 Comparison between Hardy-Cross Method and Water Software to Solve a Pipe Networking Design Problem for a Small Town

Authors: Ahmed Emad Ahmed, Zeyad Ahmed Hussein, Mohamed Salama Afifi, Ahmed Mohammed Eid

Abstract:

Water has great importance in life. In order to deliver water from resources to users, many procedures must be undertaken by water engineers. One of the main procedures for delivering water to the community is designing pressurized pipe networks. The main aim of this work is to calculate the water demand of a small town and then design a simple water network to distribute the water resources among the town with the smallest losses. The literature is reviewed to cover the main points related to water distribution. Moreover, the methodology introduces two approaches to solve the research problem: the iterative Hardy-Cross method and the water software Pipe Flow. The results introduce two designs that satisfy the same research requirements. Finally, the researchers conclude that the use of water software provides more abilities and options for water engineers.
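
For a single loop, the Hardy-Cross iteration computes the head loss h = K·Q·|Q|^(n−1) in each pipe and applies the correction ΔQ = −Σh / (n·Σ|h/Q|) until the loop closes. A minimal sketch with made-up pipe data (illustrative, not the town network designed in the paper):

```python
# Single-loop Hardy-Cross iteration: head loss h = K*Q*|Q|**(n-1) per pipe,
# loop correction dQ = -sum(h) / (n * sum(|h/Q|)). Pipe data are made up.
n = 2.0                                  # Darcy-Weisbach head-loss exponent
K = [100.0, 250.0, 400.0]                # pipe resistance coefficients
Q = [0.30, 0.10, -0.20]                  # assumed flows (sign = direction)

for it in range(50):
    h = [k * q * abs(q) ** (n - 1.0) for k, q in zip(K, Q)]
    dq = -sum(h) / (n * sum(abs(hi / qi) for hi, qi in zip(h, Q)))
    Q = [q + dq for q in Q]
    if abs(dq) < 1e-6:                   # loop head losses balance
        break

print(f"balanced flows: {[round(q, 4) for q in Q]} after {it + 1} iterations")
```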

Keywords: looping pipe networks, Hardy-Cross method accuracy, relative error of the Hardy-Cross method

Procedia PDF Downloads 149
19729 Valorization of Beer Brewing Wastes by Composting

Authors: M. E. Silva, I. Brás

Abstract:

The aim of this work was to study the viability of recycling the residual yeast and diatomaceous earth (RYDE) slurry generated by the beer brewing industry by composting it with animal manures, as well as to evaluate the quality of the composts obtained. Two pilot composting trials were carried out with different mixes: cow manure/RYDE slurry (pile CM) and sheep manure/RYDE slurry (pile SM). For all piles, wood chips were applied as a bulking agent. The process was monitored by evaluating standard physical and chemical parameters. Compost quality was assessed by the heavy metals content and phytotoxicity. Both piles reached a thermophilic phase on the first day, although with different trends. The pH showed a slightly alkaline character. The C/N ratio reached values lower than 19 at the end of the composting process. Generally, all the piles exhibited an absence of heavy metals. However, pile SM exhibited phytotoxicity. This study showed that RYDE slurry can be valorized by composting with cow manure.

Keywords: beer brewing wastes, compost, valorization, quality

Procedia PDF Downloads 437
19728 Chemical Bath Deposition Technique (CBD) of CdS Used in Closed Space Sublimation (CSS) of a CdTe Solar Cell

Authors: Zafar Mahmood, Fahimullah Babar, Surriyia Naz, Hafiz Ur Rehman

Abstract:

Cadmium sulphide (CdS) was deposited on a Tec 15 glass substrate with the help of CBD (the chemical bath deposition process), and then cadmium telluride (CdTe) was deposited on the CdS with the help of CSS (the closed space sublimation technique) for the construction of a solar cell. The thicknesses of all the deposited materials were measured with the help of ellipsometry. I-V graphs were drawn in order to observe the current-voltage output, and the efficiency of the cell was graphed together with the fill factor (graphs not given here). The efficiency came out to be approximately 16.5%, whereas the maximum efficiency of CIGS (copper-indium-gallium-selenide) cells is 20%. The efficiency of a solar cell can be further enhanced by adopting quality materials, good experimental devices and proper procedures. The grain size was analyzed with the help of a scanning electron microscope, along with RBS (Rutherford backscattering spectroscopy).

Keywords: CBD, CdS, CdTe, CSS

Procedia PDF Downloads 353
19727 The Comparative Effect of Practicing Self-Assessment and Critical Thinking Skills on EFL Learners’ Writing Ability

Authors: Behdokht Mall-Amiri, Sara Farzaminejad

Abstract:

The purpose of the present study was to discover which of two writing activities, a self-assessment questionnaire or a critical thinking skills handout, is more effective for Iranian EFL learners' writing ability. To fulfill the purpose of the study, a sample of 120 undergraduate students of English sat for a standardized sample of the PET. Eighty-two students whose scores fell within one standard deviation above and below the sample mean were selected and randomly divided into two equal groups. One group practiced self-assessment and the other group practiced critical thinking skills while they were learning process writing. A writing posttest was finally administered to the students in both groups, and the mean scores were compared by t-test. The result led to the rejection of the null hypothesis, indicating that practicing critical thinking skills had a significantly higher effect on writing ability. The implications of the study for students and teachers as well as course book designers are discussed.

Keywords: writing ability, process writing, critical thinking skills, self-assessment

Procedia PDF Downloads 322
19726 Development of a New Ecological Cleaning Process for Metal Sheets

Authors: L. M. López López, J. V. Montesdeoca Contreras, A. R. Cuji Fajardo, L. E. Garzón Muñoz, J. I. Fajardo Seminario

Abstract:

In this article, a new method for cleaning the metal sheets used in household appliances was developed, using low-pressure cold plasma. In this context, this research analyzes the results of the metal sheet cleaning process using plasma and compares them with the pickling process, in order to determine the efficiency of each process and the level of contamination produced. Surface cleaning was evaluated by measuring the contact angle with deionized water, diiodomethane and ethylene glycol, and calculating the surface free energy by means of the Fowkes and Wu theories. The results show that low-pressure cold plasma is very efficient both in the cleaning process and in its environmental impact.
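
Surface free energy calculations of this kind typically linearize the Owens-Wendt (extended Fowkes) relation γ_L(1 + cos θ)/2 = √(γ_S^d γ_L^d) + √(γ_S^p γ_L^p) over several probe liquids. The sketch below uses commonly tabulated liquid values (they vary slightly by source) and made-up contact angles; it is not the paper's calculation, and Wu's method is not reproduced:

```python
# Owens-Wendt (extended Fowkes) estimate of the solid surface free energy
# from contact angles of the three probe liquids named in the abstract.
# Liquid components are tabulated literature values; angles are made up.
import numpy as np

# (total, dispersive, polar) surface tension components in mN/m
liquids = {"water":           (72.8, 21.8, 51.0),
           "diiodomethane":   (50.8, 50.8, 0.0),
           "ethylene glycol": (48.0, 29.0, 19.0)}
theta_deg = {"water": 65.0, "diiodomethane": 40.0, "ethylene glycol": 50.0}

# OWRK relation: gL*(1 + cos(theta))/2 = sqrt(gSd*gLd) + sqrt(gSp*gLp)
A, b = [], []
for name, (gl, gld, glp) in liquids.items():
    A.append([np.sqrt(gld), np.sqrt(glp)])
    b.append(gl * (1.0 + np.cos(np.radians(theta_deg[name]))) / 2.0)
(sd, sp), *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
print(f"surface energy ~ {sd**2 + sp**2:.1f} mN/m "
      f"(dispersive {sd**2:.1f}, polar {sp**2:.1f})")
```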

Keywords: efficient use of plasma, ecological impact of plasma, metal sheet cleaning, plasma cleaning process

Procedia PDF Downloads 341
19725 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion

Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You

Abstract:

To address the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for cold form tapping of internal threads, on the basis of similarity in the process. A model is established based on an analysis of process planning. The case representation and the similarity computing method are given. A confidence degree is used to evaluate the retrieved case, and a rule-based reuse strategy is presented. The scheme is illustrated and verified by practical application; the case study shows that the design results obtained with the proposed method are effective.
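
A minimal sketch of the retrieval step in such a system: cases are ranked by a weighted per-feature similarity, and the best case is reused only if it clears a confidence threshold. The features, weights and threshold here are hypothetical, not the paper's:

```python
# Retrieval step of a case-based process planner (hypothetical features).

def feature_sim(a, b, span):
    """Similarity of two numeric feature values, normalised to [0, 1]."""
    return 1.0 - abs(a - b) / span

def case_similarity(query, case, weights, spans):
    num = sum(w * feature_sim(query[f], case[f], spans[f])
              for f, w in weights.items())
    return num / sum(weights.values())

cases = [{"diameter": 8.0, "pitch": 1.25, "hardness": 120.0, "plan": "plan_A"},
         {"diameter": 10.0, "pitch": 1.5, "hardness": 180.0, "plan": "plan_B"}]
query = {"diameter": 9.0, "pitch": 1.5, "hardness": 170.0}
weights = {"diameter": 0.4, "pitch": 0.3, "hardness": 0.3}
spans = {"diameter": 10.0, "pitch": 1.0, "hardness": 200.0}   # feature ranges

best = max(cases, key=lambda c: case_similarity(query, c, weights, spans))
conf = case_similarity(query, best, weights, spans)
print((best["plan"], round(conf, 3)) if conf >= 0.8 else "adapt by rules")
```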

Keywords: case-based reasoning, internal thread, cold extrusion, process planning

Procedia PDF Downloads 496
19724 The Importance of Patenting and Technology Exports as Indicators of Economic Development

Authors: Hugo Rodríguez

Abstract:

The patenting of inventions is the result of an organized effort to achieve technological improvement, with a consequent positive impact on the population's standard of living. Technology exports, whether of high-tech goods or of information and communication technology (ICT) services, represent the level of acceptance that world markets have of the technology acquired or developed by a country, in either public or private settings. A quantitative measure of the above variables is expected to have a positive and relevant impact on the level of economic development of countries, measured on this first occasion through their level of gross domestic product (GDP). In that sense, it explains not only the performance of an economy but also the differences between nations. We present an econometric model in which we seek to explain the difference between the GDP levels of 178 countries through their different performance in the outputs of the technological production process. We take the variables of patenting, ICT exports and high-technology exports as results of the innovation process. This model achieves an explanatory power for four annual cuts (2000, 2005, 2010 and 2015) equivalent to an adjusted R² of 0.91, 0.87, 0.91 and 0.96, respectively.
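
The model described is essentially a cross-sectional regression of (log) GDP on the three innovation outputs. A minimal sketch with synthetic data for 178 countries (the paper's data and exact specification are not reproduced here):

```python
# Cross-sectional sketch of the model described: log GDP regressed on
# patenting and technology-export outputs. The data frame is synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 178
df = pd.DataFrame({"log_patents":        rng.normal(4.0, 2.0, n),
                   "log_ict_exports":    rng.normal(6.0, 2.0, n),
                   "log_hitech_exports": rng.normal(7.0, 2.0, n)})
df["log_gdp"] = (2.0 + 0.5 * df["log_patents"] + 0.3 * df["log_ict_exports"]
                 + 0.2 * df["log_hitech_exports"] + rng.normal(0.0, 0.5, n))

X = sm.add_constant(df[["log_patents", "log_ict_exports", "log_hitech_exports"]])
fit = sm.OLS(df["log_gdp"], X).fit()
print(f"adjusted R^2: {fit.rsquared_adj:.2f}")   # the statistic the abstract quotes
```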

Keywords: development, exports, patents, technology

Procedia PDF Downloads 100